Posts in IWST
Assessing Test Coverage
A couple of weekends ago (I'm a little behind) we held the July session of the Indianapolis Workshops on Software Testing. The attendees were:


  • Andrew Andrada

  • Mike Goempel

  • Michael Kelly

  • Jeffrey Mexin

  • Susan Morgan

  • Dana Spears

  • Kim Titus



The topic we focused on for the five-hour workshop was assessing test coverage.



The workshop started with an experience report on managing coverage by Mike Goempel. Mike's experience focused on requirements-based coverage, supplemented by exploratory testing driven by risk. He shared a spreadsheet he uses to manage the testing process and to help identify holes in coverage. He uses a completeness-versus-quality technique: completeness is measured in terms of his dashboard and a requirements traceability matrix, and quality is measured by test case reviews, defect counts, and tester interviews. If you have questions about the dashboard, Mike would be glad to answer them via email.
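For readers who haven't seen a traceability matrix used this way, here is a minimal sketch of the idea in Ruby. This is my illustration, not Mike's actual spreadsheet, and all requirement and test case names are hypothetical: map each requirement to the tests that cover it, and the empty entries are your coverage holes.

    # A minimal sketch of requirements traceability (hypothetical names).
    traceability = {
      'REQ-001 Login'          => ['TC-01', 'TC-02'],
      'REQ-002 Password reset' => ['TC-03'],
      'REQ-003 Audit logging'  => []   # an empty entry is a hole in coverage
    }

    covered = traceability.reject { |req, tests| tests.empty? }
    puts "Coverage: #{covered.size} of #{traceability.size} requirements"
    traceability.each do |req, tests|
      puts "UNCOVERED: #{req}" if tests.empty?
    end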

Next, Dana Spears (with some help from Susan Morgan) presented a wonderful experience report on using Mary Decker's risk assessment template. She talked about how she customized the template to let her look at a project in terms of risk and organize her testing around those risks. Coverage in her example was based not on code or requirements, but on the areas of the application under test where the risks were highest. She gave several rich examples along with some lessons learned.
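To make the risk-based approach concrete, here is a rough Ruby sketch of that kind of prioritization. This is my own illustration, not Mary Decker's template or Dana's customization; the application areas and ratings are made up. Each area gets a likelihood and an impact rating, and testing effort flows to the highest products of the two.

    # Risk-based prioritization sketch (hypothetical areas and ratings).
    areas = {
      'Payment processing' => { :likelihood => 4, :impact => 5 },
      'User profiles'      => { :likelihood => 3, :impact => 3 },
      'Reporting'          => { :likelihood => 2, :impact => 2 }
    }

    # Rank areas by risk score (likelihood x impact), highest first.
    ranked = areas.sort_by { |name, risk| -(risk[:likelihood] * risk[:impact]) }
    ranked.each do |name, risk|
      puts "#{name}: risk score #{risk[:likelihood] * risk[:impact]}"
    end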

Dana and Susan also briefly addressed some derivative work they did with the same template, which they presented at the Indianapolis Quality Assurance Association: they customized it to help determine tester-to-developer ratios.

Finally, I used Brian Marick's agile testing directions chart to talk about how I envision test coverage throughout a project. I like the chart, and he says it better than I can, so I stole it (with proper credit, of course). Working at a whiteboard and capturing some of what we had already discussed, I cataloged the coverage techniques from the day (and some we hadn't covered) along two axes: business-facing versus technology-facing coverage, and support for programming versus critique of the product.
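For those who haven't seen the chart, here is the gist of the two axes expressed as data. The technique placements below follow Marick's published examples rather than our whiteboard catalog, which covered more ground:

    # Marick's two axes, with example techniques per his published chart.
    quadrants = {
      'business-facing, supports programming'    => ['customer acceptance tests'],
      'business-facing, critiques the product'   => ['exploratory testing', 'usability testing'],
      'technology-facing, supports programming'  => ['unit tests'],
      'technology-facing, critiques the product' => ['performance testing', 'security testing']
    }

    quadrants.each { |axis, techniques| puts "#{axis}: #{techniques.join(', ')}" }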

It has just occurred to me that I need to start taking a digital camera with me to the meetings.

Next month's topic is maintaining testing skills. Email me if you would like to attend.
Open Source Testing Tools
Last weekend we held the June session of the Indianapolis Workshops on Software Testing. The attendees were:

  • Taher Attari

  • Charlie Audritsh

  • Mike Goempel

  • Michael Kelly

  • Marc Labranche

  • Jeffrey Mexin

  • Patrick Milligan

  • Richard Moffatt

  • Dana Spears

  • Jon Strayer


The topic we focused on for the five-hour workshop was open source testing tools.



I would like to relate some deep insight, but tools being what they are, we didn't get a lot of discussion. Five tools were presented by Charlie Audritsh, Mike Goempel, and me. In general, the presentations generated interest in the tools and some targeted questions about applying them, but the discussion level was relatively low. Rather than doing a play-by-play of this month's meeting, I thought I would just share the tools covered and let you do your own research. The tool creators say it better than I do anyway.

Here are the tools we looked at:

Watir
WATIR stands for "Web Application Testing in Ruby". Watir (pronounced water) is a free, open-source functional testing tool for automating browser-based tests of web applications. Watir drives the Internet Explorer browser the same way an end user would. It clicks links, fills in forms, and presses buttons. Watir also checks results, such as whether expected text appears on the page. Watir is a Ruby library that works with Internet Explorer on Windows.
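To give a feel for the tool, here is a minimal sketch of the click-and-check flow described above, using the classic Watir API. The URL, field names, and expected text are hypothetical:

    require 'watir'   # classic Watir drives Internet Explorer via COM

    ie = Watir::IE.new                               # open a new IE window
    ie.goto('http://example.com/login')              # navigate like a user would
    ie.text_field(:name, 'username').set('tester')   # fill in a form field
    ie.button(:value, 'Log In').click                # press the submit button

    # Check the result by reading the page, just as an end user would
    puts(ie.contains_text('Welcome, tester') ? 'PASS' : 'FAIL')
    ie.close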

Web Application Stress Tool
The Microsoft WAS web stress tool is designed to realistically simulate multiple browsers requesting pages from a web site. You can use this tool to gather performance and stability information about your web application. This tool simulates a large number of requests with a relatively small number of client machines. The goal is to create an environment that is as close to production as possible so that you can find and eliminate problems in the web application prior to deployment.
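WAS itself is a point-and-click tool, but if you want a feel for what it automates, here is a crude Ruby sketch of the core idea: a handful of threads on one client machine hammering a URL with many requests. This is my illustration, not WAS; the URL and counts are hypothetical, and a real stress run needs ramp-up, think times, and response-time reporting, which WAS handles for you.

    require 'net/http'
    require 'uri'

    uri = URI.parse('http://example.com/')   # hypothetical site under test

    # Ten simulated clients, each making one hundred requests.
    threads = Array.new(10) do
      Thread.new do
        100.times do
          response = Net::HTTP.get_response(uri)
          warn "Got #{response.code}" unless response.code == '200'
        end
      end
    end
    threads.each { |t| t.join }   # wait for all simulated clients to finish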

A visual cognition video
This video was part of an experience report given by Mike Goempel. Mike was shown this video at STMR 10 by Cem Kaner and has used it as an example to show project managers some of the drawbacks of scripted manual testing. Cem also shows and explains this video in detail in his course on Black Box Software Testing. Look under "test procedures and scripts," I think...

Firefox Web Developer Extension
This extension adds a menu and a toolbar with various web developer tools that are useful for test automation, security testing, usability testing, and functional testing.

WebGoat
WebGoat is a full J2EE web application designed to teach web application security lessons. In each lesson, users must demonstrate their understanding by exploiting a real vulnerability on the local system. The system is even clever enough to provide hints and to show the user cookies, parameters, and the underlying Java code if they choose. Examples of lessons include a SQL injection attack against a fake credit card database, where the user crafts the attack and steals the credit card numbers.
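If you have never seen the credit card lesson, the underlying pattern looks something like this. This is my Ruby sketch of the injection pattern, not WebGoat's Java code, and the table and input names are made up:

    last_name = "Smith' OR '1'='1"   # attacker-supplied form input

    # Vulnerable: input concatenated straight into the SQL string.
    query = "SELECT * FROM credit_cards WHERE last_name = '#{last_name}'"
    puts query
    # => SELECT * FROM credit_cards WHERE last_name = 'Smith' OR '1'='1'
    # The WHERE clause is now always true, so every card on file comes back.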

Next month we look at assessing test coverage. I'm saving my energy to blog about that. I think we will generate a lot of interesting and useful content at that meeting. If you would like to attend, let me know.