Assessing Test Coverage
A couple of weekends ago (I'm a little behind) we held the July session of the Indianapolis Workshop on Software Testing. The attendees were:
- Andrew Andrada
- Mike Goempel
- Michael Kelly
- Jeffrey Mexin
- Susan Morgan
- Dana Spears
- Kim Titus
The topic we focused on for the five-hour workshop was assessing test coverage.
The workshop started with an experience report on managing coverage by Mike Goempel. Mike's experience focused on requirements-based coverage supplemented with risk-based exploratory testing. He shared a spreadsheet he uses to manage the testing process and to help identify holes in coverage. He uses a completeness-versus-quality technique, where completeness is measured in terms of his dashboard and a requirements traceability matrix, and quality is measured by test case reviews, defect counts, and tester interviews. If you have questions about the dashboard, Mike would be glad to answer them via email.
Next, Dana Spears (with some help from Susan Morgan) presented a wonderful experience she had using Mary Decker's risk assessment template. She talked about how she customized the template to let her look at a project in terms of risk and to organize her testing around those risks. Coverage in her example was based not on code or requirements, but on the areas of the application under test where the risks were highest. She gave several rich examples along with some lessons learned.
In addition, Dana and Susan briefly addressed some derivative work they did using the risk assessment template, which they also presented at the Indianapolis Quality Assurance Association. They took the same template and customized it to help determine tester-to-developer ratios.
Finally, I used Brian Marick's agile testing directions chart to talk about how I envision test coverage throughout the project. I like it and he said it better than I can, so I stole it (with proper credit, of course). Using a whiteboard and capturing some of the things we had already discussed, I cataloged the coverage techniques we talked about (and some we hadn't) in terms of business-facing or technology-facing coverage and in terms of support for programming or critique of the product.
It has just occurred to me that I need to start taking a digital camera with me to the meetings.
Next month's topic: maintaining testing skills. Email me if you would like to attend.