Agile Testing

Last weekend we held the April session of the Indianapolis Workshops on Software Testing. The attendees were:


  • Joe Batt

  • David Holscher

  • Brian Marick

  • Kenn Petty

  • Charlie Audritsh

  • Kartik Gaddam

  • Rajasimba Admal

  • Dana Spears

  • Michael Kelly



The topic we focused on for the five-hour workshop was testing in an agile environment. About half of the attendees were currently working in an agile environment of some kind, and the other half were new to the topic and were interested in learning more. The following is a summary of presentations and ideas shared.

First to talk were David Holscher and Joe Batt. They previewed their talk Continuous Integration With CVSNT, CruiseControl, Ant, JUnit, JFCUnit and More on Microsoft Windows, which they will be presenting later this year at the JavaOne Conference.

They have made their slides available, so I won't go too deep into the material, but their shared experience covered setting up continuous integration in a Windows environment, focusing on version control, build automation, automated testing and deployment, repeatability, and fault tolerance. They used a laundry list of tools to make it all happen (all open source, I think), and they seem to be very pleased with the results.
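Their slides have the actual configuration, but to give a flavor of the build-automation piece, here is a minimal sketch of an Ant build file with compile and test targets of the kind CruiseControl would invoke on each check-in. The project name and paths are invented for illustration, not taken from their setup:

```
<!-- Hypothetical build.xml fragment; names and paths are invented. -->
<project name="demo" default="test" basedir=".">

  <!-- Compile production and test sources into one output tree. -->
  <target name="compile">
    <mkdir dir="build/classes"/>
    <javac srcdir="src" destdir="build/classes" classpath="lib/junit.jar"/>
  </target>

  <!-- Run every compiled *Test class; fail the build on any failure,
       which is what makes the continuous-integration loop useful. -->
  <target name="test" depends="compile">
    <junit haltonfailure="true">
      <classpath>
        <pathelement location="build/classes"/>
        <pathelement location="lib/junit.jar"/>
      </classpath>
      <batchtest>
        <fileset dir="build/classes" includes="**/*Test.class"/>
      </batchtest>
    </junit>
  </target>

</project>
```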

All of the testing on the project was developer testing; they had no formal tester. They used JUnit for unit testing and TDD (which they did most of the time, but not all of the time) and JFCUnit for GUI-level test automation. They relied (and still do, I think) mostly on test scenarios derived from use cases and the test cases that result from TDD. One thing Joe noticed was the heavy focus on building testability into the system. Since it was the developers doing the testing (especially the GUI automation), a little more time was spent getting the design right and fewer changes occurred downstream.
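To illustrate the TDD rhythm they described (write a failing check first, then just enough code to make it pass), here is a minimal sketch. It uses plain Java so it runs standalone; on their project this check would live in a JUnit TestCase instead, and the Money class is invented for illustration:

```java
// Minimal TDD-style sketch. The Money class is hypothetical;
// in real JUnit-based TDD this check would be a TestCase method.
public class MoneyTest {

    // Production code under test -- written only after the
    // check in main() below existed and failed.
    static class Money {
        private final long cents;
        Money(long cents) { this.cents = cents; }
        Money add(Money other) { return new Money(this.cents + other.cents); }
        long cents() { return cents; }
    }

    public static void main(String[] args) {
        // The "test first" step: $1.50 + $2.50 should be $4.00.
        Money total = new Money(150).add(new Money(250));
        if (total.cents() != 400) {
            throw new AssertionError("expected 400 cents, got " + total.cents());
        }
        System.out.println("ok");
    }
}
```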

Brian Marick recommended Mike Clark's book Pragmatic Project Automation for more information on the topic if you have questions after looking at their slides.

Following the continuous integration presentation, Brian Marick talked a little about a whole lot of things, ranging from Test First Programming to Test Driven Design (for the developers in the crowd) to the differences between "developer testing" and "tester testing" and where testing fits in agile methodologies in general. Brian, being the prolific blogger that he is, has most of his brain available online, so I can link to just about everything he talked about (and more):

Design-Driven Test-Driven Design:
http://www.testing.com/cgi-bin/blog/2005/03/17#presenter1
http://www.testing.com/cgi-bin/blog/2005/03/26#presenter2
http://www.testing.com/cgi-bin/blog/2005/03/30#presenter3
http://www.testing.com/cgi-bin/blog/2005/04/15#presenter4

Agile testing directions:
http://www.testing.com/cgi-bin/blog/2004/05/26#directions-toc

Brian of course mesmerized the workshop (myself included), but I had to cut him off so we could hear the final experience report. I imagine we could have listened to him for at least another five hours! Check out the links above (especially the agile testing directions) and get a feel for where he's coming from. I'm certain Brian will answer any well-thought-out question on the material.

Following Brian's talk, Kenn Petty presented his experience implementing Session Based Test Management (SBTM) and pair testing in a rapid development environment. Kenn provided all the proper links and credits to where he harvested his ideas. He then jumped into what worked and what didn't.

Kenn's team recently finished their first iteration using pair testing and SBTM. Previously they followed (as best they could, given time constraints) the IEEE 829 format for test documentation and the model that is commonly associated with the management and execution of those test plans and cases. Using pair testing meant that for most of the testing (90% maybe?) that takes place in sessions, there had to be two people: they could be two testers (this happened a lot) or a tester and a developer (this happened some). The results:


  • Collaboration and mutual respect increased between the testing and development teams.

  • The testers found more meaningful bugs faster.

  • Overall bug counts increased, as did the average severity of defects found.

  • Because collaboration improved, the number of defect tickets marked "Functions as designed" went from 70 on the previous iteration down to 3.

  • The testing team became more motivated because their focus shifted from documentation to finding bugs and collaborating with development. They felt like they were learning new skills from each other and they felt more challenged.



Testing took place in sessions, as outlined in the Satisfice material on the topic, and at the end of each session testers were debriefed by Kenn. Charters were initially generated by Kenn, and then during the execution of a charter, the testers could identify spin-off charters on their own, or during debriefing Kenn and the testers could identify new charters. Each requirement change (tracked mostly by email) generated a charter of its own.
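To make the charter idea concrete, here is a hypothetical example of the kind of charter a session might run against. The wording and format are invented for illustration, loosely modeled on the Satisfice session sheets, and are not taken from Kenn's project:

```
CHARTER
  Explore the order-entry screen with invalid and boundary
  quantities, looking for validation and error-handling bugs.

AREAS
  order entry, input validation

DURATION
  normal (about 90 minutes)

TESTERS
  two, paired; debrief with the test lead at end of session
```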

Overall I thought it was the best workshop so far. I think I have said that each time, but I mean it each time. I would really like to thank Brian, Joe, and David for making the trip to Indianapolis for a five-hour workshop. Brian didn't come only for the workshop; he also came to see the dinosaurs in downtown Indy. I don't know if the workshop was better than the Gorgosaurus (I've seen him and he's quite impressive), but I hope everyone had fun.