Effective corrective action for test projects in trouble
This weekend we held the January session of the Indianapolis Workshop on Software Testing. The attendees were:


  • Andrew Andrada

  • Charlie Audritsh

  • Lisa Etherington

  • Michael Goempel

  • Michael Kelly

  • Baher Malek

  • John McConda

  • Kenn Petty

  • Vishal Pujary



The topic we focused on for the five-hour workshop was effective corrective action for test projects in trouble.



The workshop started with me sharing an experience from a recent project where I both created a troubled test project and then worked my way out of it. I talked about some of the reasons why the project became troubled, and focused on the two things that really helped us get clarity on what we were doing and relieved some of the pressure:


  1. We added the right testers at the right time.

  2. We added visibility to the progress of testing by switching to daily milestones.



Adding the right testers at the right time means we didn't add testers when the project manager wanted to; we waited until we (the test team) felt they would be most effective. We wanted our testing to be far enough along that the testers we added could be effective, working with stable software in a stable environment. In addition, the testers we added were not new to the software; we were able to borrow testers from other project teams for short periods of time.

When we fell behind schedule, we added visibility to the progress of testing by switching to daily milestones. We talked in terms of features tested, not in terms of numbers of tests executed. We wanted to be clear about what the status meant and what we were focused on. The daily milestones were welcomed by both project management and the business customer who was involved. In addition, the test team was able to self-organize around the clear deliverables for each day.

After my experience report, Baher Malek related three short experiences around increasing feedback. His three main techniques for increasing feedback (and really, his experience report covered mostly the first two) were:


  1. Maintaining the project into existence

  2. Joint elaboration

  3. Using code coverage



Baher's opening focused on what he called maintaining the project into existence. This included needing to allow for technology-only iterations early in the project (infrastructure build-out, software and hardware upgrades, etc.) as well as merging more of the new development iterations with the regular production support releases.

His second technique, joint elaboration, made me think a lot about what Brian Marick called example driven development. Joint elaboration is a process where the testers, developers, and business (specifically, the business person doing the customer acceptance testing) sit down together and develop the specific tests that will be used to drive acceptance. This allows the developers and testers to focus their development and testing on the specific scenarios that the user has helped specify. This topic also brought up some interesting discussion on anchoring bias and how it might affect the functional and user testing efforts.
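To make joint elaboration concrete, here is a minimal sketch of what a set of jointly elaborated acceptance examples might look like once automated. The feature (an order-discount rule) and all of the names are hypothetical and mine, not from Baher's report; the point is simply that the concrete examples are worked out together by the tester, the developer, and the business person doing acceptance testing, and then drive both the development and the testing.

    import unittest


    def discounted_total(order_total, customer_years):
        """Hypothetical rule from the elaboration session: customers of
        five or more years get 10% off orders of $100 or more."""
        if customer_years >= 5 and order_total >= 100:
            return round(order_total * 0.90, 2)
        return order_total


    class AcceptanceExamples(unittest.TestCase):
        # Each test is one concrete example the business person helped specify.
        def test_long_term_customer_large_order_gets_discount(self):
            self.assertEqual(discounted_total(200.00, 6), 180.00)

        def test_new_customer_pays_full_price(self):
            self.assertEqual(discounted_total(200.00, 1), 200.00)

        def test_small_order_is_not_discounted(self):
            self.assertEqual(discounted_total(80.00, 10), 80.00)


    if __name__ == "__main__":
        unittest.main()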

Finally, Baher related a short experience using code coverage to drive some early feedback based on unit and functional test results. That was not nearly as interesting as the other two topics, so we didn't talk about it as much. I'm encouraging him to write up both of his ideas on maintaining projects into existence and joint elaboration. Watch the IWST site for updates. If he writes something up, we'll link to it there.
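On the coverage point, here is a minimal sketch of how that kind of coverage-based feedback might be gathered in Python. It assumes the coverage.py package is installed and that a test module named test_orders exists; both are my assumptions for illustration, not details from Baher's report.

    import unittest

    import coverage

    cov = coverage.Coverage()
    cov.start()

    # Run whatever unit/functional tests exist at this point in the project.
    unittest.main(module="test_orders", exit=False)

    cov.stop()
    cov.save()

    # The report shows which parts of the code the current tests exercise,
    # giving an early sense of where feedback is still thin.
    cov.report(show_missing=True)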

After that, Mike Goempel related an experience where he came on to a large and out-of-control test project and had to rein things back in. Mike outlined the process he used, and he certainly made it seem like a science (which I think it was, given his context). He and his team were very systematic in rolling out changes and control across the test program:


  1. What's the project environment?

  2. What's our status?

  3. What are our capabilities?

  4. Where is the test leadership and how is it coordinated?

  5. Develop a (new) strategy and plans.

  6. Execute on those plans, rolling out large changes slowly.



Mike's experience report sparked a brainstorm on how we actually recognize when a test project is in trouble. The results of that can be found here.

Next month we try something new. We will host an open roundtable to discuss our findings (summarize the ERs, talk about the brainstorm, etc.) and talk about whether and how we've implemented anything we've learned. This roundtable is open to anyone. More information can be found here.

The next workshop is in March on the topic of Security Testing Web Applications. Let me know if you might be interested in attending and/or sharing an experience.
IWST 2007
We have posted the schedule for the 2007 Indianapolis Workshops on Software Testing.

About IWST
The Indianapolis Workshops on Software Testing (IWST) are an ongoing series of no-cost workshops for experienced software testers and related professionals. IWST strives to build skills in software testing and allows people who are interested in software testing to network with their peers. The emphasis is on mutual learning, sharing hands-on experiences, and solving practical problems. In these meetings, experienced working practitioners discuss pertinent state-of-the-art topics. IWST has been organized by a small team of local practitioners in the field of software testing.

We have a slight change in format for 2007. We are going to run six workshops and six lessons-learned roundtables, alternating month to month. Workshops will be held on Saturdays and will run for five hours with regular breaks for networking and refreshments. Roundtables will be held on Tuesdays and will run for three hours (or until the energy runs out of the room) with two breaks for networking and refreshments.



Workshop Format
Each workshop is limited to 15 to 20 people. Each meeting will have a facilitator who manages discussion and presentation questions, and a content owner who determines the topic and the speakers for that topic. These two roles serve to guide the discussion and format, but it is the attendees who ultimately determine what we focus on and what is interesting to them.

Each workshop will focus on one topic in software testing. We will try to select a topic that is narrow enough in scope that we can reasonably cover it in the limited amount of time we have. In the month following the workshop, there will be a lessons learned roundtable where we invite the workshop attendees back to talk about how they applied what they learned (see the details on the roundtables below). In the weeks/months following the workshop session, more discussion can take place on the topic via the IWST mailing list and discussion forum.

We limit the size of each IWST workshop to between 15 and 20 seats based on the facility and the demand. There is no fee to attend. Presentations are typically on advanced topics in software testing or on topics that we feel need to be discussed in the Indianapolis area. We strive to give every serious inquirer a fair chance to attend at least one of the year's workshops, and we encourage newcomers who have a passion for the field to attend.

Roundtable Format
The roundtables are completely open. Anyone can attend. Each roundtable will have a facilitator who manages discussion and a content owner who manages the focus of the content. These two roles will serve to guide the discussion; they will not be as restrictive as the actual workshops. It is the attendees who ultimately determine what we focus on and what is interesting to them.

The roundtables are intended to serve two main functions. First, they give us an opportunity to open IWST to more people. The workshops are limited in size to facilitate a particular style of participation; the roundtables are open to allow more people to get involved in what IWST is trying to accomplish. Second, we will invite the participants from the prior month's workshop (on the same topic) back to discuss how they applied what they learned.

More information on how to attend is on the website.