Techniques for Exploratory Testing
Earlier this month we held the July session of the Indianapolis Workshop on Software Testing. The attendees were:


  • Andrew Andrada

  • Charlie Audritsh

  • Howard Clark

  • Michael Goempel

  • Jason Horn

  • Karen Johnson

  • Michael Kelly

  • Steve Lannon

  • Kelvin Lim

  • John McConda

  • Scott McDevitt

  • John Montano

  • Vishal Pujary

  • David Warren

  • Gary Warren



The topic we focused on for the five-hour workshop was 'techniques for exploratory testing'.



The workshop started with John McConda providing an overview of how Mobius Test Labs runs their exploratory test sessions. Early in the talk, John covered the basics of Session Based Test Management (SBTM). He then gave us an in-depth look at an SBTM tool that Mobius has been working on to help them manage their testing. The tool let multiple users manage their sessions and results, and it auto-generated results summaries and metrics.
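To give a rough idea of the kind of bookkeeping an SBTM tool handles, here's an illustrative sketch (in Python). This is not the Mobius tool and the field names are my own invention; it just shows the flavor of a session record and the sort of summary metrics a tool like this might roll up automatically.

# Illustrative sketch only -- not the Mobius tool. An SBTM session record
# typically captures a charter, who tested, how long, what was found, and a
# rough breakdown of how the session time was spent.
from dataclasses import dataclass, field

@dataclass
class Session:
    charter: str                 # the mission for this session
    tester: str
    duration_minutes: int
    test_pct: int                # time spent on-charter testing
    bug_pct: int                 # time spent investigating/reporting bugs
    setup_pct: int               # time spent on setup or blocked
    bugs: list = field(default_factory=list)
    notes: str = ""

def summarize(sessions):
    """Roll sessions up into the kind of summary an SBTM tool might auto-generate."""
    total_minutes = sum(s.duration_minutes for s in sessions)
    total_bugs = sum(len(s.bugs) for s in sessions)
    return {
        "sessions": len(sessions),
        "total_hours": round(total_minutes / 60, 1),
        "total_bugs": total_bugs,
        "bugs_per_session": round(total_bugs / len(sessions), 1) if sessions else 0,
    }

if __name__ == "__main__":
    sessions = [
        Session("Explore the login flow", "tester-a", 30, 70, 20, 10,
                bugs=["blank password accepted"]),
        Session("Stress the question import", "tester-b", 25, 60, 30, 10,
                bugs=["timeout on large file", "garbled unicode"]),
    ]
    print(summarize(sessions))

The real value of a tool like the one John showed is that the summaries and metrics fall out of the session records for free, rather than being compiled by hand after the fact.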

After John, David Warren presented an experience report on exploratory testing he did on an agile project, where he worked closely with the developers and customers. His talk was interesting for both its exploratory testing content and its agile content. David shared how he worked with the customer to help them structure their testing. He also spoke about the difficulty of implementing automation when he knew he would be leaving the project and the customer would have to take over maintenance of the scripts.

After David, I presented some of the heuristics I use when I test. I had a handout from a past talk and I just recycled it for the workshop. I probably could have updated it, but I didn't. I also referenced the Satisfice Heuristic Test Strategy Model, which I use a lot in my testing. For each heuristic, I tried to give a small example of how I've used it in the past. Hopefully it was helpful. I had fun talking about it.

After my experience report, we decided to skip Jason Horn's experience report in favor of an activity. Hopefully Jason can attend the round-table next month to present his experience there. The activity involved paired exploratory testing of an application no one in the room (other than me) had seen. We tested the Open Certification question server (under development by Tim Coulter). Tim was kind enough to give us access to his development stream and an idea of what types of bugs he'd be most interested in hearing about from us.

Armed with a mission given to us by the project coordinator, we paired off and got some testing done. There were seven groups. Each group shared a laptop, and some of the groups turned on BB TestAssistant to record their testing. We captured a few videos that you can watch here, here, here, and here. I find them easy to watch at 4x speed.

After about 25 to 30 minutes of testing, we stopped and debriefed as a group. We attempted to capture what techniques and tools were used during testing. We came up with the following list (there are some duplicates and incomplete ideas in there). At next month's round-table, I want to review this list with the group to see what patterns we can draw out of it.

Finally, I want to again remind workshop attendees to please log their bugs. I know some people already have. Tim would appreciate the feedback.

This month's IWST was hands down the most fun I've had at a peer workshop. I think we had an incredible energy level, we ran out of time (we skipped an entire ER and stopped the testing exercise 30 minutes early), and we had a full house. Thanks to everyone who attended. I hope to see you all again at next month's round-table where we can pick it up again. If you would like to attend the round-table, details can be found here. Round-tables are completely open to everyone. Just show up...
Starting a peer workshop

Someone recently emailed me and asked about starting their own peer workshop. I have a small amount of experience with the topic. I've run a number of IWSTs (a small local workshop) and two WOCs (a longer three-day workshop). I've attended many other peer workshops, including WHET, WOPR, WOCT, STMR, STiFS, and AWTA. They are all, in one way or another, LAWST-style workshops.

Here are bits of my reply:



Facilitation
Depending on your personality type, you might be able to facilitate the workshop yourself. I don't know what personality types make good facilitators; I just know not everyone is a good facilitator. If you ever need a facilitator, and your workshop is LAWST-style, there is a good chance AST will provide one for you.

I recommend two books:
- Facilitator's Guide to Participatory Decision-Making
- How to Run Seminars and Workshops

To get AST sponsorship, follow the LAWST-style and submit an email to the AST VP of Conferences (conference@associationforsoftwaretesting.org). It's that easy.

Money
Plan on a three-day workshop costing you around $1,500 out of pocket when all is said and done. You can do it for a bit less or a bit more, depending on the market and your connections in that market.

Content
Be clear about what you want the outcomes to be. That will help guide your invitations and content selections. Scott Barber has noticed some interesting dynamics on this topic. I hope he posts them as a comment to this post.

Be very clear about what it takes to attend. If it's invitation only, make it invitation only, and be painfully clear about what that means. Does that mean I can just email you to get an invitation? Say that. This is the largest barrier to entry for people attending. If they feel it's 'closed', they won't try to come. If they feel it's only for the 'elite', they won't try to come.

Marketing
Get a website. Right away. Unless you want a small community, you need a public face. WOPR has a great marketing group (Ross, Roland, and Scott) and they do an /excellent/ job at getting new blood into the workshops.

Send the call for participation well in advance. Send it more than once.

Keep a relatively constant stream of communication to your attendees. Send emails 60 days, 30 days, 15 days, and 7 days before the event. After the workshop, start a YahooGroup for veterans.

Struggles
Get ready for rejection. It's the hardest part: people commit and then back out at the last minute.

Be ready for people not to publish anything following the workshop. It takes someone special to actually follow up on workshop promises. This is the second hardest part. (I still have outstanding promises to AWTA from January. I have good intentions. It's hard...)


If you have other tips (many readers of my blog have attended or run their own peer conferences), please feel free to post them.