- Andrew Andrada
- Charlie Audritsh
- Howard Clark
- Michael Goempel
- Jason Horn
- Karen Johnson
- Michael Kelly
- Steve Lannon
- Kelvin Lim
- John McConda
- Scott McDevitt
- John Montano
- Vishal Pujary
- David Warren
- Gary Warren
The topic we focused on for the five-hour workshop was 'techniques for exploratory testing'.
The workshop started with John McConda providing an overview of how Mobius Test Labs runs their exploratory test sessions. Early in the talk, John covered the basics of Session-Based Test Management (SBTM). He then gave us an in-depth look at an SBTM tool that Mobius has been developing to help them manage their testing. The tool lets multiple users manage their sessions and results, and it auto-generates results summaries and metrics.
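If you're wondering what that kind of roll-up looks like, here is a small sketch of the idea. To be clear, this is not the Mobius tool (which isn't public); it's just my own toy illustration of the sort of summary metrics an SBTM tool can auto-generate from session records, and every field name in it is made up:

```python
# Toy illustration only: rolling up SBTM-style session records into the
# kind of summary metrics a session-management tool might auto-generate.
# The record format and field names are invented for this sketch.
from dataclasses import dataclass


@dataclass
class Session:
    charter: str          # the mission for the session
    tester: str
    minutes: int          # total session length
    on_charter_pct: int   # % of time spent on the chartered mission
    bugs: int
    issues: int


def summarize(sessions: list[Session]) -> dict:
    """Roll a batch of sessions up into the numbers you'd review at debrief."""
    total_minutes = sum(s.minutes for s in sessions)
    return {
        "sessions": len(sessions),
        "total_hours": round(total_minutes / 60, 1),
        "bugs": sum(s.bugs for s in sessions),
        "issues": sum(s.issues for s in sessions),
        "avg_on_charter_pct": round(
            sum(s.on_charter_pct for s in sessions) / len(sessions)
        ),
    }


if __name__ == "__main__":
    demo = [
        Session("Explore import with malformed files", "pair-1", 90, 80, 3, 1),
        Session("Survey the admin UI for consistency", "pair-2", 60, 70, 1, 2),
    ]
    print(summarize(demo))
```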
After John, David Warren presented an experience report on exploratory testing in an agile project, where he worked closely with the developers and customers. His talk was interesting for both its exploratory testing content and its agile content. David shared how he worked with the customer to help them structure their testing, and he spoke about the difficulty of implementing automation when he knew he would be leaving the project and the customer would have to take over maintenance of the scripts.
After David, I presented some of the heuristics I use when I test. I recycled a handout from a past talk (I probably could have updated it, but I didn't). I also referenced the Satisfice Heuristic Test Strategy Model, which I use a lot in my testing. For each heuristic, I tried to give a small example of how I've used it in the past. Hopefully it was helpful. I had fun talking about it.
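For those who haven't seen the Heuristic Test Strategy Model, one piece of it is the SFDIPOT mnemonic for product elements: Structure, Function, Data, Interfaces, Platform, Operations, and Time. Here is a small sketch of one way to apply it, turning the mnemonic into charter prompts for a feature. The prompt wording is my own paraphrase, not part of the model:

```python
# Illustrative sketch: pairing the HTSM "product elements" mnemonic (SFDIPOT)
# with a feature under test to prompt exploratory charters.
# The prompt wording below is my own paraphrase, not quoted from the model.
SFDIPOT = {
    "Structure": "What is it made of? Explore its files, modules, and components.",
    "Function": "What does it do? Exercise each claimed capability.",
    "Data": "What does it process? Try valid, invalid, and extreme inputs.",
    "Interfaces": "How do you reach it? Probe its UI, APIs, and imports/exports.",
    "Platform": "What does it depend on? Vary OS, browser, and configuration.",
    "Operations": "How will it be used? Mimic real users and real workflows.",
    "Time": "When does it run? Poke at timing, concurrency, and scheduling.",
}


def charter_prompts(feature: str) -> list[str]:
    """Generate one exploratory charter prompt per product element."""
    return [f"{element} of {feature}: {question}"
            for element, question in SFDIPOT.items()]


for prompt in charter_prompts("the search feature"):
    print(prompt)
```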
After my experience report, we decided to skip Jason Horn's experience report in favor of an activity. Hopefully Jason can attend the round-table next month to present his experience there. The activity involved paired exploratory testing of an application no one in the room (other than me) had seen. We tested the Open Certification question server (under development by Tim Coulter). Tim was kind enough to give us access to his development stream and an idea of what types of bugs he would be most interested in hearing about.
Armed with a mission given to us by the project coordinator, we paired off and got some testing done. There were seven groups. Each group shared a laptop, and some of the groups turned on BB TestAssistant to record their testing. We captured a few videos that you can watch here, here, here, and here. I find them easy to watch at 4x speed.
After about 25 to 30 minutes of testing, we stopped and debriefed as a group. We attempted to capture the techniques and tools used during testing and came up with the following list (it contains some duplicates and incomplete ideas). At next month's round-table, I want to review this list with the group to see what patterns we can draw out of it.
Finally, I want to remind workshop attendees again to please log their bugs. I know some already have, and Tim would appreciate the feedback.
This month's IWST was hands down the most fun I've had at a peer workshop. The energy level was incredible, we ran out of time (we skipped an entire experience report and stopped the testing exercise 30 minutes early), and we had a full house. Thanks to everyone who attended. I hope to see you all again at next month's round-table, where we can pick it up again. If you would like to attend the round-table, details can be found here. Round-tables are completely open to everyone. Just show up...