Software Testing Lightning Talks from IWST
Last month we held the November session of the Indianapolis Workshops on Software Testing (IWST). At the workshop we tried something a bit new. Instead of working with the same topic for five hours, we had each attendee present at least one short five-minute talk on a topic of their choice. Some attendees presented more than once. The participants in the workshop were the following:

  • John Galligan

  • Mike Goempel

  • Jason Horn

  • Michael Kelly

  • Ananth Krishnamoorthy

  • Panos Linos

  • Patrick Milligan

  • Jayadev Reddy

  • Chris Scales


In addition to the format switch-up, we also introduced some podcast equipment into the room to capture the talks and the follow-up discussion. Below you'll find links to the audio for each talk, along with some notes the IWST organizers put together to help people follow up on some of the topics discussed.

MCOASTER by Mike Kelly

Experiences Testing In Scrum by Patrick Milligan

More Experiences Testing In Scrum by Patrick Milligan

  • The basics of Scrum on Wikipedia.

  • Burn-down Charts on Wikipedia.

  • Check out the TWiki website for more information on that tool.

  • Check out the Bugzilla website for more information on that tool.

  • The MoSCoW acronym on Wikipedia.

  • Check out the Subversion website for more information on that tool.

  • Check out the Watir website for more information on that tool.


Screen Recording APIs by Jason Horn

Testing Mobile by Jayadev Reddy

  • The topic of using emulators for testing came up during the discussion. This post contains a nice listing of some emulators out there today with links for where to find them.


Challenges With Hiring Testers by Mike Goempel

Book Review: How To Break Web Software by Chris Scales

One Trick Ponies by Mike Kelly

  • Check out the Stella website for more information on that tool.

  • Check out the Website Grader website for more information on that tool.

  • Check out the BrowserMob website for more information on that tool.

  • Check out the Spoon website for more information on that tool.

  • Check out the WebPagetest website for more information on that tool.

  • Check out the Browsershots website for more information on that tool.

  • Check out the Web Developer extension website for more information on that tool.

  • During the talk Mike mentioned the concept of blink testing.

  • Check out the QuickTestingTips.com blog for short writeups on these (and similar) tools.


Testing In The Cloud by Jason Horn

  • Jason works for BlueLock, a company that offers cloud hosting and related services. You can learn more about their products on their website.

  • Jason mentioned VMware and Microsoft Hyper-V as examples of private clouds.

  • Jason talked a bit more about Sauce Labs. Check out their website for more on their hosted Selenium testing services. (There is a free plan available.) A minimal remote WebDriver sketch follows this list.
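
As a concrete (and purely illustrative) example of what hosted Selenium testing looks like from the test code's point of view, here is a minimal Python sketch that points Selenium at a remote grid instead of a local browser. The grid URL and credentials are placeholders I made up; check your provider's documentation (Sauce Labs or otherwise) for the real endpoint and capabilities.

    # Minimal sketch: run a test against a hosted Selenium grid instead of a
    # local browser. GRID_URL is a placeholder, not a real provider endpoint.
    from selenium import webdriver
    from selenium.webdriver.chrome.options import Options

    GRID_URL = "https://USERNAME:ACCESS_KEY@grid.example.com/wd/hub"  # placeholder

    driver = webdriver.Remote(command_executor=GRID_URL, options=Options())
    try:
        driver.get("http://www.example.com")
        assert "Example" in driver.title  # trivial check to show the round trip works
    finally:
        driver.quit()

The only difference from a local test is where the browser session lives; the test code itself stays the same, which is a big part of the appeal of hosted testing.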


ROI of Test Automation by Ananth Krishnamoorthy

Heuristics For Creating Automated Regression Tests by Mike Kelly

  • Mike mentioned that this content was co-developed with David Christiansen.

  • For more on all-pairs testing (and other combinatorics techniques), check out the Pairwise Testing website. A small sketch of the all-pairs idea follows this list.

  • During the questions and answers, Pat Milligan mentioned the tool Hudson. Learn more about Hudson and continuous integration on the Hudson website.
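
Since all-pairs testing can sound more magical than it is, here is a rough Python sketch of the idea using made-up parameters: rather than running every combination of parameter values, you pick a smaller set of tests that still covers every pair of values at least once. The greedy picker below is purely illustrative; dedicated pairwise tools do this job far better.

    # Rough sketch of all-pairs testing with hypothetical parameters.
    # Instead of running all combinations, greedily pick tests until every
    # pair of (parameter, value) assignments appears in at least one test.
    from itertools import combinations, product

    params = {
        "browser": ["IE", "Firefox", "Chrome"],
        "os": ["Windows", "Linux"],
        "role": ["admin", "guest"],
    }
    names = list(params)
    all_tests = [dict(zip(names, values)) for values in product(*params.values())]

    def pairs_in(test):
        # every (parameter, value) pairing exercised by this single test
        return {((a, test[a]), (b, test[b])) for a, b in combinations(names, 2)}

    needed = set().union(*(pairs_in(t) for t in all_tests))  # all pairs to cover
    chosen = []
    while needed:
        best = max(all_tests, key=lambda t: len(pairs_in(t) & needed))
        chosen.append(best)
        needed -= pairs_in(best)

    print(len(all_tests), "exhaustive combinations vs.", len(chosen), "pairwise tests")
    for test in chosen:
        print(test)

On this small example, the greedy pass covers every pairing with far fewer tests than the twelve exhaustive combinations.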


The Omega Tester by Jason Horn

If you like the format, you're in luck... We think we'll be doing something like this a couple of times in 2011. Enjoy the audio, and if you have any questions or suggestions, let me know. You can also help us plan the 2011 workshop topics on the IWST Meetup site, under Ideas.
Software Testing Centers of Excellence (CoE)
This weekend we held the September session of the Indianapolis Workshops on Software Testing (IWST) at Butler University. The topic of the five-hour workshop was software testing Centers of Excellence (CoE). The participants in the workshop were the following:

  • Andrew Andrada

  • Patrick Beeson

  • Howard Clark

  • Matt Dilts

  • Randy Fisher

  • Mike Goempel

  • Rick Grey

  • James H. Hill

  • Michael Kelly

  • Panos Linos

  • Natalie Mego

  • Hal Metz

  • Patrick Milligan

  • Charles Penn

  • Brad Tollefson

  • Bobby Washington

  • Tina Zaza


We started the workshop by going around the room and asking each person to comment on what they thought a Center of Excellence was, and what their experience was with the topic. In general, only a handful of people had either worked in a formal Center of Excellence or had experience building one out. The overwhelming feeling in the room was one of "I'm here to learn more about what people mean when they use that term" and "It all sounds like marketing rubbish to me." Okay, perhaps the rubbish part was me embellishing, but I think others besides me thought it - even if they didn't phrase it that way.

The first experience report came from me. I briefly presented Dean Meyer's five essential systems for organizations and shared some experiences of how I've used that model to help a couple of clients build out or fix their testing Centers of Excellence. I use the ISCMM mnemonic to remember the five systems:

  • Internal Economy: how money moves through the organization

  • Structure: the org chart

  • Culture: how people interact with one another, and what they value

  • Methods and Tools: how people do their work

  • Metrics and Rewards: how people are measured and rewarded


If you're not familiar with Meyer's work, I recommend his website or any of his short but effective books on the topic.

I didn't really provide any new insights into how to use the five systems. If you read the books or review the material on the website, you'll see that Meyer uses these systems to help diagnose and fix problems within organizations. That's how I use them as well - I just focus them on problems in testing organizations. I then provided some examples of each from past clients.

Meyer also spends a good deal of time talking about "products." A product is what your organization offers to the rest of the company. In a testing CoE, that might be products around general testing services, performance testing, security testing, usability testing, or test automation. Or it might be risk assessments, compliance audits, or other areas that sometimes tie in closely with the test organization. I personally use this idea of products as a quick test for identifying a CoE.

Meyer defines products as "things the customer owns or consumes." In his article on developing a service catalog, he points out that:
"...an effective catalog describes deliverables -- end results, not the tasks involved in producing them. Deliverables are generally described in nouns, not verbs. For example, IT sells solutions, not programming."

I believe that if your organization does not offer clear testing products, then it's not a CoE. It's just an organization that offers staff augmentation in the area of software testing. There is no technical excellence (in the form of culture, methods and tools, or metrics and rewards) that it brings to bear in order to deliver. To me, the term Center of Excellence implies that the "center" - that is, the organization that has branded itself as excellent in some way - has some secret formula that it bakes into its products, and it delivers that excellence to the rest of the organization by delivering those products.

After my experience report, Randy Fisher offered up his experiences with vendor selection criteria. Randy's company (a large insurance company) is going through the process of deciding whether they should build a CoE themselves or engage a vendor to help them build out the initial CoE. For Randy and his team, the business case for moving toward a CoE is to allow them to leverage strategic assets (people, process, and technology) to achieve operational efficiencies, reduce cost, improve software quality, and address business needs more effectively across all lines of business.

Randy and his team started with an evaluation pool of several vendors and, using the following weighted criteria, narrowed that list down to two key vendors (a simple scoring sketch follows the list):

  • Understanding of company’s objectives

  • Test Process Improvement (TPI) Strategy

  • Assessment phase duration

  • Output from Assessment phase

  • Metrics/Benchmarking

  • Experience in co-location

  • Risk Based Testing Approach

  • Standards, Frameworks, Templates

  • Consulting Cost

  • Expected ROI

  • Expected Cost Reduction

  • Special Service Offerings/Observations
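
For readers who have not run a weighted evaluation like this before, here is a tiny Python sketch of the mechanics. The criteria subset, weights, and vendor scores are entirely made up for illustration; Randy's actual numbers were not part of the workshop notes.

    # Hypothetical weighted-scoring sketch for narrowing a vendor list.
    # Weights and scores are invented for illustration only.
    weights = {
        "Understanding of objectives": 0.20,
        "TPI strategy": 0.15,
        "Metrics/benchmarking": 0.15,
        "Consulting cost": 0.25,
        "Expected ROI": 0.25,
    }

    # each vendor scored 1 (poor) through 5 (excellent) per criterion
    scores = {
        "Vendor A": {"Understanding of objectives": 4, "TPI strategy": 3,
                     "Metrics/benchmarking": 5, "Consulting cost": 2, "Expected ROI": 4},
        "Vendor B": {"Understanding of objectives": 3, "TPI strategy": 4,
                     "Metrics/benchmarking": 3, "Consulting cost": 4, "Expected ROI": 3},
    }

    for vendor, per_criterion in scores.items():
        total = sum(weights[c] * per_criterion[c] for c in weights)
        print(f"{vendor}: weighted score {total:.2f}")

The arithmetic is trivial; the real work is agreeing on the criteria and weights before you start scoring vendors.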


After this initial evaluation, Randy offers the following advice for those looking to undertake a similar exercise:

  1. Have specific objectives in mind based on your organization when you meet with the vendors (this list contains a sampling of what I used…)

    • Create a benchmark (internally across systems and with peers in the industry) to facilitate ongoing measurement of organizational test maturity

    • Develop a roadmap for testing capability and maturity improvement

    • Leverage experience and test assets, including standards, frameworks, templates, tools, etc.

    • Assess the use of tools and perform a gap analysis to determine the need for additional tooling

    • Define touch points and handoffs between various groups (upstream/downstream) as they relate to testing

    • Assess test environments and create the appropriate standards and tools to address preparation, setup & maintenance

    • Utilize the vendor's knowledge (both functional and insurance domain) to facilitate the creation of an enterprise test bed and test data management process

    • Assist with the improvement of capacity planning for test teams

    • Document the test strategy and process differences between groups



  2. Choose your selection criteria based on the factors that are important to you – nobody knows you like you do...

  3. Talk to as many vendors as you can.

  4. Don’t be afraid to negotiate cost and participation level for the engagement.


During the discussion that followed Randy's experience report, there were some interesting questions about his goals. That is, what pain are they trying to solve by moving to a CoE? Randy indicated that predictability (times/dates, quality, etc.) was a big factor from a project perspective. He also indicated that he wanted his testers to have better tools for knowledge sharing. At the end of the day, he hopes a CoE makes it easier for them to do their jobs. Hal Metz had an interesting insight that, for him, the goal should be to create an organization that enables the testers to increase their reputation (either through technical expertise or the ability to deliver).

After Randy's experience report, Howard Clark shared an actual example of a slide deck he helped a client prepare to sell a test automation CoE internally. The slide deck walked through, step by step, what the executive would need to address and how building out the CoE would add value in their environment. I'd LOVE to share the slides, but can't. Howard has committed to distilling those slides down into either a series of posts on his blog or a doctored set of slides. Once I get more info, I'll post an update here.

Either way, I think Howard's talk did a great job of moving the conversation from the abstract to the specific. This was a real business case for why they should build one, what it should look like, and what the challenges would be. I liked it because it used the client's language and addressed their specific concerns. That's one reason why I'm sort of glad he can't share the slides. The deck is so specific that it would be a tragedy for someone to pull it down and try to use it in their own context.

That idea - that CoEs are always specific to a particular company's context - was something Howard tried to drive home throughout the day in his questions and comments. I think it's a critical point. No matter what you think a CoE is, it's likely different from company to company. And that's good. But it creates a fair bit of confusion when we talk about CoEs.

Finally, when we were all done presenting, Charles Penn got up and presented a summary of some of the trends he noticed across the various talks and discussion. In no particular order (and in my words, not his):

  • Building out a CoE almost necessitates the role (formal or informal) of a librarian: someone who owns tagging and organizing all the documents, templates, and other information. It's not enough just to define it and collect it - someone has to manage it. (Some organizations call them knowledge managers.)

  • CoE seems largely to just be a marketing term. It means whatever you want it to mean.

  • There seems to be a desire to keep ownership of CoEs internal to the company.

  • There are assorted long-term effects of moving toward a CoE model, and those need to be taken into account when the decision is made. It's not a 6-month decision; it's a multi-year decision.

  • There seem to be A LOT of "scattered" testers. That is, testers who are geographically dispersed within the various companies discussed. A large focus of the CoE model seems to be finding ways to deal with that problem.


There were more, but I either didn't capture them or couldn't find a way to effectively share them without a lot of context.

All said and done, it was a great workshop. We had excellent attendance, and Butler was a great host. I hope they have us back for future workshops. We now need to start planning for 2011. Our current thinking is to hold around four workshops. We already have one topic selected given the amount of energy around it (teaching software testing - I'll need to let the WTST people know we are doing a session on that), but that leaves three workshops currently up in the air. I'd like to try one on testing in Rails, but given how the one earlier this year fell flat, perhaps that's not a good topic.

If you'd like to know more about IWST, check out the website: www.IndianapolisWorkshops.com

If you'd like to participate next year or have ideas for a topic, drop me a line: mike@michaeldkelly.com