Posts in Test Management
CoEs might need salespeople too
Yesterday I read this article by Jill Konrath on "7 Sales Mistakes Guaranteed to Make Your New Service Fail." It reminded me of when I was working to build out a centralized testing group in a large organization. In many ways, I was the business developer for my team within the organization. We may have been a Center of Excellence (CoE), but project teams weren't required to use us. We had to earn our business.

The tips in the article that resonated with me the most were:

Setting up meetings to update customers about the new product or service can lead to trouble. Arranging the meeting isn't the mistake—just its premise. If sales reps tell customers they're bringing information about the new product or service, that's exactly what customers expect the meeting to be about. Sellers then find it exceedingly difficult to switch into a questioning mode—an essential step for determining valid business and financial reasons for changing. Instead they're expected to talk, talk, talk—and boy, do they ever!


and

If salespeople don't have a clearly defined next step implanted in their brains prior to the call, they are doomed. Just sharing exciting new product information gets sellers nowhere. Unless they have a clearly defined objective before the call and are ready to offer logical next steps, they'll be left sitting by the phone waiting for it to ring.


We "sold" automation and performance testing "products" to project teams. Getting teams to use us, and getting them to pay for enhancements to the products we provided, was in every way a sales call. Good advice - read the entire article.
Questions to help clarify test status
Stealing from a post I did on SearchSoftwareQuality.com, here are some questions I use to clarify testing status when doing debriefs (a sketch of one way to capture the answers follows the list):

  • What was your mission for this session?

  • What did you test and what did you find?

  • What did you not test (and why)?

  • How does your testing affect the remaining testing for the project? Do we need to add new charters or re-prioritize the remaining work?

  • Is there anything you could have had that would have made your testing go faster or might have made your job easier?

  • How do you feel about your testing?
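
If it helps to keep the answers in a consistent shape from session to session, here's a minimal sketch of one way to record them. It's a hypothetical Python structure used purely for illustration; the field names simply mirror the questions above and aren't part of any particular tool or template.

    from dataclasses import dataclass, field
    from typing import List

    # Hypothetical record for a single session debrief; field names mirror
    # the questions above and are illustrative only.
    @dataclass
    class SessionDebrief:
        mission: str                    # What was your mission for this session?
        tested: List[str]               # What did you test...
        findings: List[str]             # ...and what did you find?
        not_tested: List[str]           # What did you not test (and why)?
        impact_on_remaining_work: str   # New charters? Re-prioritized work?
        blockers: List[str] = field(default_factory=list)  # What would have made it faster or easier?
        tester_confidence: str = ""     # How do you feel about your testing?

    # Example of capturing one debrief (made-up content).
    debrief = SessionDebrief(
        mission="Explore the import wizard for data-validation gaps",
        tested=["CSV import", "field mapping"],
        findings=["Mapping silently drops unknown columns"],
        not_tested=["Excel import (no sample files available)"],
        impact_on_remaining_work="Add a charter for Excel import; re-prioritize mapping edge cases",
        blockers=["Sample files with malformed encodings"],
        tester_confidence="Confident in CSV paths; low confidence elsewhere",
    )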

Confirm estimates with a different approach
When estimating, I often start with a bottom-up approach. That is, I take the individual tasks, estimate each of them, and look at the total for the project. Once that's done, I try to step back and check my estimate with a top-down approach: I ask how many people I think I need and for how long, and see what the numbers work out to. I'll keep going, making changes to both estimates, until the numbers converge. It's a nice, simple way to check the reasonableness of an estimate. There are other techniques that can be used as well; what's important isn't the technique, it's that you do some sort of double-check before you make the commitment.
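
As a concrete illustration of that double-check, here's a small sketch with made-up numbers. The tasks, team size, and durations are purely hypothetical; the point is only to show the two estimates being compared until they're close enough to trust.

    # Bottom-up: sum the individual task estimates (all numbers are made up).
    task_estimates_days = {
        "test planning": 5,
        "test design": 15,
        "test execution (two cycles)": 30,
        "defect retest and regression": 10,
        "reporting and closeout": 5,
    }
    bottom_up_days = sum(task_estimates_days.values())   # 65 person-days

    # Top-down: how many people do I think I need, and for how long?
    people = 2
    weeks = 7
    top_down_days = people * weeks * 5                    # 70 person-days at 5 days/week

    # If the two numbers are far apart, adjust both and repeat until they converge.
    gap = abs(bottom_up_days - top_down_days) / max(bottom_up_days, top_down_days)
    print(f"bottom-up: {bottom_up_days}, top-down: {top_down_days}, gap: {gap:.0%}")

In this made-up example the two estimates are about 7% apart, which I'd call converged; a larger gap would send me back to revisit both.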
Knowing the details
I've seen a lot of test managers (and testing project managers) who don't know the details of what's happening on their projects. They attend the meetings, keep plans up to date, hand out assignments to the team, and make sure all the charts are current. But they never really develop a working knowledge of the project or a feel for the testing. I don't like to work that way. When I'm managing a testing project, I want to know the details.

There are a couple of key ways I do this:

  • I try to contribute to writing, editing, or (at a minimum) reviewing all the test strategies, plans, and schedules (many of the larger projects I've worked on have several).

  • I review test cases, scripts, charters, or results (depending on the team's approach). I also review the test data being used.

  • I read the defects - all of them. I want to know where the issues are, what they are, and what they look like.


I think developing an understanding of the defects is the most important of these for me. It helps me build an intuition about the software and its state. It also keeps me in tune with what work is actually taking place. The other stuff is important, but if I really want to know what's going on, I look at what's getting executed and what's coming out of that work.
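
For larger projects, I sometimes supplement the reading with a quick roll-up of the defect data. The sketch below assumes a CSV export from the defect tracker with "component" and "severity" columns; the file name and column names are hypothetical, not any particular tool's format.

    import csv
    from collections import Counter

    # Summarize a hypothetical defect-tracker export by component and severity.
    by_component = Counter()
    by_severity = Counter()

    with open("defects_export.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            by_component[row["component"]] += 1
            by_severity[row["severity"]] += 1

    print("Defects by component:")
    for component, count in by_component.most_common():
        print(f"  {component}: {count}")

    print("Defects by severity:")
    for severity, count in by_severity.most_common():
        print(f"  {severity}: {count}")

A summary like this doesn't replace reading the defects themselves, but it's a quick way to see where the issues are clustering.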
Is your testing process this clear?
In my opinion, one of the biggest success factors for a centralized testing organization is a clear engagement model: one that starts with new project intake and ends with project closeout, follow-up, and portfolio management. I've tried to tackle this problem at two different organizations, and at each one I struggled to produce a clear, easy-to-follow definition of what we did.

Take a look at how HUGE laid out their process. I understand that it's marketing material, so it's obviously going to be pretty, but look past that. They lay out seven clear phases for engagement. Each phase has a summary of the steps that will be undertaken in that phase.

If your centralized group offers products and services, imagine them listed next to those phases - the specific products and services project teams could expect during each phase. I find it a very simple way to visualize what your organization might offer to project teams.