Posts in Software Testing
Gaining insight into your testing
In a recent post on How To Gain Insight, Peter Vajda shares his thoughts about what it takes to "gain insight and contact with inner wisdom and guidance." In the post, he lays out four "steps" for developing that insight. While I think the post is fairly generic, there are some lessons there that resonate with me.

For example, in step one, you have to care about what you're attempting to gain insight into. Not only that, but you have to care about it enough to shut out other sources of noise, at least enough to allow you to focus. The goal is to be "not distracted by thoughts, e-mail, cell phones, or other sources of ‘noise’."

I see this a lot in testing. Many testers sit down to test something, but don't take the time to create an environment where they can have insights into the testing they are doing. They don't create an environment free of noise. They often don't have an "intensity of purpose."

One way to help develop an intensity of purpose for your testing is to develop a clear mission for your testing. Tell yourself that for the next hour, two hours, or even thirty minutes, you'll be doing X. And don't settle for some generic statement of work; give your mission some life. Give it a story or a specific purpose, something that points to coverage and risk.
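To make that concrete, here's a sketch of what such a mission might look like, written as a short charter in the spirit of session-based exploratory testing. The feature, timebox, and risks here are hypothetical, just to show the shape:

```text
CHARTER: Explore the CSV import feature with malformed and oversized
         files, looking for data-loss and crash bugs.
TIMEBOX: 90 minutes
COVERAGE: file parsing, error messages, partial-import recovery
RISK: silently dropped rows could corrupt customer data
```

A mission written this way tells you what to focus on, when to stop, and what "done" looks like for the session.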

In step two, he talks about skills. I love this. It ties directly into the skills and tactics that have been outlined for exploratory testing. "If any specific knowledge or experience is required, then you must gain that knowledge and pursue that experience."

In step three, he talks about the ability to focus "in the now." This ties into some of the dynamics described in the skills and tactics for exploratory testing. It's thinking vs. doing, quick vs. slow, or touring vs. testing. "An insight most often arises when you suspend thinking and proceed without forcing."

In perhaps the best quote (for me) in the post, he points to practice:


"It’s a process that needs to be cultivated and practiced with consistency. No practice, no consistency, no insight."


Well said.
The making of an expert
In a recent issue of the HBR, I read The Making of an Expert by K. Anders Ericsson, Michael J. Prietula, and Edward T. Cokely. In the article, the authors outline their research on the topic of building expertise within a specific area. I've followed Ericsson's work in the past, and have written on this particular topic a couple of times here, here, and here.

One of the first things the authors point out in the article is that it's not necessarily raw intelligence that makes one an expert; the approach one takes matters just as much.


Subsequent research indicating that there is no correlation between IQ and expert performance in fields such as chess, music, sports, and medicine has borne out his findings. [...] Above all, if you want to achieve top performance as a manager and a leader, you’ve got to forget the folklore about genius that makes many people think they cannot take a scientific approach to developing expertise.


I like this quote because it fits with my philosophy of what makes good testers and how testers learn. I identify as a member of the context-driven community, and as such I think most testers would be well served by developing a deeper understanding of scientific and systems thinking, heuristics and biases, and project and business management. I often encourage testers I work with to explore areas of interest outside of software testing and to bring something back to the field from those interests.

In particular, this quote reminds me of the work James and Jon Bach have done in the area of exploratory testing skills and tactics. In our class on exploratory testing, David Christiansen and I use those skills and tactics as a framework for building expertise in exploratory testing. We talk about how each skill can be developed, how you could identify if you were doing it wrong, and we provide exercises for getting better. I think James has done the most work in our industry in trying to develop a scientific approach to developing expertise in software testing.

However, an approach isn't enough. You also need to practice. And not just spending 30 minutes a day poking around software for bugs, but deliberate practice, where you work on a specific skill or tactic in your testing. It's the difference between playing a pickup game of basketball (which is one type of practice) and deciding to spend 30 minutes working on specific shots or working with the rest of the team on plays that involve two or more people working together.


Deliberate practice involves two kinds of learning: improving the skills you already have and extending the reach and range of your skills. The enormous concentration required to undertake these twin tasks limits the amount of time you can spend doing them. [...] It is interesting to note that across a wide range of experts, including athletes, novelists, and musicians, very few appear to be able to engage in more than four or five hours of high concentration and deliberate practice at a time. In fact, most expert teachers and scientists set aside only a couple of hours a day, typically in the morning, for their most demanding mental activities, such as writing about new ideas.


I know that when I'm writing about testing (blogging, articles, or even the book I'm working on), it's normally in the morning. I also know that when I'm trying to learn about a technique or some new technology, I don't just read about it, I have to do it. I need BICHOK time (butt in chair, hands on keyboard - it's a term writers use). If it's a technique, I've got to try it out a couple of times on problems that are progressively more difficult. If it's a technology, I've got to try to implement something with it, or play around with some existing implementation, to feel like I really understand it.

Sometimes my practice is structured. When I was learning Ruby I facilitated some internal courses at the company I was working at. And when I learn a new tool (like IBM Rational Quality Manager for example), I like to outline what my learning objectives are before I begin. That way I know the time I spend learning is more productive. When my practice is unstructured, I find that I go where my energy is. For example, when I was learning regular expressions for the first time (I say for the first time because I have to relearn them every few years), I alternated between Google searches for references, tools for pattern matching, and writing code. The choice of where to go at any given time was relatively arbitrary.

After practice, the authors talk about mentors:


Research on world-class performers has confirmed Galamian’s observation. It also has shown that future experts need different kinds of teachers at different stages of their development. In the beginning, most are coached by local teachers, people who can give generously of their time and praise. Later on, however, it is essential that performers seek out more-advanced teachers to keep improving their skills. Eventually, all top performers work closely with teachers who have themselves reached international levels of achievement.


My first mentor was a guy named James (Jim) Tate. He was the first test manager I worked for, and did a great job of opening my eyes to some of the struggles that testers often face. He also spent a lot of time working with me, helping me understand some of the system factors affecting our testing. I learned a lot about developer/tester and tester/project manager relationships from Jim.

After Jim, I ran into the context-driven community via the Austin Workshop on Test Automation (AWTA). There I met James Bach. James has been kind enough to spend one-on-one time working with me, helping me better understand the practice of software testing. His focus is on the mechanics - what we actually do as testers. He's helped make me a better tester. He's also been a strong role model for how I like to consult.

I would say the next two big influences have been Julie DeSutter and Charles Priller. They've helped me understand what's needed to successfully build and lead a team. Much of my management style comes from them, and many of my influences are authors that they've recommended. When I don't know what to do, I still turn to them for advice.

As I look back at the last several years, I can identify the largest influences easily. But more interesting than the large influences are the small ones: the people you learn from on a daily basis. There are hundreds of people who have taken the time to teach me something (often knowingly, sometimes not). A coach isn't always a formal role. Sometimes it is, but you can let people coach you even without that.

For example, much of what I've learned about testing from Cem Kaner isn't from speaking with him. It's from watching his videos, reading his writings, and teaching a course using one of his books. Now, I don't talk with Cem on a daily basis, but we talk on occasion. You'd think we'd talk about testing more, but that's just not our relationship. It's odd for me, because I feel like I've learned so much from him. In testing he has been a coach of mine, just not in the traditional way. It's only been in the last couple of years that he's started to actively coach me.


The development of expertise requires coaches who are capable of giving constructive, even painful, feedback. [...] The elite performers we studied knew what they were doing right and concentrated on what they were doing wrong. They deliberately picked unsentimental coaches who would challenge them and drive them to higher levels of performance. The best coaches also identify aspects of your performance that will need to be improved at your next level of skill.


I like painful feedback. If you ask for it, there are people who will give it to you. And sometimes it's not just coaches, but communities. There are plenty of lists and forums where you can get feedback. There are also plenty of workshops where the feedback can be honest and constructive. I prefer face-to-face feedback, but I'll take it any way I can get it.

When I look for coaches, I look for people who are willing to give me difficult feedback, are patient enough to work with me through my knee-jerk response to that feedback (which is often defensive), and who can provide guidance on what I can do to improve my performance. Sometimes the advice is a reference for something to read, sometimes it's an exercise for practice, and other times it's working through a problem together.

Overall, it was a good article. For me, it reinforced some of the ideas I have around practice and mentoring. As always, I'd be interested in other people's experiences with practice and mentors.