Stapler Testing
At PNSQC this week I attended Johanna Rothman's workshop titled Interviewing with Ease (based on her new book Hiring the Best Knowledge Workers, Techies & Nerds: The Secrets & Science of Hiring Technical People). During the workshop, a group of gentlemen from Envision mentioned a technique they sometimes use during an interview to assess an interviewee's testing abilities. Their technique? They ask the interviewee to test a stapler.



I found this to be quite an interesting challenge. How would you test a stapler? It's simple and easy to understand (no industry-specific knowledge required). There is no underlying technology you need to know. Heck, odds are you're even an expert user!

Well, I thought I would take a whack at it. The following is my attempt to test a stapler. If you have the patience to read all of the tests I thought of, then please comment on tests that I missed (I'm curious to see how complete my list is). If you don't read all of them, that's fine too. But scroll to the end and read the last couple of paragraphs after the list, where I talk about the significance of this exercise.

I chose to test the Stanley Bostitch B660-Black Anti-Jam Desktop Stapler. Following the exploratory process, I thought I would take a minute to learn about the company whose stapler I'm testing and a little bit about staplers in general.

Based on that information, and that information alone, I came up with the following tests:

Requirements based testing:
Dimensions:
1. Is the stapler 1 inch and/or 43.18 mm tall?
2. Is the stapler 2 inches and/or 61.72 mm long?
3. Is the stapler 6 inches and/or 176.28 mm wide?
4. Does the stapler weigh .5 lbs and/or 0.23 kilos?
5. When packaged, are the package's dimensions 1.73" x 6.79" x 2.45"?
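
(An aside: if this were a software product rather than a hardware one, requirements checks like items 1 through 4 would be natural candidates for a small parameterized test. The sketch below is hypothetical; the SPEC table simply restates the inch and pound figures above, and measure_stapler() stands in for whatever measurement process we would actually use.)

    # Hypothetical sketch: requirements 1-4 restated as parameterized checks.
    import pytest

    SPEC = {
        "height_in": 1.0,
        "length_in": 2.0,
        "width_in": 6.0,
        "weight_lb": 0.5,
    }

    def measure_stapler():
        # Stand-in for however we actually measure a sample unit.
        return {"height_in": 1.0, "length_in": 2.0, "width_in": 6.0, "weight_lb": 0.5}

    @pytest.mark.parametrize("attribute,expected", sorted(SPEC.items()))
    def test_matches_requirement(attribute, expected):
        observed = measure_stapler()[attribute]
        assert observed == pytest.approx(expected, rel=0.01), (
            f"{attribute}: expected {expected}, observed {observed}"
        )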

Color:
6. Is the stapler black?
7. Is it ok for the anvil and the staple magazine to be silver?
8. Is it ok for the company logo on the front to be white? (After all, the logo on the top is in black...)

Features:
9. Does the stapler open 180 degrees for tacking?
10. Does the anvil have a clincher for curling staples in?
11. Does the anvil have a clincher for curling staples out?
12. Can I change the anvil settings (most likely by rotating the anvil)?
13. Can I see the staple supply indicator?
14. If the stapler is full, does the stapler indicator show staples?
15. If the stapler is empty, does it not show staples?

A quick glance at the stapler tells me that I can see seven staples in the indicator, and that the indicator does not show the first staple in the stapler. Based on this, I can ask the following questions:
16. Is it ok that the staple indicator reads empty with fewer than five staples?
17. How many staples should the staple indicator show?
18. Should there be a staple indicator on both sides of the stapler (there is one on each side currently; is this redundant)?

19. Does the stapler hold 210 standard staples?
20. Does the stapler hold 211 standard staples?
21. What are the defined dimensions of a standard staple? (I tried to find this but had a problem: I can find the staple "leg" size, but not the standard length and width.)
22. What is Anti-Jam technology? (I looked but could not find anything specific.)
23. What is an acceptable jam rate to be called Anti-Jam? (For example: is it one jam per 100,000 staples?)
24. What factors affect the Anti-Jam technology?

Warranty:
25. What constitutes normal wear (for the stapler as a whole)?
26. How did we gather the normal wear statistics (for the stapler as a whole)?
27. Based on normal wear statistics (for the stapler as a whole), set up a test lab and a mechanized stapling apparatus (a test harness, if you will) to push multiple staplers and test the effects of normal wear against our requirements. What do our normal wear performance results look like (for the stapler as a whole)? (A sketch of such a harness follows this list.)
28. What do we define as neglect?
29. Where did we get our definition of neglect?
30. What tests can we run to test neglect? (For example: if we define neglect as leaving your stapler in a bathtub full of water (do not ask how it got there!) for three weeks and then returning it because it rusted, should we test for the effects of rust on our product?) (I see many tests relating to neglect, but I need to know more about what we mean by neglect.)
31. What do we define as abuse?
32. Where did we get our definition of abuse?
33. What tests can we run to test abuse? (I see many tests relating to abuse, but I need to know more about what we mean by abuse.)
34. What do we define as an accident?
35. Where did we get our definition of an accident?
36. What tests can we run to test accidents? (I see many tests relating to accidents, but I need to know more about what we mean by accident.)
37. What is the normal wear on a drive blade? How long should one last?
38. Based on normal wear statistics, set up a test lab and a mechanized stapling apparatus (a test harness, if you will) to push multiple staplers and test the effects of normal wear on drive blades. What do our normal wear performance results look like?
39. What is the normal wear on a bumper? How long should one last?
40. Based on normal wear statistics, set up a test lab and a mechanized stapling apparatus (a test harness, if you will) to push multiple staplers and test the effects of normal wear on bumpers. What do our normal wear performance results look like?
41. What is the normal wear on an o-ring? How long should one last?
42. Based on normal wear statistics, set up a test lab and a mechanized stapling apparatus (a test harness, if you will) to push multiple staplers and test the effects of normal wear on o-rings. What do our normal wear performance results look like?
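
Items 27, 38, 40, and 42 all call for the same kind of mechanized apparatus, so here is a rough idea of what I mean by a harness, translated into software terms. Everything below is a hypothetical sketch: cycle_stapler() and inspect() are stand-ins for whatever the real rig would do, and the failure rate is invented.

    # Hypothetical endurance-harness sketch: run repeated cycles and record
    # the first cycle at which each inspected part degrades.
    import csv
    import random

    PARTS = ["drive blade", "bumper", "o-ring"]

    def cycle_stapler():
        """Stand-in for one mechanized staple cycle on the real apparatus."""
        pass

    def inspect(part):
        """Stand-in for a pass/fail inspection; the failure rate is invented."""
        return random.random() > 1e-5

    def run_endurance(max_cycles=100_000, report_path="wear_report.csv"):
        first_failure = {}
        for cycle in range(1, max_cycles + 1):
            cycle_stapler()
            for part in PARTS:
                if part not in first_failure and not inspect(part):
                    first_failure[part] = cycle
        with open(report_path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["part", "first_failed_cycle"])
            for part in PARTS:
                writer.writerow([part, first_failure.get(part, f">{max_cycles}")])

    if __name__ == "__main__":
        run_endurance()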

Packaging:
43. Is there a copy of the requirements in the packaging?
44. Is there a copy of the warranty in the packaging?
45. Is the product clearly identified on the packaging?
46. Are the package's dimensions 1.73" x 6.79" x 2.45"?
47. Is the packaging easy to open for our standard consumer?
48. Is our packaging harmful to the environment?
49. Does our packaging contain any claims we did not test against?
50. Does the packaging/website/catalog show similar information?
51. Are the packaging and warranty printed in the language of the market where the stapler is sold?

Parallel testing:
52. How well does this stapler compare to other staplers we make?
53. How well does this stapler compare to our competitors' staplers?
54. How do our performance stats (see below) compare to our competitors'?

Scenario testing:
55. Use the stapler in a typical corporate office day for your average knowledge worker.
56. Use the stapler in a typical office day for a secretary.
57. Use the stapler in a typical day for a teacher.
58. Use the stapler in a typical day for a student.
59. Use the stapler in a typical day for a waiter.
60. Use the stapler in a typical day for a checkout clerk.
61. Use the stapler in a typical day for a shipping yard attendant.
62. Use the stapler in a typical day for an event promoter.
63. Use the stapler in a typical day for a construction foreman.
64. Use the stapler in a typical day for an editor.
65. Use the stapler in a typical day for a writer.

Usability testing:
66. Is the design self-consistent?
67. Is the feature set the minimum necessary to meet the stated requirements?
68. Is this product similar to competitors' products with the same or similar requirements?
69. Do we have customer complaints on this or similar products that we can use as tests?
70. Do we have customers who can beta test this product for us?
71. If the standards for a "standard" staple change, how hard will it be to retrofit our staplers?
72. How long does it take a user to re-load the stapler?
73. How long does it take a user to figure out how to tack the stapler?
74. How long does it take a user to figure out how to use the stapler?
75. Does our stapler contain text telling us what type of staples it takes?
76. Does our stapler contain text telling us what make and model number it is?
77. Did the users want an automatic stapler?
78. Is this the most attractive shade of black? Should it be flat instead of gloss?
79. Is the font on the packaging the right type and size?
80. If our stapler comes back to us with a defect, how hard will it be to fix and ship back to the customer?
81. Who are our users? What are they looking for in a stapler?
82. In what environment will they be using our stapler?

Function testing:
83. Is this an appropriate implementation of stapler technology?
84. Does it have a base?
85. Does it have a body?
86. Does it have a rear pivot?
87. Does it have a pivot spring?
88. Does it have the ability to engage a tacking function?
89. Does it have a staple magazine?
90. Does it have a staple pusher?
91. Does it have a magazine tension spring?
92. Does it have a ramhead?
93. Does it have a drive blade?
94. Does it have a staple exit?
95. Does it have an anvil?
96. How loud is the sound of the stapler stapling? Is this acceptable?
97. What happens if I staple materials other than paper?
98. What happens if I staple nothing at all?

Fault Injection:
99. If I remove the rubber sheet from the base, will the stapler still work?
100. If I remove the rubber sheet from the base, will the stapler slip off my desk while I apply pressure to staple something?
101. If I remove the spring that rotates the anvil, can I still rotate the anvil?
102. If I remove the spring that rotates the anvil, will the anvil remain on the desired setting?
103. If I remove the spring that rotates the anvil, will the stapler still staple correctly?
104. If I flatten one of the small metal tabs that keep the stapler from tacking, is the other tab sufficient to keep the stapler from tacking?
105. If I remove both of the small metal tabs that keep the stapler from tacking, does the stapler still staple correctly?
106. If I remove the spring attached to the staple pusher, can I still staple?
107. If I remove the staple pusher head, can I still staple?
108. If I remove the drive blade, can I still staple?
109. If I remove the anvil, can I still staple?
110. If I remove the body, can I still staple?
111. If I remove the o-rings, can I still staple?
112. What happens if I clog the staple exit? How big of an obstruction does it need to be to stop the stapler from working?
113. If a staple is jammed, how easy is it to remove the staple?
114. If I remove the spring below the staple magazine, can I still tack?
115. If I remove the spring below the staple magazine, can I still staple?
116. If I remove the spring above the staple magazine, can I still staple?
117. If I bend the staple magazine, can I still staple?
118. What if I use a staple other than a standard staple?
119. What if I staple something other than a paper-based product (plastic, wood, etc...)?
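
Fault injection is the category that maps most directly onto software: deliberately remove or break one dependency at a time and see whether the product degrades gracefully or fails outright. A minimal sketch of that idea, assuming a made-up Stapler class whose parts we can knock out one by one:

    # Hypothetical fault-injection sketch: disable one "part" (dependency) at
    # a time and record whether the core operation still succeeds.
    class Stapler:
        def __init__(self):
            self.parts = {"drive blade": True, "pusher spring": True, "anvil": True}

        def staple(self, sheets=2):
            # The core operation only works if every part is present.
            if not all(self.parts.values()):
                raise RuntimeError("staple failed: missing part")
            return f"stapled {sheets} sheets"

    def inject_fault(stapler, part):
        stapler.parts[part] = False  # simulate removing the part

    def fault_injection_matrix():
        results = {}
        for part in Stapler().parts:
            unit = Stapler()
            inject_fault(unit, part)
            try:
                unit.staple()
                results[part] = "still works"
            except RuntimeError as exc:
                results[part] = f"fails: {exc}"
        return results

    if __name__ == "__main__":
        for part, outcome in fault_injection_matrix().items():
            print(f"without {part}: {outcome}")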

Stress testing:
120. At what temperature does the plastic melt? Is that acceptable?
121. At what low temperature does the stapler cease to function (springs, rear pivot, etc...)? Is that acceptable?
122. How many pounds of pressure, while using the stapler in the recommended way, does it take to break the materials of the stapler? Is that acceptable?
123. What different types of staples will work in the stapler?
124. Does the stapler work under water?
125. Does the stapler work in the desert or on the beach (effect of sand on moving parts)?
126. Does the stapler conduct heat (what if I leave it on the stove and then try to move it)?
127. Does the stapler conduct electricity?
128. How impact resistant is the stapler?
129. If I drop it from my desk, what happens?
130. What if I drop it from the second story?

I bet you never thought of that as impact analysis!

Performance testing:
131. How long does it take a user to switch to tacking mode?
132. How long does it take a user to switch to standard mode?
133. How many staples can be stapled in the stapler's useful life?
134. How many times can the anvil be rotated before there is a noticeable degradation in the anvil spring?
135. How long does it take to successfully complete a single staple in tacking mode?
136. How long does it take to successfully complete a single staple in standard mode?
137. How many times can the stapler be switched between standard and tacking mode before there is a noticeable loss of "grip" on the tacking limiting tabs?
138. How many times can the stapler be reloaded with staples before there is a noticeable loss of functionality on the staple pusher spring?
139. How many staples can a drive blade drive before there is a noticeable effect on stapler performance (loss of ability to drive staples, clogs, etc...)?
140. In tacking mode, how far does a staple travel if shot in the air? (I don't know if this is a requirement, but I want to know in case I decide to sell this to college students for staple fights.)
141. How many pieces of "standard" paper can be stapled before there is a noticeable effect on performance?

Regulations:
142. Is there some governing body that must approve the safety of our product?
143. Do we require any documentation for them as proof?
144. Does our stapler tell us what country it was manufactured in?

And that's all I came up with in the hour I had allotted myself (I was assuming I had an hour in the interview).

What I hope to illustrate by all this is the following. This is a simple stapler. I came up with 144 tests in about an hour. I'll grant that some of the tests are impractical, and some are probably not worded as well as they could be or need more information, but it's still 144 tests. Your average piece of software is infinitely more complex than a stapler. And odds are your software interacts with more than one user at a time, relies on more than one input, has more than two states (maybe four if you're picky), has more than a handful of features, and runs in an operating environment with more going on than your top desk drawer.
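
To put a rough number on that comparison: even a toy model of a software product multiplies out to far more combinations than my 144 stapler tests. The counts below are invented purely for illustration.

    # Back-of-the-envelope combinatorics; every count here is made up.
    from math import prod

    users = 3                  # single user, a few, many
    states = 4                 # application states
    environments = 6           # OS/browser/locale combinations
    inputs_per_form = 10       # fields on one form
    values_per_input = 5       # "interesting" values per field

    input_combinations = values_per_input ** inputs_per_form   # 5^10 = 9,765,625
    total = prod([users, states, environments, input_combinations])
    print(f"{total:,} combinations")                           # 703,125,000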

As testers, we are faced with an impossible task. We can't possibly test everything, even if we could enumerate everything that needed testing. Our job is one of discovery. That we can look at a stapler and ask how it could be better, look at an online shopping cart and ask how it could be more secure, or ask how a life-support system could be more reliable, is a wonderful thing! Someone pays you to break stuff and give developers a headache, and if you're the bottleneck for your project, you get all the attention! Why doesn't everyone want to be a tester?

Either way, today I asked the question "how far does a staple travel if shot in the air?" And that makes today a good day for me.
Parallel Testing
Because I'm suspicious of all software (it comes with the territory of being a tester, I guess), over the weekend I decided to check my credit report to see exactly what I had on my record. I do a lot online (purchasing, bill pay, etc...) and I'm somewhat paranoid about security and identity theft, because I know how easy it is to crack most security and/or gather someone's personal information.

If you want to check your credit report here in the States, you can contact any of the three credit bureaus: Equifax, Experian, or TransUnion. Any one of them will provide you with the information it has on you, and if you get a report from all three, you get your complete credit history and outlook. All three websites offer a 3-in-1 credit report service, where they contact the other agencies for you and compile a complete report (for a small fee, of course).

Not having a preference, I decided to purchase from the first website I checked, which happened to be Experian. I filled out the form online, clicked submit, and was rewarded with an error. "Your information cannot be processed at this time. Please contact us at the following number for more information."

Ok, I thought. I picked up the phone and called. Once on the phone, I confirmed all the information I had entered and the operator attempted to reprocess the request. No good. I was instructed that my information was "too different between the three companies to be reconciled automatically" and that I would have to contact each company individually to gather all of my information. Have a great day. -- Phooey!

Well, what a great opportunity for a parallel test! I clicked over to the next website, Equifax, and filled out their application. I entered all the same information (in a screen that looked much like the previous one) and clicked submit. Behold! I was rewarded with my credit report!

This tells me several things about the companies and the software they use. All of these observations rest on the underlying assumption that my specific problem was the result of a defect in the software and not some random malfunction in Experian's hardware or some other failure unrelated to the software. This is all purely speculation based on one test. Take it for what it's worth.
  1. While the service they use looks to be the same service, it must not be, or if it is, it must be customized in some way.

  2. Experian could have noticed my problem by running a series of parallel tests using their competitors' web services. Apparently, they did not. (A sketch of what such a test might look like follows this list.)

  3. Experian's manual process resulted in the same error as their online process, so I assume it is not, in fact, a separate process. That seems odd.

  4. Reviewing my credit data between the two companies, there is in fact little to no difference; most of the differences are with TransUnion. This could mean that Experian's process had trouble reconciling TransUnion's data while Equifax's did not.

  5. I should also consider that the problem may not be Experian's, but could in fact be Equifax's. If Equifax's web service that provides information is down, then Experian's service would fail, while Equifax's own site might not (it probably does not use its own web service).
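
For what it's worth, the parallel test I have in mind in point 2 does not need to be elaborate: submit identical applicant data to two providers and diff the outcomes. The sketch below is hypothetical; the endpoints, payload shape, and applicant data are all invented.

    # Hypothetical parallel-test sketch: send identical input to two providers
    # and compare the outcomes. Endpoints and payload shape are invented.
    import json
    import urllib.request

    ENDPOINTS = {
        "provider_a": "https://example.com/provider-a/credit-report",
        "provider_b": "https://example.com/provider-b/credit-report",
    }

    APPLICANT = {"name": "Jane Doe", "ssn": "000-00-0000", "address": "123 Main St"}

    def request_report(url, applicant):
        data = json.dumps(applicant).encode("utf-8")
        req = urllib.request.Request(
            url, data=data, headers={"Content-Type": "application/json"}
        )
        try:
            with urllib.request.urlopen(req, timeout=30) as resp:
                return {"status": resp.status, "body": json.load(resp)}
        except Exception as exc:
            # Any failure (HTTP error, timeout, bad payload) is itself a result.
            return {"status": "error", "body": str(exc)}

    def parallel_test():
        results = {name: request_report(url, APPLICANT) for name, url in ENDPOINTS.items()}
        statuses = {name: r["status"] for name, r in results.items()}
        if len(set(statuses.values())) > 1:
            print("Divergence found:", statuses)
        else:
            print("Providers agree:", statuses)
        return results

    if __name__ == "__main__":
        parallel_test()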

I could probably come up with more, but without knowing more information and not being in a position to actually work on the problem, it's most likely a wasted effort. I do find it interesting, however, that a simple parallel test exposed this problem. I would have thought a parallel test would be the most extensive testing they would execute, as they have two competitors offering the exact same product (which even uses their data).



One has to wonder how things like this get by...