Posts in Documentation
Reviewing architecture documents
After a grueling round of reviewing some long architecture documents for a new project, I've noticed a set of tools and habits I've developed for a first review:

  • Different colored pens and highlighters (where different colors indicate different things like follow-up items, possible issues, quality criteria, features to test, etc...)

  • Post-It tabs are a requirement for any document over 20 pages (where different colors again indicate different things to follow up on)

  • A fresh notebook for the project for notes, ad-hoc drawings, and questions

  • Google - to research abbreviations, terms, and technologies I don't know

  • A photocopier for diagrams that I want to have handy (but don't want to remove from the document without replacing with a copy)


Once I've completed an initial review, I'll try to follow up on some of the research tasks (learning new tools, technologies, etc...) right away while everything is fresh in my head. Then I'll draft emails full of questions to various people on the project - sometimes sharing some of my sketches to make sure I understood things correctly.
Performance degradation curve - page five
A little over a year ago I met with another tester in town to provide advice on performance testing. I recently found the notes from the meeting in an old notebook. In the five pages of handwritten notes, I documented my understanding of the problem and how I would approach the project. Take these notes for what they are - sketches made over the course of an hour or two as I asked questions and clarified the project. However, I thought they would be interesting to share.

The fifth page was a description of a performance degradation curve and how to use it.

Performance degradation curve

As we discussed how to translate the results into a simple chart, I provided an overview of a performance degradation curve. We talked about what it might look like to graph capacity and/or licensing.
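
To make the idea concrete, here's a minimal sketch of what plotting such a curve might look like in Python with matplotlib. The load levels, response times, and the two-second goal line are all invented for illustration - real numbers would come from the test results.

```python
# A sketch of a performance degradation curve: average response time
# plotted against increasing load. All numbers here are invented.
import matplotlib.pyplot as plt

# Hypothetical load test results: concurrent users vs. avg response time (s)
users = [10, 25, 50, 100, 150, 200, 250, 300]
response_times = [0.4, 0.4, 0.5, 0.7, 1.1, 2.0, 4.5, 9.8]

plt.plot(users, response_times, marker="o")
plt.axhline(y=2.0, linestyle="--", color="red", label="example 2s goal")
plt.xlabel("Concurrent users")
plt.ylabel("Average response time (seconds)")
plt.title("Performance degradation curve")
plt.legend()
plt.show()
```

The features worth pointing out to stakeholders are the flat region at low load, the knee where response times start climbing, and the point where they cross the goal line.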
Formatting the report - page four
A little over a year ago I met with another tester in town to provide advice on performance testing. I recently found the notes from the meeting in an old notebook. In the five pages of handwritten notes, I documented my understanding of the problem and how I would approach the project. Take these notes for what they are - sketches made over the course of an hour or two as I asked questions and clarified the project. However, I thought they would be interesting to share.

The fourth page was an attempt to illustrate how I would approach organizing my work and how to present results to the various stakeholders.

Laying out results

I tried to paint a picture of an Excel workbook with different worksheets detailing various aspects of the project. We discussed a tab for key project information, a tab for an overview of the architecture, a tab for the usage model, and then some number of tabs for the actual test data, test results, and charts. As we talked, I tried to give an idea of where the information might come from and what it might look like.
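
As a rough sketch of that layout, here's how the workbook skeleton might be generated with Python's openpyxl; the sheet names and placeholder rows are illustrative assumptions, not what was on the page.

```python
# A rough sketch of the workbook layout described above, using openpyxl.
# Sheet names and placeholder content are illustrative assumptions.
from openpyxl import Workbook

wb = Workbook()

# Tab for key project information
info = wb.active
info.title = "Project Info"
info.append(["Project", "Example application"])
info.append(["Goal", "Placeholder performance goal"])

# Tabs for the architecture overview and the usage model
wb.create_sheet("Architecture Overview")
wb.create_sheet("Usage Model")

# Some number of tabs for the actual test data, results, and charts
for name in ("Test Data", "Test Results", "Charts"):
    wb.create_sheet(name)

wb.save("performance_report.xlsx")
```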
Developing a usage model - page three
A little over a year ago I met with another tester in town to provide advice on performance testing. I recently found the notes from the meeting in an old notebook. In the five pages of handwritten notes, I documented my understanding of the problem and how I would approach the project. Take these notes for what they are - sketches made over the course of an hour or two as I asked questions and clarified the project. However, I thought they would be interesting to share.

The third page was an attempt to document the initial usage model.

UCML of the workload

The notation is UCML, which stands for User Community Modeling Language. You'll see four types of users: clients (aka "users"), administrators, customers (of the clients), and system tasks. From there I attempted to capture tasks of interest and some rough percentages for each task across the user population.
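
If it helps to see the model in a runnable form, here's a minimal sketch in Python. The four user types follow the page, but the task names and percentages are made up, since the real ones lived in the handwritten notes.

```python
# A minimal sketch of the usage model as a weighted workload. The user
# types match the page; task names and percentages are invented.
import random

# Percentage of the overall workload per (user type, task)
usage_model = {
    ("client", "search records"): 40,
    ("client", "update account"): 10,
    ("customer", "browse catalog"): 30,
    ("administrator", "run reports"): 5,
    ("system", "nightly batch job"): 15,
}

tasks = list(usage_model)
weights = list(usage_model.values())

# Simulate the task mix for 1,000 virtual users
sample = random.choices(tasks, weights=weights, k=1000)
for task in sorted(set(sample)):
    print(task, sample.count(task))
```

A distribution like this is what ends up driving the load generator: each virtual user picks its next task according to the weights, so the test traffic approximates the real user population.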