Team System 2010 and the Much Maligned Manual Test
Automate, automate, automate.
Why are you running manual tests? Automate them. If you are running manual tests, that is the smell of technical debt. Technical debt, I say! Automate them.
If you can’t automate your tests, that is the smell of tightly coupled design and a failure to separate concerns in your application. Automate.
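To make "automate" concrete, here is a minimal sketch of what replacing a manual check with an automated one can look like. The checkout-total function and its values are hypothetical, invented for illustration; the point is that a check a tester would otherwise eyeball by hand becomes a repeatable assertion (shown here with Python's built-in unittest):

```python
import unittest

# Hypothetical function under test: computes an order total with tax.
# (Illustrative only -- not from any real system discussed in this post.)
def order_total(subtotal, tax_rate=0.08):
    return round(subtotal * (1 + tax_rate), 2)

class OrderTotalTests(unittest.TestCase):
    def test_happy_path(self):
        # The kind of check a manual tester would otherwise verify by hand.
        self.assertEqual(order_total(100.00), 108.00)

    def test_zero_subtotal(self):
        # Edge case: an empty cart should total zero.
        self.assertEqual(order_total(0), 0)

if __name__ == "__main__":
    unittest.main()
```

Once a check like this is scripted, it runs on every build for free, which is exactly the economic argument for automation.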
And then there is the other reality.
Sometimes it isn’t a technical barrier to automation. Sometimes there are real reasons to perform manual testing. Here are some:
- Ad-hoc testing finds bugs. Typically, automated testing is going to test the happy path.
- The test is actually a test of the UI. Yes, sometimes we simply want to assure that the button is green. Or red. Or whatever.
- There is non-deterministic behavior in the application that we want to play with. Think games.
- There is a marriage of content + technology that requires human decision making. Think search. Does it make sense that the first hit goes to that site?
- Sometimes it would just cost a lot to automate it. Let’s face it.
- Sometimes we have more people available to run tests than to automate them. Let’s face it. (OK, I don’t like this either.)
- You get the idea.
The Geriatric Test Pool
At a previous employer, we made a web site that was huge (HUGE) and the content changed significantly once per quarter. We tested not only for technical additions and features to the site, but also for content changes.
- If a logo changed, how did that affect layout? On a million pages.
- If we introduced a new content column width, how many tables got smashed up too narrow to read? On a million pages.
So, what did we do? We brought people’s parents into the building to click through the site and report anything weird they found. This still seems to me to be a perfectly natural and reasonable use of manual testing resources.
When you can’t vote them off the island
And here’s one last reality check: Although people talk a lot about automating tests, my experience is that only a few shops are getting it done. Yes, automation is how things should be. Huge piles of manual test cases are often how things really are. This post will not even try to justify this. It is bad. Bad, I say! Automate!
Since manual tests are a simple fact of life, how can we make them easier? Well, Excel will get you a long way. Don’t get me wrong, lots of people spend their entire careers following the click sequence outlined by Joe the test lead in worksheet 17 of spreadsheet 32. Yuck, kill me.
This is a classic recipe for a developer to respond, "Works on my machine." But, hey. If it works for you?
Alternatively, Team System 2010 will finally embrace the masochistic manual tester with open arms. Camano is the project name for a new offering we’ll see in VSTS 2010.
Camano is a manual test management/authoring/runner tool implemented in WPF (note the black sweetness and skin-ability). Test cases are expressed as work items, and the Camano application lets testers run the cases and record the results.
Not only can testers run the tests, they can record the results in the form of a video (screen capture) or a stack trace from the system under test. This is a far cry from the bare Pass/Fail button we get in VSTS 2008, and it represents a whole new way of ensuring no one says, "Runs fine on my machine."
Viva la Manual Test!
We now have an absolutely killer tool for doing what we should try to avoid: manual testing. Rock on.