Monday, 14 July 2014

Shall we dispense with the formalities

A number of my friends have recently decided to go it alone and #startup their own companies. None of them are quite big enough to warrant a dedicated tester just yet, but they're still conscious that quality is important. So now and again I get emails like this:
Yo yo yo! Can I draw on your testing expertise and ask for any recommended reading materials / quick courses to get familiar with formal testing documentation? I was looking at various YouTube videos of varying quality around IEEE 829 documentation but thought there might be a more focused approach.
With the uptake of methodologies like agile and techniques like Rapid Testing, the practice of scripting massive, detailed test plans is thankfully dying out. The focus has shifted onto clarifying requirements so they're easier to meet and verify, and producing clear notes and bug reports as you go exploratory-ing off around the edges.

If your testers are spending all day writing test plans with steps to tick off as they go, that's time they could be spending ACTUALLY testing. Those documents don't really add value to your product, and they suggest you don't trust your testers unless they've got a 99-page report detailing exactly what they've done.

So here's the response I threw together. 
I'm afraid I don't really do 'formal' documentation. Test documentation is often based quite heavily on your spec and requirements docs, should you be lucky enough to have any! We certainly don't have any formal test plans or report templates at Technophobia. I've got an example of a test plan that we had to use at [redacted] and it's shit! Test documentation can really be anything from a Word doc, to a checklist or a mind map (pretty cool for train-of-thought kind of stuff). It's basically just a list of things that you have checked. (There's an ongoing debate in the test community about the difference between 'testing' and 'checking' which I'm getting rather sick of and I'm not going to touch here!)
Obviously functional requirements should be pretty well covered in plenty of detail, but don't forget you also have stuff like security, performance, platform, cross-browser, etc. That may or may not all be relevant depending on what you're delivering. The big thing that people don't realise is that testers also have a duty to make sure software DOESN'T do things that AREN'T in the spec! The tricky case of validating that something DOESN'T happen!
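To make that concrete, here's a tiny, entirely hypothetical sketch of the idea. Imagine a spec that only says "usernames are 3-20 alphanumeric characters" (the validator and all its inputs are made up for illustration). The positive checks confirm what the spec asks for; the negative ones probe the stuff the spec never mentioned:

```python
import re

def is_valid_username(name):
    """Accept only 3-20 alphanumeric characters (our imagined spec)."""
    return bool(re.fullmatch(r"[A-Za-z0-9]{3,20}", name))

# What the spec asks for:
assert is_valid_username("alice99")

# What the spec never mentioned, but a tester should still try:
assert not is_valid_username("")                  # empty input
assert not is_valid_username("ab")                # too short
assert not is_valid_username("a" * 21)            # too long
assert not is_valid_username("bob; DROP TABLE")   # injection-ish junk
assert not is_valid_username("émile")             # non-ASCII sneaking in
```

Half of those checks are about things NOT happening, and none of them would appear in a test plan written straight from the requirements doc.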
Don't get too drawn in by test certification either. There's also a massive debate in the test community at the moment over whether it really means anything. The main test certification body is ISEB, but it's mostly an exercise in seeing if you can remember what you read in a book, and in making them money! (I'm not just saying that... I'm ISEB certified, but I haven't drawn on any of it in years. Nor can I remember much of it either!)
Yes, I realise it's vague and doesn't cover everything. But I reckon it's a good starting point for such a quickly rattled-off Facebook message. And like a good tester should, I hope to have prompted further questions and discussions! If you want any more detail, you'll need to find yourself a tester and barter with beer/burritos/money in exchange for the contents of their brain...
