Archive: March 2011

Not Some Useless Expense

28 March 2011 | Author: Antti Niittyviita

Have you ever filed a complaint about a product or service you received? Have you ever taken a product in for warranty service? This is how it looks from the other side of the counter.

  1. Customer service listens to your complaint and tries to help you: 10 min
  2. Technical customer service tries to fix the error with you: 10 min
  3. Finally, the technical customer service writes an error report: 10 min
  4. Support representative checks the report, tests and sends it to the product development team: 1 hour
  5. Product development team investigates, fixes and tests: 10 worker days (70 hours)
  6. The fix is delivered to the current customers, if possible: 2 worker days (15 hours)
  7. Send feedback to the customer who made the complaint, through the entire chain: 30 min

When one customer complains, nobody rushes to make repairs. When 100 customers complain, the cause of the complaints must be quite obvious. The entire complaint chain takes around 2 hours per complaint. With 100 complaining customers, that makes 200 hours. Fixing and delivering the fix eat another 85 hours.

So, a total of 285 hours! In worker days of 7.5 hours, that makes 38. At a very modest 250 euro per day, the entire show adds up to 9,500 euro! And that sum does not even include the damage to the company's image and the bad user experiences the flawed product causes in the customer base! On top of that, delivering the fix can in reality take over 100 times longer if the product has to be recalled.
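
The arithmetic above can be sketched in a few lines. The per-complaint and fix figures come straight from the list; the 7.5-hour worker day is an assumption made to match the post's day count, and the 250 € day rate is the post's own "very modest" figure:

```python
# Figures from the complaint chain above (steps 1-4 and 7, in minutes).
per_complaint_hours = (10 + 10 + 10 + 60 + 30) / 60   # -> 2.0 hours per complaint
complaints = 100

handling_hours = complaints * per_complaint_hours      # 200 hours of complaint handling
fix_hours = 70 + 15                                    # steps 5-6: fixing and delivery, 85 hours
total_hours = handling_hours + fix_hours               # 285 hours in total

worker_day_hours = 7.5    # assumption: a 7.5-hour working day
day_rate_eur = 250        # the post's "very modest" day rate

worker_days = total_hours / worker_day_hours           # 38 worker days
cost_eur = worker_days * day_rate_eur                  # 9500 euro

print(f"{total_hours} h = {worker_days} days = {cost_eur} euro")
```

Even this back-of-the-envelope sum ignores the image loss and the recall scenario, so it is a lower bound.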

The mission of testing is to eliminate these expenses during product development, before the product ever reaches its first customer.

It is entirely pointless to argue that testing is nothing but a useless expense.

Toast Makes the World a Better Place

20 March 2011 | Author: Antti Niittyviita

When I woke up this morning, I felt good. The summery morning sun warmed me nicely through the window blinds, the birds were singing outside and the aroma of morning coffee filled the house. The traditional Sunday morning toast leapt out of the toaster with an enthusiastic thunk. It was time for breakfast.

Right from the first bites of breakfast, something felt wrong. The ‘feel’ of my traditional toast was somehow bland in my mouth. It was crispy, but its taste was missing something essential. As I bit into the second third, I finally understood: it was about salt. Or rather, the lack of it.

I boldly decided to continue my bread project to the last third, and that was when it happened. I recognized the sharp salty taste that had so far been missing. It reached me slowly, but soon exceeded all expectations. All the salt the baker had reserved for this toast had ended up in that last third. And indeed, it tasted bad! At moments I had to stop to quench a burning thirst!

Software development is like that toast. If everyone understood this, most projects could keep their schedule. So, what is this about?

Testing is the salt of software development. Like my baker, a confusingly large portion of software projects still cram the entire testing effort into the last third of the project. When that happens, the development cycle loses its edge, and its feedback mechanism, right from the start. At the other end, a project finale oversalted with testing uncovers a huge number of nasty surprises. Time is then consumed chasing down the problem spots, which in turn leads to missed deadlines.

The best result is achieved when you mix the ingredients right at the beginning!

Straitjacket or Hawaiian Shirt?

10 March 2011 | Author: Antti Niittyviita

A huge responsibility was given to Teppo the Tester (name changed). Teppo had to prepare a test report on the most recent test cycle. Having done it plenty of times before, Teppo feels beads of sweat emerging on his brow even before the end of the cycle. Finally, the time for reporting comes. With shaking hands, Teppo executes the painfully long and frustrating process:

  1. Teppo exports the data of the test cycle from the very expensively acquired test control system using an Excel macro (which he built himself with blood and tears)
  2. From the mess produced by the Excel macro, Teppo isolates the essential information – pictures and diagrams – and pastes them into the e-mail he will eventually send. Most of the pictures and diagrams require precise re-scaling after pasting so that the e-mail stays readable.
  3. Managers also want an accurate listing of test cases, so Teppo has to return to the Excel tables. Because pasting Excel rows into the mail program never works as intended, Teppo opens Notepad and cycles the rows through it so that they stay readable.
  4. Sweating profusely, Teppo adds a short description of the test cycle’s events, issues and their solutions to the beginning of the e-mail.
  5. Finally, Teppo pastes in, as a whole, the Excel-table hell exported from the very expensively acquired test control system. It contains all the same information that Teppo has just compiled into a more readable form with painstaking effort. Teppo does his best to understand that opening an Excel file and reading information from it can be very difficult for many manager-level people.
  6. Teppo adds a varying number of addresses, plus a couple of e-mail lists, to the recipient line of the e-mail.
  7. Frustrated and on the brink of tears, Teppo, a big man and once upon a time a proud professional tester, presses the Send button. Will anyone end up reading this report? The only answer comes from Test Manager Urpo, who announces that the next report has to have a bit more of this and that, and it has to be in PowerPoint slideshow format! At the end of his answer looms a question that undermines everything that is ethically right about testing: Why does this report have so much red!? As if Teppo were to blame for it…
  8. Totally spent, Teppo prepares himself mentally for the next test cycle.

Does this sound familiar? In the modern, cost-efficient business world, the end result alone does not make a good test report; equally important is the way the report is produced. Frustratingly many tools are still not born from the needs of the end user, that is, the testing expert. It is the user who has to adapt inside the straitjacket tightened by the tool.

Is it too much to ask that the tool offers a direct way of compiling a report that includes everything an individual user demands of it? Is it too much to ask that at all times, no matter who uses the tool, they can see the phases of the project's progress and the most significant things associated with its development? And would it not be a darn utopia if all of the above could happen with a single click of a mouse!
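
As a sketch of what "one click" could look like: a short script that turns raw test results into a ready-to-paste summary, replacing Teppo's whole copy-and-reformat marathon. The file name and column layout here are made up for illustration; they are not the export format of any real test control system.

```python
import csv
from collections import Counter

def build_report(path):
    """Compile a short, human-readable test cycle summary from a results CSV.

    Assumes a hypothetical CSV with 'case' and 'status' columns; a real
    tool export would need its own column mapping here.
    """
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))

    counts = Counter(row["status"] for row in rows)
    lines = [
        f"Test cycle report: {len(rows)} cases run",
        f"  passed: {counts.get('pass', 0)}",
        f"  failed: {counts.get('fail', 0)}",
    ]
    # List the failing cases by name, so nobody has to open the Excel hell.
    lines += [f"  FAIL: {row['case']}" for row in rows if row["status"] == "fail"]
    return "\n".join(lines)
```

The point is not this particular script but the principle: the compilation step is mechanical, so the tool, not the tester, should do it.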

An easy-to-use, simple and time-saving tool is the tool of the future. People who know no better have become set deep in their tracks and fail to see the advantages of newer tools. If you have worn a chastity belt and a straitjacket your entire life, then a Hawaiian shirt and cool shorts (commando-style) might feel weird.

Yet after a short familiarization period, the constraining gear gets stomped deep into the swamp. It is only a matter of time before the rigid tools created by large firms for other large firms give way to actually useful ones.

Are you content to sit in a padded room wearing a straitjacket, drinking through a straw the magical drinks offered by men in white, or would you rather sit on a sunny beach in a Hawaiian shirt, drinking piña colada and engaged in a dialogue with nice friends?

PS. We recommend familiarizing yourself with the Finnish-made team tool FlowDock. It has been made with the right kind of attitude, remembering who the end user is.

Checklist-based Exploratory Testing

2 March 2011 | Author: Antti Niittyviita

Over the years, we have complained loudly about issues related to testing. We have complained about bad tools, test case planning, stubborn business culture, doing the wrong things and inefficiency. We have had plenty to complain about, but now we decided to change the nature of testing in one fell swoop. We decided to write down the things that most testers unwittingly already do in their work.

We still encounter so-called super great test specs wherever we go in the world. A lot of work time has been spent planning the tests, and it surely feels good for the one who pays the bills. What approximately 80% of these legacy projects share are absolutely awful test cases. Usually, the test cases tell, in detail, HOW the test is to be executed. That is what we call nitpicking (to stay within the boundaries of political correctness).

Of course, I am not saying that nitpicking is at all boring. But when we are conducting business, nitpicking diverts effort and attention to entirely the wrong things.

The primary goal of a test case is to tell you WHAT has to be tested! That information fits in but a few sentences. In fact, Twitter has proven that most things can be said in just 140 characters!

There are plenty of reasons given in support of test control systems and documenting testing. But how do we combine fast, goal-centered test planning, exploratory testing and an adequate level of documentation for reporting?

We have found exploratory testing, supported by checklists, to be a very functional solution. Test specs are written as a simple checklist of the things that, at the very least, have to be tested by the end of a session. Those aforementioned 140 characters are more than enough for informative headlines that tell a rational tester what the actual goals are. We have thrown away the test case steps and the accompanying junk; tackling those usually only consumes time, which translates into money.

You can keep the checklist in a test control system and write the results into it as well. Even Excel works fairly well for this, if you just remember a minimalistic approach.

  • Planning tests and specs is super fast
  • Running tests does not cause tunnel vision, but is exploratory instead
  • You discover significantly more defects per buck
  • Even the boss likes it when you can produce the stats and reports
  • Checklists are also a suitable tool for, say, a commissioning engineer
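
A checklist like this is so simple that it hardly needs a tool at all. Here is a minimal sketch of one session's checklist and its summary; the items and field names are invented for illustration, not taken from any particular test control system:

```python
# A session checklist: short, WHAT-oriented headlines (well under 140
# characters each), plus a result field filled in during the session.
checklist = [
    {"item": "Login works with valid and invalid credentials", "result": None},
    {"item": "Cart totals update when items are added and removed", "result": None},
    {"item": "Order confirmation e-mail is sent", "result": None},
]

def session_summary(items):
    """Produce the kind of one-line status the boss wants to see."""
    done = [i for i in items if i["result"] is not None]
    failed = [i for i in items if i["result"] == "fail"]
    return f"{len(done)}/{len(items)} checked, {len(failed)} failed"

# During the exploratory session, the tester records outcomes as they go.
checklist[0]["result"] = "pass"
checklist[1]["result"] = "fail"
print(session_summary(checklist))  # -> "2/3 checked, 1 failed"
```

Note what is absent: no step-by-step instructions. The headline names the goal; how to reach it is left to the tester's judgment.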

You can also look at this from the perspective of a professional tester and notice that they already work with similar methods. The only problem is that they often do so constrained by a stubborn, stalled and change-resistant process. And that is why the result does not always please.

You get more bang for your buck when you dare to let your testing professionals do their work the way they should.