Wednesday, September 19, 2012

Does anyone actually use Quick Test Pro etc.?

Here is an observation which I’ve been playing back to clients for a year or so now:


  • There are two types of company in the world, those who talk about automated testing and those who do automated testing.
  • Those who talk about automated testing buy tools like Quick Test Pro, WinRunner and Quality Centre, or Rational Test Workbench and Rational Quality Manager, i.e. expensive products from the likes of HP and IBM.
  • Those who do automated testing use tools like Selenium, Fit & Fitnesse, Cucumber (inc. Gherkin, RSpec, etc.) and other Open Source products they don’t need to buy (a sketch of what that looks like in practice is below).
For completeness I should add:
  • There are a few, very few, companies which don’t talk about automated testing, but on the whole everyone thinks it is a good idea and those who don’t do it would like to.
  • There are some companies who talk about automated testing and have no tools; this is a better position to be in than talking and spending money.
This is my observation, and if I’m being honest I did see a company 12 years ago which used WinRunner in a load testing environment. But, on the whole, the software test suites from IBM and HP tend to be shelfware. In fact, this post is motivated by a trip to a company which found my observation very funny because QTP is sitting on a shelf unused.
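To be concrete about the difference: a test written with one of these Open Source tools is just code which drives the application and checks the result, so it can run unattended on every build. Here is a minimal sketch using Selenium’s Python bindings; the page address, field names and page title are invented for illustration.

    # Minimal Selenium sketch - the URL, field names and page title are hypothetical.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Firefox()                 # drive a real browser
    try:
        driver.get("https://example.com/login")
        driver.find_element(By.NAME, "username").send_keys("alice")
        driver.find_element(By.NAME, "password").send_keys("secret")
        driver.find_element(By.ID, "login-button").click()
        assert "Dashboard" in driver.title       # the machine records pass/fail, not a tester
    finally:
        driver.quit()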

What I would really like to know is: has anyone else seen this? Or, can anyone give a counter example?



I think there is even a logic here. The IBM and HP products are expensive, so expensive you have to ask the price (actually IBM does give the price of Test Workbench: $5,500 for a single user license). Consequently they need to be sold, which also means they need to be bought at a high level in the corporation. The people who are sold these products are very disconnected from the day-to-day work of developers and testers. This means the products might not be suitable, and even if they are, the developers and testers still need to adopt them: they aren’t part of the change decision.



In addition: I just don’t think these tools are any good. To be fair, I’m no expert on testing tools, but I’ve seen Quality Centre in action. Quality Centre (QC) isn’t an automated testing tool; it’s a very expensive test tracker. Testers write their test scripts in natural language, e.g. English, into the tool; the tool runs on the side and, as they execute the tests by hand, they click Success or Fail. At the end QC gives a report.



To my mind QC is a fancy version of Microsoft Word. English language test scripts, if they are any good, are so detailed that they take almost as long to create as automated scripts, but they are expensive to execute because they are manual.



QTP and similar products are often linked to the OS or browser. They result in fragile tests which break easily when a box moves a few pixels or the browser changes.
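To illustrate what I mean by fragile, here is a hedged sketch of the two styles of test step; the coordinates, page address and element id are invented, and I’m using Selenium to show both simply because it is free to hand. Record-and-playback tools tend to capture something like the first form, which breaks the moment the layout shifts; a locator-based step survives cosmetic changes.

    # Two ways to click the same (hypothetical) "place order" button.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Firefox()
    driver.get("https://example.com/checkout")   # hypothetical page

    # Fragile - the sort of step a recorder captures: click an absolute offset.
    # Move the button a few pixels, or resize the window, and it breaks:
    #
    #   ActionChains(driver).move_by_offset(412, 310).click().perform()

    # Robust - find the button by a stable locator and click that:
    driver.find_element(By.ID, "place-order").click()

    driver.quit()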



HP also follow a razor blades model, I’m told: once you’ve bought QTP you need to buy a plugin for every browser or OS you want to use it with. Every time a new version of the browser comes out you need to buy a new plugin - or so I am told; tell me if I’m wrong.



The net result is: these tools are high maintenance and easily fall into disuse if they are ever used.



By contrast, Open Source tools are adopted by the people who do the work because they see the benefit the tools bring, and they don’t need budget or signatures to get them. People get them because they want to use them, not because a salesman comes around and convinces the boss that a tool is needed.



And because many of the Open Source tools require developers to create glue code to interface the tool to the application, they lead to a more collaborative style of working.
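To show what I mean by glue code: the business-readable scenario sits in one file, and a developer writes the step definitions which connect each line to the real application. Cucumber itself is a Ruby tool; the sketch below uses behave, a Python equivalent which works the same way, and the feature text, step wording and numbers are all invented for illustration. Testers and business people write the scenario; developers write the functions underneath, and that is where the collaboration happens.

    # features/checkout.feature - a hypothetical Gherkin scenario
    #
    #   Feature: Checkout
    #     Scenario: Order total includes delivery
    #       Given a basket containing a book at 10.00
    #       When I check out with standard delivery at 2.50
    #       Then the order total is 12.50

    # features/steps/checkout_steps.py - the glue code a developer writes
    from decimal import Decimal
    from behave import given, when, then

    @given("a basket containing a book at {price}")
    def add_book(context, price):
        context.basket = [Decimal(price)]          # in real life, call the application here

    @when("I check out with standard delivery at {delivery}")
    def check_out(context, delivery):
        context.total = sum(context.basket) + Decimal(delivery)

    @then("the order total is {expected}")
    def check_total(context, expected):
        assert context.total == Decimal(expected)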



(Perhaps some companies who just talk about automated testing have downloaded some of the Open Source tools but never use them. Since they are free nobody notices.)



So there is my theory. Anyone else seen this? Anyone else explain it? Anyone got a counter example?


Wednesday, September 12, 2012

Success in Agile

Last week I made the London to Cornwall train journey for the first time in a few months. I made the journey many times during the course of the Agile Cornwall programme. This time however I was on my way to the Agile on the Beach 2012 conference. The conference itself is a product of the Agile programme and looks set to run again in 2013 (although the final decision has yet to be taken).



With the release of a report last week which claimed the Agile Cornwall programme created 50 jobs, I felt Agile on the Beach was something of a celebration. I’ve now had a chance to read the report in full and, while the report is not available for public download, I would like to share a few comments from it.



First I should say that the report itself was conducted by a third party company commissioned by Oxford Innovation. Oxford Innovation runs the Grow Cornwall programme on behalf of public bodies and it was as part of this programme that Agile Cornwall was born.



The report surveyed the companies who participated in the Agile programme to find out about their opinions of the programme and the changes which had resulted. They spoke to multiple people in most of the companies, so it was pretty well grounded, although it probably wouldn’t stand up to rigorous academic analysis. (If any academic out there would like to do a rigorous study then please get in touch.)



Here are some highlights from the report:



  • The Agile programme has met expectations for all companies who participated in the review. (I suspect one which was disappointed did not participate.)
  • Businesses are finding they are more flexible and responsive to customers
  • Money and customers
    • None of the businesses reported a decrease in turnover, profitability or customers since adopting Agile.
    • 7 companies reported increased turnover or revenue, with 3 reporting the increase was significant.
    • 6 reported increased profitability, with 3 saying the increase was “significant”.
    • 6 reported an increase in customers and/or business opportunities, with 3 of the six saying the increase was significant.
There are some downsides however:
  • One company reported that it was now more difficult to hire because they could not find people with Agile skills.
  • Small companies said it was difficult to find the time to get started on Agile
All of which makes me very happy.

There are also some very positive comments on the quality of the consultants used in the project; as I reported in my last blog post, it wasn’t all me. But, if I may be permitted to blow my own trumpet, I was there a lot and worked with more companies, more of the time, than any other consultant. So I’ll be giving myself a pat on the back!



The report has, unfortunately, not been made public - although a summary version was given to attendees at Agile on the Beach. So if you would like to see the full report I suggest you go and knock on the door at Grow Cornwall / Oxford Innovation.