Archive for the ‘Manual Testing’ Category
Good reads that come in 7’s
James Whittaker has recently started a great series of blog posts on the Google Testing Blog – The 7 Plagues of Software Testing.
So far in James’ posts we have encountered The Plague of Aimlessness and The Plague of Repetitiveness.
In The Plague of Aimlessness he asks, “Where are the testing spell books? Surely the perilously attained knowledge of our tester forebears is something that we can access in this age of readily available information?” The answer, it seems, is no. There is a distinct lack of collective knowledge and information sharing within teams, and testers are walking in ever more aimless circles, repeatedly suffering the same aimless thrashing as those who came before them. He urges testers to “Document your successes, scrutinize your failures and make sure you pass on what you learn from this introspection to your colleagues.”
The Plague of Repetitiveness, he argues, is caused by just aimlessly ‘doing it’ some more: “Developers test but then we retest. We can’t assume anything about what they did so we retest everything. As our product grows in features and bug fixes get applied, we continue our testing. It isn’t long until new tests become old tests and all of them eventually become stale.” He likens it to Boris Beizer’s pesticide paradox: “Pesticide will kill bugs, but spray the same field enough times with the same poison and the remaining bugs will grow immune.”
I’m looking forward to reading more as he posts them.
In a similar vein (we must have been drinking from the same creative juice carton) our very own article – The Seven Deadly Sins of Software Test Automation – is due to be published in the latest edition of T.E.S.T Magazine. In this article we take a light-hearted look at Dante’s Divine Comedy to uncover some home truths about software test automation. We explore each of the seven deadly sins as traps people can fall into because of their earthly vices. I hope you get a chance to read it and would welcome any feedback on it. If you haven’t come across T.E.S.T Magazine already, it is a great read and well worth subscribing to.
RIP TestPlan, Long live Qualify
PRODUCT UPDATE – From next month Original Software will be replacing TestPlan, its original test planning tool, with a brand-new solution named Qualify. Although the product will not have its official media launch until Autumn, when additional enhancements will push it into a category of its own, Qualify v.1 will be on sale from April and far outstrips its predecessor in configurability and graphical interface.
A complete Application Quality Management (AQM) solution that unites requirements, test execution, defect management and reporting within one platform, with integrated communications channels and multi format reporting, Qualify provides all the information needed to make truly informed application quality decisions.
For more information, please visit the Qualify page on our corporate website, which includes a link to the official data sheet PDF.
Stirring Up a 5 Minute Revolution at TestExpo this week
EVENT NEWS – Original Software, the Application Quality Management (AQM) vendor, is hoping to cause a stir at this year’s TestExpo show, held at the Congress Centre in London WC1 this Thursday (the 26th March).
Exhibiting on stand number three, Original Software will be proving it is possible to build and run a fully automated regression test in less than five minutes. Delegates will also have the chance to chat to experts and win a Nintendo Wii by taking part in an interactive game on the stand.
George Wilson, the company’s Operations Director, will be speaking in the midday slot of the conference track, and his presentation ‘The 5 Minute Quality Revolution’ is aimed at making QA professionals re-think their perceptions of automation and showing them how they can heighten productivity. “In this live presentation I will not only build a regression test from scratch, but run the test, change the application under test and re-run the regression test again – all in five minutes and without touching a single piece of code! We are calling it the 5 minute quality revolution and in those 5 minutes I’ll make you re-think your views about automation. There is a perception in the industry that automation is complicated and time consuming to achieve – many users have been scarred by solutions that take years to set up with lengthy builds and high maintenance scripts – we want to prove them wrong!”
TestExpo is one of the best UK software testing events and is free to attend for all pre-registered delegates. It includes a full programme of free presentations and lectures given by experts from the top companies in testing. Join us there by registering online at http://www.testexpo.co.uk.
Spreading a little love – some Valentines Day themed goodies for you
Valentine’s Day activities for geeks
Great story on ComputerWorld today – http://www.computerworlduk.com/management/careers-hr/my-career/news-analysis/index.cfm?articleId=2066&email – includes ideas such as technology-related films to watch with your loved one on Valentine’s Day, Twitter love-hunt events and more…
Seven reasons to love a developer on Valentine’s Day
Adrian Bridgewater’s blog on ZDNet had me chuckling this morning – I often see that deep concentration akin to the ear-crumb incident he describes…
The St. Valentine’s Day Data Massacre
by George Wilson, Operations Director of Original Software
On the 80th anniversary of the legendary US gangster slaying in Lincoln Park, Chicago, it is perhaps appropriate to use the title of the massacre as an analogy for one of the most dangerous but widespread practices currently taking place in QA and testing departments around the globe.
Testing on live data.
Industry estimates suggest that approximately 70% of IT departments admit to using live data during their application testing process. One can understand why. It makes the testing more pragmatic, gives a clearer indication of true quality and allows the application in question to be tested more thoroughly. The problem is that this can expose sensitive data to less-than-sensitive employees or contractors.
The acquisition of data for testing may breach the Information Security safeguards of your live system which could result in fraud, malicious damage or even legal action if confidentiality is lost.
The simple fact is this: Carrying out testing on live data without the consent of the individual whose data is being processed is illegal.
“But it can’t be. And anyway, we have security procedures in place.” Regrettably that is the attitude of many hard-pressed CIOs today. Organizations demand on-time delivery of tested software, and live data tends to reveal more ‘real world’ errors – or so CIOs have always believed.
But that doesn’t make it legal.
Starting with the stringent Data Protection regulations in the European Community, and spreading worldwide, the law says, very clearly, that the individual whose data record is being used must know the purpose of the processing.
Common sense and good practice say that exposing traceable production data to test disciplines is risky. Despite legal requirements, and in some cases severe penalties for breaking them, testing on live data still tends to be common practice. One survey showed 62 percent of companies were using live customer data to test applications, and 49 percent shared this data with outsourced testers, with no way of knowing if it was ever compromised.*
So, what can be done? Is there a way for QA staff to carry on testing with live data without leaving themselves and their organization at risk? In short, can this ‘data massacre’ be eliminated?
Of course it can. With a clear Test Data Management (TDM) strategy in place you can ensure that you will dodge those data protection bullets, as well as being happy in the knowledge that your entire testing process is underpinned by good quality, legal test data.
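To make the masking idea concrete, here is a minimal, purely illustrative sketch (not Original Software's implementation – the field names and salt are assumptions for the example). Hashing a sensitive value deterministically means the same customer always maps to the same pseudonym, so joins between masked tables still line up:

```python
import hashlib

def mask_value(value: str, salt: str = "qa-env") -> str:
    """Replace a sensitive value with a stable pseudonym.

    The same input always yields the same token, so relationships
    between masked rows are preserved across tables."""
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()[:12]

def mask_record(record: dict, sensitive: set) -> dict:
    """Return a copy of the record with sensitive fields pseudonymised."""
    return {k: mask_value(v) if k in sensitive else v
            for k, v in record.items()}

customer = {"id": "C1001", "name": "Alice Smith",
            "email": "alice@example.com", "plan": "gold"}
masked = mask_record(customer, {"name", "email"})
```

Testers can then exercise the application against realistic-looking data without any real names or addresses ever leaving the production environment.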
Test Data Management is fundamental to the success of your test strategy; after all, data drives the entire testing procedure. With bad data comes poor testing, results you cannot trust, and a whole lot of wasted time, money and effort, and maybe even the odd legal dispute. It pays to get data management right.
Effective test data creation will address issues of disk space, data verification, data confidentiality and protracted test durations. Control of test data ensures that every test starts from a consistent, known data state – essential if the data is to end the test in a predictable state. Checking both the visible test results and the database effects is a key principle of Total Testing, a task which is practically impossible to do manually.
This philosophy enables a true regression testing capability to be built and automated, without the need for complex algorithms and checks to make allowances for changing data.
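A minimal sketch of that database-effects check – entirely illustrative, not a description of any particular product – is to snapshot a table before and after the test and diff the two states:

```python
def snapshot(rows):
    # Index rows by primary key so states can be compared field by field.
    return {row["id"]: dict(row) for row in rows}

def diff_states(before, after):
    """Summarise the database effects of a test run:
    which rows were inserted, deleted or updated."""
    return {
        "inserted": sorted(k for k in after if k not in before),
        "deleted": sorted(k for k in before if k not in after),
        "updated": sorted(k for k in after
                          if k in before and after[k] != before[k]),
    }

before = snapshot([{"id": 1, "balance": 100}, {"id": 2, "balance": 50}])
after = snapshot([{"id": 1, "balance": 80}, {"id": 3, "balance": 20}])
effects = diff_states(before, after)
```

Asserting on this summary, rather than on raw values that shift from run to run, is what lets a regression test tolerate changing data.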
Intelligent data extraction functionality dramatically simplifies the process of creating and extracting data subsets from your live database – with great efficiency, full referential integrity and no program complications. With total control to amend data during extraction, and the ability to extract data from remote sources, data maintenance becomes simple and efficient. Data integrity, for any purpose, is assured.
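The referential-integrity part of subsetting can be illustrated with a toy example (the table and column names here are invented for the sketch): when you slice one table, you must also pull exactly the parent rows it references, so no foreign key in the subset dangles.

```python
def extract_subset(orders, customers, wanted_order_ids):
    """Take a slice of the orders table plus exactly the customer
    rows it references, so every foreign key in the subset resolves."""
    sub_orders = [o for o in orders if o["id"] in wanted_order_ids]
    needed = {o["customer_id"] for o in sub_orders}
    sub_customers = [c for c in customers if c["id"] in needed]
    return sub_orders, sub_customers

customers = [{"id": "C1", "name": "x"}, {"id": "C2", "name": "y"}]
orders = [{"id": 1, "customer_id": "C1"},
          {"id": 2, "customer_id": "C2"},
          {"id": 3, "customer_id": "C1"}]
sub_orders, sub_customers = extract_subset(orders, customers, {1, 3})
```

Real schemas have chains of foreign keys, so a production tool walks the whole dependency graph rather than a single parent table – but the principle is the same.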
For more information on Test Data Management, and how Original Software can help you avoid any test data massacres, log on to www.origsoft.com
* The Insecurity of Test Data: The Unseen Crisis, – a Compuware / Ponemon Institute study
The great automation debate
This Worksoft blog caught my attention – ‘Do the Math: Automation is No Longer Optional’ – http://worksoftinc.blogspot.com/2009/01/do-math-automation-is-no-longer.html
The author is trying to put her finger on why it is that people still test manually rather than automate.
Mcfinder’s comments on the topic have started up an interesting debate on this – “The real problem for me however is maintenance of these scripts. We live in a fast paced ever changing world, every time the application under test changes the scripts that were written to originally test the application have to be re-written to accommodate the changes. This is a time consuming task and can often take longer than manually testing the application in the first place. This high maintenance burden is one of the big reasons why so much automation software is just gathering dust on shelves. As soon as the application changes the tools are useless, unless significant time and resource is thrown at them.”
I think he has a point here – it is the principal reason why our products are built with code-free and self-healing script technology. I wonder how many others agree. Is this the biggest hurdle to automation, or are other factors – an undefined test process, unrealistic expectations – bigger issues?
The Early Bird Catches The Worm
In a recent survey we commissioned, it emerged that over 40% of CIOs admitted little or no corporate regard for the quality of their software. The reasons for this have been widely debated since we announced the results and there have been some really interesting points raised – one of which being that the reason there is CIO apathy towards quality is because many are working to short-term goals.
“They will get hammered if they go over budget or miss a delivery date, but poor quality can usually be patched and covered up long enough for them to be promoted and reassigned away from the problem.” said one commentator.
There have been numerous reports in the IT press indicating we spend 70% of our total development budget maintaining existing applications. From what I see, increasing the quality of the software would decrease the cost of maintaining it. Surely, that would be of interest to a CIO?
Short-term cost-saving goals by a CIO only lead to long-term revenue leakage through rework, production failures, performance issues and the like. In short, a visionary CIO would never compromise on quality!
There was a nice article on ZDNet about this issue a while back, with a great diagram showing the escalated costs of finding a bug throughout the software development lifecycle.
We all know that the earlier bugs are picked up in the application development lifecycle, the cheaper they are to fix. It is an undisputed fact. A recent article in Computer Weekly Magazine stated that the cost of poor quality “in the US alone is estimated to be upwards of $75bn a year in re-work costs and abandoned systems”. It noted that “Re-work can account for up to forty percent of the project cost, so the earlier defect problems are identified, recorded and rectified, the lower the cost”.
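The escalation argument is easy to make concrete with back-of-the-envelope arithmetic. The multipliers below are the commonly quoted relative costs of fixing a defect by the phase in which it is found; the exact ratios vary from study to study and are an assumption for this sketch, not figures from the articles above:

```python
# Commonly quoted relative cost of fixing a defect, by the phase in
# which it is discovered. Illustrative ratios only.
RELATIVE_COST = {"requirements": 1, "design": 5,
                 "coding": 10, "testing": 20, "production": 100}

def total_fix_cost(defects_by_phase, unit_cost=100):
    """Total cost of fixing defects, given a count per discovery phase
    and the cost of a requirements-phase fix (unit_cost)."""
    return sum(count * RELATIVE_COST[phase] * unit_cost
               for phase, count in defects_by_phase.items())

found_late = total_fix_cost({"production": 10})     # 10 defects, caught live
found_early = total_fix_cost({"requirements": 10})  # same 10, caught early
```

Even with generous error bars on the multipliers, the gap between the two totals is the kind of number that ought to interest a CIO.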
So there you have it, ensuring you build quality into your application developments should be a no-brainer. Unfortunately your CIO might not agree with you.
Did you break your toys?
I came across this great forum post today in the Software Testing Club – http://www.softwaretestingclub.com/forum/topics/did-you-break-your-toys – about how when the writer was a young boy he would take apart his toys to see how they worked. Curiosity is often cited as one of the main traits of a tester and apparently we all do this.
It reminded me of the time as a toddler that I got hold of the mantelpiece clock and took it apart piece by piece. I probably wouldn’t have managed to put it back together again anyway – even if my brother hadn’t eaten half the small clockwork parts. I was never forgiven for that!
So how many people had destroyed their Christmas presents before Boxing Day?