Archive for the ‘Software Testing Nightmares’ Category
Obamacare Website Malfunction – the Cost of Not Testing
If there is anything President Obama has learned in the last month, it’s the importance of software testing. The Healthcare.gov website, an insurance shopping site for the US government’s flagship health policy “Obamacare”, has been repeatedly dogged by problems.
Glitches within the site have caused misery for tens of thousands of would-be insurance policy purchasers, with many enduring lengthy waiting times, and the site has also suffered a number of complete outages. This has proved massively frustrating for the millions of Americans who want cover from January – they have to sign up by December 15th – meaning many of them have had to resort to applying through more traditional methods: by phone, by post, or in person.
Congress is trying to unpick why the site degenerated into chaos following its launch at the start of October. It’s all turned into a bit of a blame game, with contractors blaming government officials who are pointing the finger at the contractors. Meanwhile the Republicans are rubbing their hands with glee.
The key reason came to light last week when contractors admitted that the health department did not fully test the site until late September, the week before go-live. So, unfortunately for Obama, this serves as a lesson to every customer-facing organization in the world about the importance of testing, and about the cost to the business – reputational or otherwise – when technical defects cause havoc.
It’s easy to point the finger at testing – or a lack of it – now that this has been publicly admitted, and leaving the testing phase of the project until the very last minute always meant that there would be serious problems. But failed testing was likely only part of the problem. The Government was accused of adding changes at the last minute, constantly altering the scope so that it was difficult to execute the project against such fluid objectives. Nonetheless, it is perfectly possible to test effectively in a fluid environment, and doing so is increasingly the norm.
It would also appear that collaboration and communication between the contractors and the Government officials weren’t as harmonious as they should have been, probably a result of everyone having to race to meet such a tight deadline. The lesson for organizations embarking on an initiative like this is to plan, properly scope requirements, give realistic timeframes for project completion, provide visibility to stakeholders, and finally, test, test and test again. Sending websites, systems and applications live with multiple defects is a recipe for disaster – a painful lesson for Obama and the Democratic Party in recent weeks.
Wal-Mart website glitch offered up electronic bargain for shoppers
US consumers took to social media en masse last week to broadcast an astounding deal they found on Walmart.com. A glitch on the retail giant’s e-commerce site saw a 24-inch high-definition ViewSonic computer monitor, an InFocus IN2124 digital projector and other products on sale for as little as $8.85, when their retail value is usually in excess of $500.
A spokesman for Wal-Mart – according to the Good Morning America site – said the issue had been resolved and that its IT teams were scanning the millions of items on sale on its site to check that no further technical errors were causing price discrepancies.
There was speculation that the site had been hacked, but Wal-Mart denied this and blamed a technical defect. And this isn’t the first time – Wal-Mart suffered another problem recently where a glitch in its food stamps system enabled shoppers to load up their online shopping carts with hundreds of dollars of free items.
It takes a bit of speculation to figure out why this happened and how these shoppers managed to get such a bargain, but it is clear that the validation of the data used on the website wasn’t as tight as it should have been. No doubt there will be an intensive investigation to identify the cause – and these things are not always IT problems.
Website validation is a real problem for retailers and their e-commerce sites. A software upgrade or patch, for example, can cause anomalies within a website, and not necessarily in the section that was changed. One change to the code, or even data messed up in a product manager’s spreadsheet, can have repercussions in seemingly unaffected areas of the site. In Wal-Mart’s case, it was the pricing structure.
So how realistic is it for retailers to validate every part of their site every time a change happens? IT teams often make a call on how extensive regression testing should be, but resources dictate that it is impossible to test everything. Once a system is live, the emphasis shifts to the business users who are responsible for the data – and all too often these groups are left as technology paupers, without the benefit of the automated testing solutions their technical colleagues use. But there are strategies that can help e-commerce providers like Wal-Mart. Automated testing and validation solutions aimed at maintaining ‘business as usual’ can run thorough content checks after every update, flagging glitches of all types immediately. This means that when retailers press the button on changes, patches or upgrades, they can go live with more confidence.
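To make the idea concrete, here is a minimal sketch of the kind of automated price-sanity check a retailer might run after every catalog update. The product data, SKU names and the 90% drop threshold are illustrative assumptions, not Wal-Mart’s actual rules or systems.

```python
def find_price_anomalies(previous, current, max_drop=0.90):
    """Flag items whose new price is non-positive, or has fallen by more
    than `max_drop` (expressed as a fraction of the previous price)."""
    anomalies = []
    for sku, new_price in current.items():
        old_price = previous.get(sku)
        if new_price <= 0:
            anomalies.append((sku, old_price, new_price))
        elif old_price and (old_price - new_price) / old_price > max_drop:
            anomalies.append((sku, old_price, new_price))
    return anomalies

if __name__ == "__main__":
    # Hypothetical catalog snapshots before and after an update.
    yesterday = {"VIEWSONIC-24": 549.99, "IN2124-PROJ": 599.00, "HDMI-CABLE": 9.99}
    today     = {"VIEWSONIC-24": 8.85,   "IN2124-PROJ": 599.00, "HDMI-CABLE": 8.85}
    for sku, old, new in find_price_anomalies(yesterday, today):
        print(f"SUSPECT: {sku} was {old}, now {new}")
```

A check like this only catches one class of glitch, of course, but run automatically after every data load it would have flagged a $549.99 monitor dropping to $8.85 while leaving a genuine $1 cable discount alone.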
Wal-Mart has apparently refused to honor its customers’ discount purchases, trying to appease them with $10 vouchers instead – cold comfort for those who thought they had got the bargain of the century, and a hitch that has had a negative impact on Wal-Mart’s reputation. Of course, this isn’t the first time a technical fault has damaged a company’s reputation, but to protect their name and their customer satisfaction levels, it’s vital that retailers – as far as they can – ensure e-commerce websites are as defect free as possible.
Windows 8.1 update glitch stops RT starting up
Microsoft Surface RT gadget owners were left frustrated this week as the Windows 8.1 upgrade stopped their devices working properly. Customers also complained that the updated Internet Explorer in 8.1 didn’t work with Outlook or with Google services.
Microsoft put the onus on Google, saying that changes it had made to its search function were responsible. But Microsoft has since backtracked and made changes of its own to fix the problem. It has also doled out advice on how to get IE11 working with web-accessible versions of its Outlook email program, which failed to work with the new version of the browser, and has withdrawn the update from its website whilst it investigates the issue.
This is another example in a long line of companies compromising the quality of their products by failing to conduct the appropriate level of testing. This could be for a number of reasons.
One could be that Microsoft fell over itself to respond to market demand for new features and functionality and, in doing so, did not properly test the new elements. For example, the “home” button has made a reappearance in version 8.1 after being absent from version 8. A failure to test this new component across the appropriate range of mobile devices could have caused the problems.
Microsoft’s haste to get this new version of its operating system to market might also have caused it to cut corners in the testing process. Impatience in the race to market and to get an edge on the competition can all too often mean companies compromise the quality of their product.
Another potential issue is the whole mobile environment. The fact that companies now have to test software developments and upgrades across mobile phones, laptops and tablets means the margin for error increases sharply, as mobile adds further layers of complexity to the testing of these products.
Whatever the reason behind the Windows 8.1 failings, the fact remains that there were serious failings in the testing strategy. The knock-on effect was that version 8.1 was released with technical defects, which in turn angered and inconvenienced customers, who took to social media to voice their criticism.
Was the negative fall-out worth the corner cutting? We’d say that it wasn’t. Companies compromise quality at their peril.
5 Reasons why ignoring Software Quality could be your downfall
Software failures are costing companies significant amounts of money and damaging their brands; people are losing jobs and, in some cases, their liberty because of avoidable software failures!
Read our “5 Reasons Why Ignoring Software Quality Could Be Your Downfall”.
Times are tough. You know it, I know it, your competition knows it and your boss definitely knows it. With greater competition, more consumer choice and a greater dependency on technology than ever before, every organisation faces the challenge of building quality IS systems to support rapidly evolving business requirements. New applications need to get out the door quicker than ever, and you know that the next application upgrade will be hot on its heels. Time is money, and IS departments are constantly looking for ways to cut down their application time to market. On average, 40% of total application delivery time is taken up with testing, so it is no wonder that this is an area marked out for special attention.
“Why does testing take so long?” is the question often thrown at the QA department. Well, it’s a highly manual process – even though some departments may use automation tools, it is still a time-consuming activity. There is a skill in balancing the required resources with time pressures and software quality. Original Software refers to this as the Quality Conundrum, and it is something that IT management needs to get to grips with. And fast.
You want better quality software? Right, well it will take more time to get right. You want the application live next week? OK, but we will either need to double the team of testers or we will only complete 75% of testing and the quality will be compromised. We don’t have any more budget for more resources? Well there is a decision to be made between an on-time application or a high quality application—you can’t have both.
A familiar conversation? This is the Software Quality Conundrum!
Ignoring the issues surrounding software quality won’t make them go away. Understanding corporate attitudes, the distinct advantages of getting software quality right, and the problems that can arise when things go wrong is an important first step in the right direction. To help you on this journey, read our “5 Reasons Why Ignoring Software Quality Could Be Your Downfall”.
Top 10 Software Failures of 2011
We came across an interesting blog this week. Its content gives the folks at Original Software a number of examples that make it into our “Software Testing Hall of Shame” as 2011 draws to a close.
Phil Codd, Managing Director at SQS, lists ten software failures, as voted for by consultants. We think it’s great, as it highlights the damaging effects of inadequate software testing and the ramifications for the company at fault.
So go on folks, have a read and make sure you make application quality the number one priority for your application delivery team in 2012! “Top Ten Software Failures of 2011”
Have a happy holiday season everyone and may 2012 be risk free and healthy for both you and your software.
Microsoft Joins Testing Hall of Shame
Microsoft shot themselves in the foot with this embarrassing blunder that has definitely qualified for a spot in our Software Testing Hall of Shame!
A Microsoft antivirus definition file caused Google’s Chrome browser to disappear from users’ PCs!
Scores of users reported the deletion of the Google Chrome browser from their PCs after running Microsoft’s Security Essentials. Google told Chrome users that Microsoft had incorrectly marked the browser as malware, and it was later reported that 3,000 users were affected.
Can you blame Google for being a little cheesed off at its Microsoft Internet Explorer (IE) rival? It has been predicted that Chrome will pass Mozilla’s Firefox to become the second-most-popular browser by the end of this year, pitting Google against Microsoft for the top spot.
As reported in ‘ComputerWorldUK’ by Gregg Keizer: “An incorrect detection for PWS:Win32/Zbot was identified and as a result, Google Chrome was inadvertently blocked and in some cases removed from customers PCs,” Microsoft said in a statement posted to the Facebook page of its malware research center. “We have already fixed the issue…, but approximately 3,000 customers were impacted.”
So it seems the moral of this story once again is that inadequate testing and quality checks lead to issues.
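One such quality check is a regression test of new definition files against a corpus of known-good software before release: if a new signature set flags anything on the whitelist, the release is blocked. The sketch below is a hypothetical illustration of that idea – the signature format (bare file hashes), the whitelist contents and the function names are all invented for the example, not Microsoft’s actual process.

```python
import hashlib

# Hypothetical whitelist: hashes of files known to be legitimate software.
# Real corpora would hold hashes of the actual binaries; these are stand-ins.
KNOWN_GOOD = {
    hashlib.sha256(b"chrome.exe contents").hexdigest(): "Google Chrome",
    hashlib.sha256(b"firefox.exe contents").hexdigest(): "Mozilla Firefox",
}

def false_positives(new_signatures, whitelist=KNOWN_GOOD):
    """Return the names of whitelisted programs that a new definition set
    (modelled here as a set of file hashes) would flag as malware."""
    return [name for digest, name in whitelist.items() if digest in new_signatures]

if __name__ == "__main__":
    # A bad definition update that accidentally matches a legitimate browser.
    bad_definitions = {hashlib.sha256(b"chrome.exe contents").hexdigest()}
    hits = false_positives(bad_definitions)
    if hits:
        print("Release blocked – definitions would flag:", ", ".join(hits))
```

Real antivirus signatures are far richer than file hashes, but the principle is the same: every definition update is itself a software release, and deserves its own regression suite.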
I wonder what tools Microsoft were using to assist them? I would recommend they look here: http://www.origsoft.com/manual-testing-is-here-to-stay/
Apple Joins Testing Hall of Shame
Another organisation has made it to our Software Testing Hall of Shame this week and what an embarrassment to such a high profile company.
According to an article by ITPRO, hackers are taking a big bite out of Apple’s data security.
A website known as “JailBreakMe” has been exploiting an Apple iOS flaw, allowing users to upload unapproved software onto Apple devices.
Worse still, Apple users can potentially have malware downloaded onto their devices.
Affected devices include the iPhone 3GS, iPhone 4, iPad, iPad 2 and the iPod Touch running iOS up to version 4.3.3, and iOS 5 could be affected too once it is released in the autumn.
The German Federal Office for Information Security, as reported by the Guardian, has deemed the situation serious enough to put out a warning on the “critical weaknesses.”
Tom Brewster, ITPRO, states: “Hacker group Dev-Team released the new jailbreak service this week, allowing users to hack their devices solely by following steps online through the mobile Safari browser.”
Apple could have been spared the embarrassment and expense with more efficient and thorough penetration testing of their systems.
Application quality should be at the heart of any development project. It just goes to show that even the big high-tech shops still get it wrong!
Tesco Bank Should Have Tested on IE9
The news on CIO’s website of Tesco Bank having to issue emergency guidelines for Internet Explorer 9 users made me chuckle.
Either testing on multiple browsers wasn’t carried out in full or the testing technology being used didn’t support IE9. Either way, Tesco Bank will be entering our ‘Software Testing Hall of Shame’ this week.
Perhaps the bank wasn’t aware that Original Software had announced its support for IE9 back in March: Original Software in Pole Position to Support Internet Explorer 9.
With Original Software you can discover an easier way to ensure your web applications are tested on all browsers.
The Cost of Software Failure
Our Software Testing Hall of Shame hosts a gallery of software glitches, all serving to illustrate the importance of testing software. The latest addition comes from the London Stock Exchange, where billions of pounds’ worth of share trades were lost after the exchange’s main trading system ground to a halt shortly after the market opened last month.
Dealers were left angry and twiddling their thumbs after the LSE officially called a halt to all trading on its electronic order-driven system.
The London Stock Exchange has been the focus of much unwanted attention over the past few months, following problems with its new trading platform, MillenniumIT, that caused irregularities on traders’ screens and trading downtime.
Could these problems have been avoided with more thorough QA procedures? Our guess is yes. This is a great example of how business risk is heightened when systems are launched too quickly, not allowing enough time for testing. But then again, the 15-month system replacement has had a catalogue of problems from the start.
What do you think? What can be done in the future to prevent this sort of thing happening again?
HMRC Software Glitch Makes Returns Taxing
Yesterday, January 31st, was the last chance for self-assessors to submit their tax returns online at the website of Her Majesty’s Revenue and Customs (HMRC). Even though HMRC said only on Friday that its computer system was working well and that it did not foresee the problems experienced in recent years, Her Majesty’s government has failed to test thoroughly again!
Apparently, accountants were having problems after it appeared that the company that supplies the software allowing accountancy firms and advisors to access the HMRC website had shut its links early. Returns that should have taken no more than 10 minutes to file were taking up to an hour. HMRC confirmed that some advisors using third-party software were experiencing what it called a ‘slowdown’.
CCH Personal Tax was the software concerned and although its parent company, Wolters Kluwer, said they had not been notified of any issues with the software, problems certainly existed.
Now we can’t be sure what the problems really were, but this story is yet another example of poor or incomplete testing. Such high profile software glitches only go to justify Original Software’s mantra that “application quality must become a business imperative” – especially if government reputations are to remain unscathed.
Source: Yahoo News