Blog

Archive for the ‘Software Testing Nightmares’ Category

Insurance & Technology: How Insurers Can Avoid “Black Swans” in Product Launches

Insurance & Technology, a US publication that “provides insurance business and technology executives with the targeted information and analysis they need to be more profitable, productive and competitive,” recently published a contributed article by Original Software CEO Colin Armitage.
“We’ve all seen it happen: An IT project plagued with delays, changes and complications goes so far off the rails that it becomes a liability,” wrote Colin, describing projects such as new product launches that exceed budget by more than 200%, sometimes by as much as 400%. Industry pundits call these projects “Black Swans.”

“Disasters are avoidable, though,” Colin reassured readers. “Implementing a quality management solution will help insurers introduce new products and get them to market in a timely fashion while still allowing for the rigorous testing that prevents glitches and compliance issues.”

Testing prior to a product launch is essential, he advised, urging insurers to pay heed to five testing tips:

1. Validate all IT and business data and technology.
2. Cover every aspect of a product launch.
3. Test everything that could go wrong.
4. Test the live environment.
5. Use technology to ensure your software and website quality processes are fast and efficient.

Read more of Colin’s article in Insurance & Technology here.


Unwittingly falling foul of the regulators – website mis-selling

By George Wilson

The mis-selling of financial services products, PPI (payment protection insurance) in particular, has blighted our insurance and banking landscape for years. Consumer compensation for PPI has now reached a breathtaking £20bn, marking it out as the UK’s worst ever consumer scandal. The industry is now relatively switched on when it comes to ensuring all employees sell to customers in a fair and honest way, rather than keeping one eye on their end-of-month bonus targets.


However, there have been a number of instances where companies have been fined as a result of incorrect or misleading information on their websites. Easyjet and Ryanair were recently fined by the Italian regulatory authorities for incorrect insurance policy information on their websites. Website mis-selling is a tricky area for financial services companies. Clearly the mis-selling itself is typically not intentional, but any company can be found guilty of displaying wrong or misleading information on its website that leads a consumer into making a wrong or misplaced purchasing decision.

And this is not compliant with FCA regulation. The FCA itself has referred to mis-selling in the context of “consumer detriment arising from the wrong products ending up in the wrong hands, and the detriment to society of people not being able to get access to the right products.” If the consumer is exposed to the wrong information, then this fits with the FCA’s stance on the issue.

The problem for any company, financial services or otherwise, is that website glitches that display wrong data usually stem from human error, technical problems or both, and are often connected to data validation, testing and quality assurance. Changes made to a website – a software upgrade or a patch, for example – can cause anomalies and alter the data that is displayed. One change of code, or even data mangled in a product manager’s spreadsheet, can have repercussions throughout the site. So data owners might be guilty of mis-selling even when the action was entirely accidental.

But there are strategies that can help. Automated testing and validation solutions aimed at maintaining ‘business as usual’ can run thorough content checks after every update, flagging glitches of all types immediately, so that when financial services providers engage with consumers through their websites, they can do so with confidence.
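
To make that concrete, here is a minimal sketch in Python of the kind of check such a solution automates – comparing the figures a page actually displays against an approved source of truth, and flagging any mismatch before a consumer sees it. This is our illustration only, not any vendor’s product: the URL, the data-field mark-up convention and the approved values are all hypothetical.

    import re
    import urllib.request

    # Approved figures, e.g. exported from the compliance team's master record.
    APPROVED = {
        "representative_apr": "18.9%",
        "monthly_premium": "£12.50",
    }

    def extract_displayed_values(html):
        """Pull advertised figures out of the rendered page. Assumes each
        figure is marked up as <span data-field="...">value</span>."""
        return dict(re.findall(r'data-field="([^"]+)">([^<]+)<', html))

    def validate(url):
        html = urllib.request.urlopen(url).read().decode("utf-8")
        displayed = extract_displayed_values(html)
        problems = []
        for field, approved in APPROVED.items():
            shown = displayed.get(field)
            if shown != approved:
                problems.append(f"{field}: page shows {shown!r}, "
                                f"approved value is {approved!r}")
        return problems

    # Hypothetical product page; run after every content change or patch.
    for issue in validate("https://example.com/products/loan-cover"):
        print("MIS-SELLING RISK:", issue)

Run automatically after every release, even a check this crude would catch wrong policy information before a regulator does.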

If glitches do happen and consumers are mis-sold financial products as a result, then the insurance companies and banks are the ones who are liable, even if the mis-selling was entirely accidental. And the FCA does not shy away from imposing heavy fines. So to keep on the right regulatory track as far as mis-selling is concerned, FS companies would do well to get their website testing and quality houses in order.


As Retailers Boost IT Spending, Be Mindful of the Minefields

By George Wilson

Research out this week has earmarked the retail sector for soaring IT investment in 2014, with websites, mobile and IT system replacement at the top of retailers’ wish lists. Law firm TLT found that two thirds of the UK’s top 60 retailers expect their firms to grow this year, and 80 per cent are convinced that IT will be instrumental in driving sales.

Mobile is the technology that retailers are most excited about, with two thirds planning to invest this year. And more than half of retailers plan to invest in their e-commerce platforms in 2014, to keep pace with the rising tide of online shopping.

The fact that retailers are embracing technology and making it pivotal to their business growth strategies is to be applauded. But there is a definite note of caution that retailers must heed as they plough more investment into IT.

For a start, both mobile and e-commerce are rapidly evolving, so retailers need to ensure their approach to delivering these apps enables them to make changes to content and functionality frequently and rapidly. This will allow them to experiment with offerings and customer experiences, make changes according to what works and what doesn’t, as well as respond quickly to competitor innovation.

But the caution is also about disaster prevention. In the last year, there have been some big IT disasters for retailers. Back in October, US retail behemoth Walmart fell victim to data discrepancies on its website, angering thousands of customers, causing a PR disaster and denting its share price. A data glitch on the e-commerce site put expensive electronic items, usually valued at over $500, on sale for $39.99. Bargain hunters swarmed to the Walmart site to snap up cheap products, but on realizing its mistake, Walmart cancelled the purchases. And it wasn’t the first time for Walmart: weeks before, a problem with its food stamp systems had enabled shoppers to load up their online shopping carts with hundreds of dollars of free items.

UK-based Screwfix.com also encountered problems earlier this year, when every item on its website – including expensive power tools and ride-on mowers – was mistakenly priced at £34.99. Screwfix.com responded by cancelling orders, angering customers.

The lesson from this type of IT malfunction is that a proper and thorough approach to quality assurance and testing – the part of the IT process that ensures systems, applications and websites are fit for purpose – is vital. Investing heavily in software and whizzy new applications might seem the obvious priority, and retailers might be tempted to cut corners elsewhere. But QA and testing – particularly validation testing, where anomalies caused by changes to other systems and applications are picked up – is not the place to slash the budget. A sketch of what that can look like in practice follows below.
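
As an illustration of that validation testing – our own sketch, with invented product data, not a description of any retailer’s actual process – the core idea is simply to snapshot what the site displays before a change and diff it afterwards, so anomalies in supposedly unaffected areas stand out:

    import json

    def snapshot_prices(displayed_prices, path):
        """Record the prices the site displays, ahead of a release."""
        with open(path, "w") as f:
            json.dump(displayed_prices, f)

    def diff_prices(path, displayed_now):
        """Compare today's displayed prices against the saved baseline."""
        with open(path) as f:
            baseline = json.load(f)
        return [f"{sku}: {old} -> {displayed_now.get(sku)}"
                for sku, old in baseline.items()
                if displayed_now.get(sku) != old]

    # Before the release:
    snapshot_prices({"MOWER-01": 1599.00, "DRILL-77": 89.99}, "baseline.json")

    # After the release -- every price has collapsed to one value:
    for change in diff_prices("baseline.json",
                              {"MOWER-01": 34.99, "DRILL-77": 34.99}):
        print("UNINTENDED PRICE CHANGE:", change)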

Tech disasters can cost retailers dearly. Customer loyalty, brand value, reputation and share price can all take a hit when a technology catastrophe strikes. The time spent thinking these new technologies through, and making sure all bases are covered before the button is pressed and they go live, is an investment you cannot put a price on. Retailers, tread carefully.


Toyota – software glitch leads to global product recall

By George Wilson

Toyota, the Japanese car giant, suffered a massive blow this week when it was forced to recall almost two million of its top-selling Prius hybrid vehicles. A glitch in the cars’ software could set off warning lights and put the car into failsafe mode, causing the vehicle to stop suddenly. The biggest hit to Toyota will be in Japan and the USA.

Failsafe mode means STOP NOW!

Of course, the real risk in this scenario is to drivers’ safety. But the corporate challenge for Toyota is not insignificant. The reputational damage to the brand is considerable – many environmentally minded car buyers might now think twice before buying a Prius. And this isn’t the first recall for the Prius – just weeks ago, US Prius models were recalled for faulty seat heaters. In 2009, millions of Toyota models worldwide were recalled due to acceleration issues, which hammered Toyota’s share price.

So how was something as fundamental as a software issue to blame this time? And how were millions of Prius models released with this software glitch?

Of course, it’s only conjecture at this stage, but it may be that the requirements for the software were not properly defined, or that the integration between the different modules was not adequately specified or tested.

People in the know might blame the testing – how was software with such a fundamental flaw released? But testing is always based around the requirements. If those were wrong, or something was missed in the design, then the testing will have been examining the wrong parameters.

Following Toyota’s acceleration issue five years ago, a number of court cases found that its electronic throttle system was flawed. The company had performed a “stack analysis” but, in the words of the ruling, had completely botched it, meaning software defects were the cause of a number of accidents.
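
For readers unfamiliar with the term: a stack analysis asks whether the deepest possible chain of function calls can ever need more memory than the stack space set aside for it. As a rough illustration of the principle only – real automotive analyses are done with dedicated tooling against the compiled binary, and the call graph and frame sizes below are invented – the calculation looks something like this:

    # Invented per-function stack frame sizes, in bytes.
    FRAME_BYTES = {"main": 64, "control_loop": 96, "read_sensors": 128,
                   "log_fault": 512, "enter_failsafe": 48}

    # Invented call graph: who can call whom.
    CALLS = {"main": ["control_loop"],
             "control_loop": ["read_sensors", "log_fault"],
             "log_fault": ["enter_failsafe"]}

    def worst_case_stack(func):
        """Deepest stack usage reachable from func (assumes no recursion)."""
        deepest_callee = max((worst_case_stack(c) for c in CALLS.get(func, [])),
                             default=0)
        return FRAME_BYTES[func] + deepest_callee

    RESERVED = 700  # bytes of stack reserved for this task -- invented figure
    needed = worst_case_stack("main")
    print(f"Worst case: {needed} bytes used, {RESERVED} bytes reserved")
    if needed > RESERVED:
        print("OVERFLOW POSSIBLE: memory beyond the stack may be corrupted")

Get that sum wrong, or miss a path through the call graph, and under rare conditions the stack silently tramples adjacent memory – exactly the class of defect the court cases described.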

Obviously, in that case, software defects actually cost lives. In the automotive world, the risk is ultimately to people’s safety; in reputational terms, it also costs the manufacturers dearly. So the message, again, is clear and simple – technology processes have to be clearly defined, properly executed and tested, tested and tested again.


Data Glitches – how Screwfix.com got it wrong

By George Wilson

The front page of the Telegraph this week carried a story on the DIY online retailer Screwfix.com. Shoppers couldn’t believe their luck when the retailer – which sells everything from sheds to pricey power tools – cut all its prices to £34.99. Word of mouth sent people piling on to the site, eager to snap up a bargain. One customer bought a ride-on mower usually priced at £1,600.

Some customers who had arranged to pick up their purchases first thing on Friday were lucky, but others found their purchases had been cancelled and were reimbursed, as Screwfix and its parent company, Kingfisher PLC, which also owns B&Q, realized the mistake.

It takes a bit of guesswork to figure out why this happened, but far and away the most likely cause is a data validation error. No doubt there will be an intensive investigation to identify the cause, but these things are not always IT problems.

Website validation can be a real problem for retailers and their e-commerce sites. Changes to a website can cause all manner of problems and can skew the data that is visible on the site. For example, a software upgrade or a patch to a system can cause anomalies within a website, and not necessarily in the section that has been changed. One change of code, or even data mangled in a product manager’s spreadsheet, can have repercussions in seemingly unaffected areas of the site. Walmart had a similar issue back in October.

So how realistic is it for retailers to validate every part of their site every time a change happens? IT teams often make a call on how extensive regression testing should be – but resources dictate that it’s impossible for everything to be tested. Once a system is live, the emphasis shifts to the business users who are responsible for the data – but they usually won’t have access to the automated testing solutions their technical colleagues use.

There are strategies that can help e-commerce providers like Screwfix.com. Automated testing and validation solutions aimed at maintaining ‘business as usual’ can run thorough content checks after every update, flagging any detected glitches immediately – which means that when retailers press the button on changes, patches or upgrades, they can go live with more confidence. And validation isn’t just something to carry out before the site goes live – it should be an integral, ongoing part of running any e-commerce website.
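
One simple check in that spirit – our own illustrative sketch, with invented figures – exploits the tell-tale signature of this incident: lots of unrelated products suddenly sharing one identical price, some of them far below any plausible floor.

    from collections import Counter

    # (sku, displayed price, minimum plausible price) -- illustrative data.
    DISPLAYED = [("RIDE-ON-MOWER", 34.99, 900.00),
                 ("POWER-DRILL", 34.99, 45.00),
                 ("SHED-8X6", 34.99, 250.00)]

    def below_plausible_floor(items):
        """Prices that have dropped below each product's sanity floor."""
        return [(sku, price) for sku, price, floor in items if price < floor]

    def suspicious_uniform_price(items, threshold=0.5):
        """Many distinct products at one identical price is a classic
        data-load defect; flag it if it affects too much of the range."""
        price, count = Counter(p for _, p, _ in items).most_common(1)[0]
        return price if count / len(items) >= threshold else None

    print("Below plausible floor:", below_plausible_floor(DISPLAYED))
    print("Suspicious uniform price:", suspicious_uniform_price(DISPLAYED))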

When problems like this occur, the fallout isn’t just high-profile bad publicity and disgruntled customers. Investors often get spooked by IT failures and bad business practice, and that can have a negative impact on a company’s share price. Good governance and sound quality assurance measures bode well for online retailers.


Obamacare Website Malfunction – the Cost of Not Testing

Obamacare was not tested

If there is anything that President Obama has learned in the last month, it’s the importance of software testing. The Healthcare.gov website, an insurance shopping site for the US government’s flagship health policy “Obamacare”, has been repeatedly dogged by problems.

Glitches within the site have caused misery for tens of thousands of would-be insurance policy purchasers, with many enduring lengthy waiting times. The site has also suffered a number of complete outages. This has proved massively frustrating for the millions of Americans who want cover from January – they have to sign up by December 15th – meaning many have had to resort to applying through more traditional channels: by phone, by post or in person.

Congress is trying to unpick why the site degenerated into chaos following its launch at the start of October. It’s all turned into a bit of a blame game, with contractors blaming government officials who are pointing the finger at the contractors. Meanwhile the Republicans are rubbing their hands with glee.

The key reason came to light last week, when contractors admitted that the health department did not fully test the site until late September, the week before go-live. So, unfortunately for Obama, this serves as a lesson to every customer-facing organization in the world on the importance of testing, and on the cost to the business, reputational or otherwise, when technical defects cause havoc.

It’s easy to point the finger at testing – or the lack of it – now that it has been publicly admitted. And leaving the testing phase of the project until the very last minute always meant there would be serious problems. But failed testing was likely only part of the problem. The Government was accused of adding changes at the last minute, constantly shifting the scope of the project, making it difficult to execute against such fluid objectives. Nonetheless, it’s perfectly possible to test effectively in a fluid environment, and doing so is increasingly becoming the norm.
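
In practice, testing effectively in a fluid environment usually means a suite of fast, automated smoke tests that runs on every single change, so that however much the scope moves, the basics are never silently broken. A minimal sketch with Python’s standard unittest module – the endpoints and expectations are hypothetical stand-ins, not Healthcare.gov’s real flows:

    import unittest
    import urllib.request

    BASE_URL = "https://example.gov"  # hypothetical stand-in

    class SmokeTests(unittest.TestCase):
        """Fast checks of the basics, run on every change to the site."""

        def fetch(self, path):
            return urllib.request.urlopen(BASE_URL + path, timeout=10)

        def test_homepage_is_up(self):
            self.assertEqual(self.fetch("/").status, 200)

        def test_enrolment_form_loads(self):
            body = self.fetch("/enroll").read().decode("utf-8")
            self.assertIn("<form", body)

    if __name__ == "__main__":
        unittest.main()

A suite like this would not have caught every defect, but run nightly from day one, it would have surfaced the outages long before launch week.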

It would also appear that collaboration and communication between the contractors and the Government officials weren’t as harmonious as they should have been, probably a result of everyone racing to meet such a tight deadline. The lesson for organizations embarking on an initiative like this is to plan, properly scope requirements, set realistic timeframes for project completion, give stakeholders visibility, and finally, test, test and test again. Sending websites, systems and applications live with multiple defects is a recipe for disaster, as Obama and the Democratic Party have learned painfully in recent weeks.


Wal-Mart website glitch offered up electronics bargains for shoppers

US consumers took to social media en masse last week to broadcast an astounding deal they had found on Walmart.com. A glitch on the retail giant’s e-commerce site saw a 24-inch high-definition Viewsonic computer monitor, an InFocus IN2124 digital projector and other products on sale for as little as $8.85, when their retail value is usually in excess of $500.

A spokesman for Wal-Mart – according to the Good Morning America site – said the issue had been resolved and that its IT teams were scanning the millions of items on sale on its site to check there weren’t further technical errors causing price discrepancies.

There was speculation that the site had been hacked, but Wal-Mart denied this and blamed a technical defect. And this isn’t the first time – Wal-Mart recently suffered another problem, in which a glitch in its food stamp system enabled shoppers to load up their online shopping carts with hundreds of dollars of free items.

It takes a little speculation to figure out why this happened and how these shoppers managed to get such a bargain. But it is clear that the validation of the data used on the website wasn’t as tight as it should have been. No doubt there will be an intensive investigation to identify the cause, but these things are not always IT problems.

Website validation is a real problem for retailers and their e-commerce sites. For example, a software upgrade or a patch to a system can cause anomalies within a website, and not necessarily in the section that has been changed. One change of code, or even data mangled in a product manager’s spreadsheet, can have repercussions in seemingly unaffected areas of the site. In Wal-Mart’s case, it was the pricing data.

So how realistic is it for retailers to validate every part of their site every time a change happens? IT teams often make a call on how extensive regression testing should be – but resources dictate that it’s impossible for everything to be tested. Once a system is live, the emphasis shifts to the business users who are responsible for the data – and all too often these groups are left as technology paupers, without the benefit of the automated testing solutions their technical colleagues use. But there are strategies that can help e-commerce providers like Wal-Mart. Automated testing and validation solutions aimed at maintaining ‘business as usual’ can run thorough content checks after every update, flagging glitches of all types immediately – which means that when retailers press the button on changes, patches or upgrades, they can go live with more confidence.
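
The scan Wal-Mart described – checking millions of listed items for price discrepancies – amounts to reconciling what the site serves against the master catalogue. A minimal sketch of the idea, with invented prices and an arbitrary one per cent tolerance:

    # Master catalogue prices vs prices the site actually displays --
    # both invented for illustration.
    MASTER = {"VIEWSONIC-24": 529.00, "IN2124-PROJECTOR": 575.00}
    DISPLAYED = {"VIEWSONIC-24": 8.85, "IN2124-PROJECTOR": 575.00}

    def reconcile(master, displayed, tolerance=0.01):
        """Yield items whose displayed price strays from the master price."""
        for sku, expected in master.items():
            shown = displayed.get(sku)
            if shown is None or abs(shown - expected) / expected > tolerance:
                yield sku, expected, shown

    for sku, expected, shown in reconcile(MASTER, DISPLAYED):
        print(f"{sku}: master price ${expected:.2f}, site shows ${shown}")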

Wal-Mart has apparently refused to honor its customers’ discount purchases, trying to appease them with $10 vouchers instead. That is cold comfort for those who thought they had got the bargain of the century, and the hitch has dented Wal-Mart’s reputation. Of course, this isn’t the first time a technical fault has damaged a company’s reputation, but to protect their name and their customer satisfaction levels, it’s vital that retailers ensure – as far as they can – that their e-commerce websites are as defect-free as possible.


Windows 8.1 update glitch stops RT starting up

Microsoft Surface RT owners were left frustrated this week as the Windows 8.1 upgrade stopped their devices working properly. Customers also complained that after the 8.1 update, Internet Explorer did not work with Outlook and with Google services.

Microsoft initially put the onus on Google, saying that changes Google had made to its search function were responsible. But Microsoft has since backtracked and made changes of its own to fix the problem. It has also doled out advice on how to get IE11 working with the web-accessible versions of its Outlook email program, which failed to work with the new version of the browser. Microsoft has withdrawn the update from its website whilst it investigates the issue.

This is another example in a long line of companies compromising the quality of their products by failing to conduct the appropriate level of testing. There could be a number of reasons why.

One could be that Microsoft fell over itself to respond to market demand for new features and functionality and, in doing so, did not properly test the new elements. For example, the “home” button has reappeared in version 8.1, having been absent from version 8. A failure to test this new component across the appropriate range of devices could have caused the problems.

Microsoft’s haste to get this new version of its operating system to market might also have led it to cut corners in the testing process. Impatience in the race to market, and to get an edge on the competition, can all too often mean companies compromise the quality of their products.

Another potential issue is the mobile environment itself. The fact that companies now have to test software releases and upgrades across mobile phones, laptops and tablets means the margin for error multiplies, as mobile adds further layers of complexity to the testing of these products.
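
That complexity is commonly tamed with a parameterised test matrix: one test body executed against many device profiles, so a component that vanishes on one screen size fails loudly. A minimal sketch using Python’s unittest – the profiles and the rendering check are invented stand-ins for a real UI-automation harness:

    import unittest

    # Invented device profiles: (name, viewport width, viewport height).
    DEVICE_PROFILES = [("desktop", 1920, 1080),
                       ("surface-rt", 1366, 768),
                       ("small-phone", 320, 480)]

    def home_button_renders(width, height):
        """Stand-in for driving the real UI at a given viewport size.
        Here the button (deliberately) fails to render below 360px wide."""
        return width >= 360

    class HomeButtonMatrix(unittest.TestCase):
        def test_home_button_on_every_profile(self):
            for name, w, h in DEVICE_PROFILES:
                with self.subTest(device=name):
                    self.assertTrue(home_button_renders(w, h),
                                    f"home button missing on {name}")

    if __name__ == "__main__":
        unittest.main()

Run this and the small-phone profile fails while the others pass – exactly the kind of per-device breakage that slips through when a new feature is only tested on one screen.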

Whatever the reason behind the Windows 8.1 problems, the fact remains that there were serious failings in the testing strategy. The knock-on effect was that version 8.1 was released with technical defects, which in turn angered and inconvenienced customers, who took to social media to voice their criticism.

Was the negative fall-out worth the corner cutting? We’d say that it wasn’t. Companies compromise quality at their peril.


5 Reasons why ignoring Software Quality could be your downfall

Software failures are costing companies significant amounts of money and damaging their brands; people are losing their jobs, and in some cases their liberty, because of avoidable software failures.
Read our “5 Reasons Why Ignoring Software Quality Could Be Your Downfall”.

Times are tough. You know it, I know it, your competition knows it and your boss definitely knows it. With greater competition, more consumer choice and a greater dependency on technology than ever before, every organisation faces the challenge of building quality IS systems to support rapidly evolving business requirements. New applications need to get out of the door quicker than ever, and you know that the next application upgrade will be hot on their heels. Time is money, and IS departments are constantly looking for ways to cut their application time to market. On average, 40% of total application delivery time is taken up by testing, so it is no wonder that this is an area marked out for special attention.

“Why does testing take so long?” is the question often thrown at the QA department. Well, it’s a highly manual process – even in departments that use automation tools, it is still a time-consuming activity. There is a skill in balancing the required resources with time pressures and software quality. Original Software refers to this as the Quality Conundrum, and it is something that IT management needs to get to grips with. And fast.

You want better quality software? Right, well, it will take more time to get right. You want the application live next week? OK, but we will either need to double the team of testers, or we will only complete 75% of the testing and the quality will be compromised. We don’t have any more budget for extra resources? Well, then there is a decision to be made between an on-time application and a high-quality application – you can’t have both.

A familiar conversation? This is the Software Quality Conundrum!

Ignoring the issues surrounding software quality won’t make them go away. Understanding corporate attitudes, the distinct advantages of getting software quality right, and the problems that can arise when things go wrong is an important first step in the right direction. To help you on this journey, read our “5 Reasons Why Ignoring Software Quality Could Be Your Downfall”.


Top 10 Software Failures of 2011

We came across an interesting blog this week. Its content gives the folks at Original Software multiple examples that make it into our “Software Testing Hall of Shame” as 2011 draws to a close.

Phil Codd, Managing Director at SQS, lists ten software failures, as voted for by the firm’s consultants. We think it’s a great read, as it highlights the damaging effects of inadequate software testing and the ramifications for the company at fault.

So go on folks, have a read, and make sure you make application quality the number 1 priority for your application delivery team in 2012! “Top Ten Software Failures of 2011”

Have a happy holiday season everyone and may 2012 be risk free and healthy for both you and your software.
