South African businesses are losing billions of rands due to poor-quality software. And as the economy tightens and skills remain scarce, this is likely to get worse, forcing them to pay to fix avoidable problems and resulting in stalled growth.
According to the Consortium for Information & Software Quality (CISQ), poor software quality cost US organisations almost US$2.1-trillion in 2020. The consortium found operational software failure to be the leading cost contributor, reaching an estimated $1.56-trillion in 2020, a 22% increase since 2018. The cost of unsuccessful development projects, meanwhile, reached $260-billion in 2020, which represented a 46% increase since the previous estimate two years before.
These are truly staggering numbers, amounting to around 10% of US GDP that year. Unfortunately, South Africa is likely comparable in proportional terms, if not worse.
In an effort to save costs, local companies often fall for the low-quality code trap. Many will either outsource to low-cost development destinations, or they will just throw more and more developers at the problem. When you commoditise software development and make decisions based purely on cost, the casualty is quality. And too few leaders fully understand the implications waiting for them down the road.
Even in a best-case scenario, organisations running on low-quality software will struggle to release new features, impacting their ability to deliver on customer expectations.
If you are constantly fighting fires just to keep software running, there is no way you can expect to stay on schedule. Nor will you be able to respond to changing market conditions or the changing needs of your customers. Even if you manage to avoid critical system outages, you are likely to face complaints from frustrated customers. Companies are learning the hard way that software is as essential to the modern company as the sales function.
Many companies believe they can solve the problems of low-quality code by adding more developers to the team, but this approach often fails.
In one case, a company was on a short deadline to roll out new features for its digital offering while it built its in-house developer capacity. However, it conflated the two objectives and simply kept hiring new developers. After 12 months, it still hadn't launched the features. Throwing people at a problem seldom delivers. What the company should have done is bring in a managed services team to deliver a minimum viable product and get it into the market. It could then have taken the time required to build quality in-house engineering capabilities to take over refining the product's features.
There are even examples of poor-quality software putting the brakes on companies' global expansion plans.
You can't build on creaky foundations. Code has a shelf life and, if it's already poor quality, it will degrade even faster. Not only will it cost you to keep patching it, but while you spend all your efforts on keeping it limping along, you are missing out on the opportunities of building and deploying new revenue streams. The real threat is that technology failures often happen in full public view and can end up costing far more than just a few red faces.
- The author, Sergio Barbosa, is CIO of enterprise software development house Global Kinetic and CEO of its open banking platform, FutureBank