Is software optimization a lost art?

With the rise in AI-generated code, software complexity, and constrained compute, could software optimization be making a comeback?

Software bloat has shifted from a technical annoyance to a measurable business risk. This is according to analyst firm IDC, which believes organizations are now facing a ‘complexity tax’ that directly erodes revenue and stalls innovation.

And the problem looks set to only get bigger. IDC forecasts that by 2028 there will be over one billion net-new applications – that’s in addition to the software that businesses are already struggling to manage.

Almost all of us have noticed apps getting larger, slower, and buggier – we’ve all watched a Chrome window consume a baffling amount of system memory, for example. While performance challenges vary by organization, application, and technology stack, the worst bottlenecks appear to have migrated to the ‘last mile’ of the user experience, says Jim Mercer, program vice president, Software Development, DevOps and DevSecOps at IDC.

Specifically, this means the browser – e.g. the ‘Tab Tax’ – and the integration layers between fragmented SaaS tools, he notes, adding that in the age of AI, inference latency is an increasingly common challenge too.

“Commercial pressure has created a ‘velocity-over-quality’ challenge,” he says. “While architectural decisions and developer skills remain critical, they’re too often compromised by the need to integrate AI and new features at an exponential pace. So, [software bloat’s down to] a lack of due diligence when we should know better.”

In the open source world, the legacy challenge is also adding to bloat, notes Amanda Brock, CEO at OpenUK. “This is inevitably a growing challenge over time. I think it’s harder and harder for open-source projects to cope with the volume of contributions they’re getting, and AI is leading to larger amounts of code getting submitted.

“Without adequate funding and more skilled people, ecosystem maintenance suffers and important hygiene work, like stripping out superfluous code, is routinely deprioritized.”

AI – a double-edged sword

AI tools sit at the core of the debate around software bloat and optimization, seemingly bringing as many downsides as they do benefits.

In the short term, AI-generated code is likely to increase software bloat, as it optimizes for working code rather than efficient code, says Nell Watson, IEEE senior member, author and AI ethics engineer at Singularity University. “It tends toward completeness, handling every conceivable edge case, defensive patterns everywhere.

“There’s no feedback loop where AI learns from runtime performance or user complaints about sluggishness. Generating more code faster with less human scrutiny isn’t a recipe for lean software.”
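To picture the pattern Watson describes, here’s a hypothetical sketch in Rust – invented for illustration, not taken from any real model’s output – contrasting the defensive, over-complete style an assistant might produce for a trivial lookup with a lean hand-written equivalent:

```rust
use std::collections::HashMap;

// AI-style: layered checks and a needless allocation for a simple lookup.
fn get_setting_verbose(map: &HashMap<String, String>, key: &str) -> Option<String> {
    if key.is_empty() {
        return None; // defensive check most callers never need
    }
    if map.is_empty() {
        return None; // redundant: the lookup below already covers this
    }
    match map.get(key) {
        Some(value) => Some(value.clone()), // clones where a borrow would do
        None => None,
    }
}

// Lean equivalent: one line, borrows instead of cloning.
fn get_setting<'a>(map: &'a HashMap<String, String>, key: &str) -> Option<&'a str> {
    map.get(key).map(String::as_str)
}

fn main() {
    let map = HashMap::from([("theme".to_string(), "dark".to_string())]);
    assert_eq!(get_setting_verbose(&map, "theme"), Some("dark".to_string()));
    assert_eq!(get_setting(&map, "theme"), Some("dark"));
}
```

Neither version is wrong, which is exactly the problem: the verbose one works, passes review at a glance, and quietly multiplies across a codebase.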

The somewhat concerning part is that AI bloat is structurally different from traditional technical debt, she points out. Rather than cruft accumulated over time, it usually manifests as systematic over-engineering from day one. “That can make these deeply embedded, and much harder to identify and remove, especially in an era of ‘vibe coded’ systems.”

However, the flipside is that AI makes it far easier to port mission-critical infrastructure to more secure, race-resistant stacks such as those based on Rust, she adds. This is especially true for translating more obscure languages in legacy systems, “which counteracts some of these issues with bloat and maintainability”.
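As a minimal sketch of what ‘race-resistant’ means in practice – a generic Rust example, not tied to any particular migration – consider a counter incremented from several threads. The equivalent unsynchronized C compiles fine and corrupts data at runtime; in Rust, sharing a mutable value across threads without synchronization is rejected at compile time, forcing something like this:

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Sharing a plain `&mut count` across threads would not compile;
    // the borrow checker demands explicit synchronization instead.
    let count = Arc::new(Mutex::new(0u64));

    let handles: Vec<_> = (0..4)
        .map(|_| {
            let count = Arc::clone(&count);
            thread::spawn(move || {
                for _ in 0..1_000 {
                    *count.lock().unwrap() += 1; // guarded increment
                }
            })
        })
        .collect();

    for handle in handles {
        handle.join().unwrap();
    }
    assert_eq!(*count.lock().unwrap(), 4_000); // deterministic, every run
}
```

The guarantee is structural: because the unsynchronized version never compiles, a whole class of legacy concurrency bugs cannot survive the port.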

Data center challenges will reshape software design

In the long term, the growth of AI is expected to lead to new tools that will help streamline software applications. “New markets for code modernization and review will help reduce maintainability concerns,” says Jim Scheibmeir, VP analyst at Gartner. “However, this means more tools in the toolbox and additional costs for modern software engineering licenses and subscriptions,” he adds.

Software optimization has become even more important due to the recent RAM price crisis, driven by surging hardware demand from the AI and data center buildout. Though the price increases may be levelling out, RAM is now far more expensive than it was mere months ago. This is likely to shift practices and behavior, Brock explains:

“When I was in China late last year, there were conversations around how developers there are obsessed with efficiency in the way they write code and build AI, largely because they have less access to compute.

“I think that’s going to have a real impact on how things play out. If you look at the bigger picture, we’re now in an arms race around data centers and getting enough compute and – crucially – enough power to run them. That’s going to be hugely problematic.

“Because of that, the leaner and more efficient the software is, the better it’s going to perform. The same applies to AI, and we’ll definitely see more of a shift towards small language models (SLMs). I think data center capacity, and the limits around energy and infrastructure, are going to be a key driver in shaping what our software looks like going forward.”
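To make the efficiency obsession Brock describes concrete, here’s a generic, invented Rust sketch of one allocation-conscious habit – reusing a single buffer across a hot loop instead of allocating per item:

```rust
// Invented workload, purely for illustration.
fn process_lines(lines: &[&str]) -> usize {
    let mut buf = String::new(); // one allocation, reused every iteration
    let mut total = 0;
    for line in lines {
        buf.clear(); // keeps capacity; frees nothing
        buf.push_str(line.trim());
        buf.make_ascii_uppercase();
        total += buf.len();
    }
    total
}

fn main() {
    let lines = ["  alpha  ", "beta", "  gamma"];
    // A fresh String per line would hit the allocator three times;
    // reusing `buf` touches it once.
    println!("{}", process_lines(&lines));
}
```

Trivial on its own, habits like this compound across millions of requests – which is precisely the kind of discipline scarce RAM and compute reward.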

The importance of software hygiene

Looking forward, Brock believes that software hygiene is only going to become more important, “and we’re really only just starting to look at it,” she says.

“I know people who are rewriting parts of code to make them more secure in Rust, or to run slimmer, leaner operating systems so they can deploy data centers in emerging markets like Africa. If you have the right OS, you can reduce a data center down to the size of a cupboard and roll it out very quickly.”

Security will play a role too, particularly with the growing data sovereignty debate and concerns about bad actors, she notes. Leaner, neater, shorter software is simply easier to maintain – especially when you discover a vulnerability and are faced with working through a massive codebase.

“Questions around security, sustainability, access to technology, the efficiency, cost and availability of compute, and data center capacity and energy all feed back to the same point. They ultimately push upstream to the software itself – how it’s written and how it’s maintained. I think pressure to streamline software will be universal.”

To de-risk your software estate and protect performance over the coming years, Mercer recommends using platform engineering to help with governance. “Also look towards software normalization and vendor consolidation, and consider unified platforms where possible,” he says.

It also comes down to embedding good practices, Brock adds. “It’s about creating internal discipline among users and project creators: being conscious about not adding superfluous software and making the time to apply real rigor in removing historic bloat. That means recognizing that this genuinely is an issue,” she concludes.

Keri Allan

Keri Allan is a freelancer with 20 years of experience writing about technology and has written for publications including the Guardian, the Sunday Times, CIO, E&T and Arabian Computer News. She specialises in areas including the cloud, IoT, AI, machine learning and digital transformation.