Only 13% of firms are tracking their AI energy usage – here’s why that’s a problem
While firms are rushing to adopt AI, they're not keeping on top of the associated power costs
While business leaders say they're concerned about the power demands of AI within their organization, few are managing to monitor it properly.
Seven in ten business leaders say they're aware of the significant energy required to train or run AI models, and half are concerned about the energy and efficiency challenges this brings.
Yet despite this, just 13% are monitoring the power consumption of their AI systems.
Only six in ten acknowledge that energy efficiency will play a crucial role in future strategic planning, driven by both cost management imperatives and operational scalability concerns.
Rodrigo Liang, CEO of SambaNova Systems, said the study paints a stark picture of AI adoption, with firms rushing to embrace the technology while failing to manage its energy impact.
"Without a proactive approach to more efficient AI hardware and energy consumption, particularly in the face of increasing demand from AI workflows, we risk undermining the very progress AI promises to deliver,” Liang said.
“By 2027, my expectation is that more than 90% of leaders will be concerned about the power demands of AI and will monitor consumption as a KPI that corporate boards will track closely."
Among those organizations that have widely deployed AI, more than three-quarters are actively seeking to reduce power usage. Popular approaches include hardware and software optimization (40%), adopting energy-efficient processors (39.3%), and investing in renewable energy (34.9%).
For one-fifth of companies, rising power costs are a pressing issue, with 37% experiencing increasing stakeholder pressure to improve AI energy efficiency, and a further 42% expecting these demands to emerge soon.
However, while seven in ten leaders recognize the energy-intensive nature of training large language models (LLMs), only six in ten are aware of the significant power demands of inference.
This highlights a critical gap, researchers said, as inference workloads are set to dominate AI usage as agentic AI scales.
Similarly, the excessive power consumption and prohibitive costs associated with current GPU-based solutions are likely to force many enterprises to seek more efficient alternatives.
This has the potential to fundamentally change the AI hardware landscape to favor solutions that deliver high performance without unsustainable energy demands.
"The rapid pace of AI adoption underscores a critical need for enterprises to align their strategies with the power requirements of AI deployment," said Liang. "As businesses integrate AI, addressing energy efficiency and infrastructure readiness will be essential for long-term success."
Late last year, the International Energy Agency (IEA) found that interactions with solutions like ChatGPT use ten times more electricity than a standard Google search.
Training a large language model uses nearly 1,300 MWh of electricity, the agency found, roughly the annual consumption of about 130 US homes.
And if ChatGPT were integrated into the nine billion searches carried out each day, the electricity demand would increase by 10 terawatt-hours per year.
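The arithmetic behind these figures can be sanity-checked with a quick calculation. The per-query and per-household energy values below (~3 Wh per ChatGPT query, ~10 MWh per US household per year) are commonly cited estimates assumed for illustration, not figures from this article:

```python
# Back-of-the-envelope check of the IEA numbers quoted above.
# Assumed inputs (not from the article):
#   ~3 Wh per ChatGPT-style query, roughly ten times the
#   ~0.3 Wh often cited for a standard Google search
#   ~10 MWh annual electricity use for a typical US home
CHATGPT_WH_PER_QUERY = 3.0
SEARCHES_PER_DAY = 9e9  # nine billion daily searches

# Annual demand if every search ran at ChatGPT's energy intensity
annual_twh = CHATGPT_WH_PER_QUERY * SEARCHES_PER_DAY * 365 / 1e12
print(f"{annual_twh:.1f} TWh/year")  # ~9.9, in line with the 10 TWh figure

# Training: 1,300 MWh measured against ~10 MWh per US home per year
homes_equivalent = 1300 / 10
print(f"~{homes_equivalent:.0f} US homes")
```

At these assumed rates the totals land close to the IEA's headline numbers, which suggests the agency's estimate is built on per-query figures of this order.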
Emma Woollacott is a freelance journalist writing for publications including the BBC, Private Eye, Forbes, Raconteur and specialist technology titles.
