OpenAI could fast become a money pit for investors
OpenAI will soon have to compete on a more level footing with others in the space, after its year-long head start
OpenAI has become one of the most recognized names in the field of AI, and the company’s valuation has only grown since the release of ChatGPT nearly one year ago. But it’s clear that the company has a long way to go before it can become self-sufficient - or even turn a profit.
As one of the first firms to make generative AI models publicly available, OpenAI has enjoyed a year in the sun in which it raised a reported $10 billion investment from Microsoft and released a number of popular generative AI services.
In October, The New York Times reported that OpenAI was in active talks on a deal that could balloon the company’s value to upwards of $80 billion.
Yet for all its success so far, OpenAI has continued to operate at a loss, even as the billions of dollars Microsoft has poured into the company go toward the hardware used to train LLMs like GPT-4.
Sam Altman, CEO at OpenAI, has previously complained about the immense cost of running services such as ChatGPT. In the past year, the company has released a number of subscription tiers, such as ChatGPT for Business and GPT-4 access for ChatGPT Plus subscribers, to limit usage and recoup some of its computing costs.
Building generative AI platforms is costly and this has been a key factor behind the relatively slow pace of on-prem AI uptake compared to the meteoric rise of AI platforms in the public cloud. Hyperscalers can afford to shell out the immense investment necessary to stockpile the GPUs and CPUs necessary for immense AI workloads.
Nvidia’s newest GH200 Grace Hopper chips were designed with tomorrow’s generative AI models in mind, but will be a costly investment to say the least. Though Nvidia hasn’t stated how much it costs, the chip can vastly outperform Nvidia’s H100 range, which costs $40,000 apiece.
While Nvidia continues to cash in on demand for AI chips, threatening the dominance of AWS in the process, this upwards trend in model size isn’t nailed on. If we’re headed into an era in which trillion-parameter LLMs can consistently outperform those we have today, OpenAI will be locked into investment with Nvidia, or a competitor in the space such as AMD.
Reports have suggested that OpenAI is investing in its own silicon in an attempt to reduce its dependence on volatile chip supplies, but this is a long-term solution to the problem and would put the firm in an uncertain position against those with decades of experience in the semiconductor sector.
AI setup is a pain point for businesses looking to take advantage of the technology, with the costs described as “prohibitive” by analysts. It could be that after the boom of the past year, firms look to reassess their stake in AI, and either stick with public AI or double down and invest more heavily in private solutions.
OpenAI could make either work, but in its current shape it would much prefer customers to stay loyal to the public cloud. Its models are the centerpiece of Azure’s AI infrastructure, and its current appeal helps Microsoft almost as much as Microsoft’s funding plugs the holes in OpenAI’s business model.
Smaller models deployed at the edge may well be the future, and OpenAI would certainly be able to capitalize on selling digestible access to its own models to run in this off-the-shelf manner. But this would also force it to compete more aggressively with the likes of Meta’s Llama 2 and yield its significant Azure advantage.
Altman’s ambition of creating artificial general intelligence (AGI), a system that can realistically rival the intelligence of humans, is so far little more than an expensive pipe dream. If he carries on down that path, other firms with substantial hyperscaler backing such as Anthropic – into which AWS has poured $4 billion – could seize the lead for enterprise AI.
None of this points to certain failure for the firm. OpenAI could - and in all likelihood will - continue to grow in valuation, with Microsoft’s heavy backing and reliance on the firm’s models for its AI assistants such as 365 Copilot, Copilot for Windows, and GitHub Copilot Enterprise.
Google DeepMind turned its first profit in 2020, a decade after it was founded.
At the time, the firm’s main expenses were in staff salaries. But as the world of AI has become more focused on intensive training of foundation models, computing and hardware investment has become a more serious drain on finances.
If it’s not careful, OpenAI could quickly become something of a passion project for Microsoft.

Rory Bathgate is Features and Multimedia Editor at ITPro, overseeing all in-depth content and case studies. He can also be found co-hosting the ITPro Podcast with Jane McCallion, swapping a keyboard for a microphone to discuss the latest learnings with thought leaders from across the tech sector.
In his free time, Rory enjoys photography, video editing, and good science fiction. After graduating from the University of Kent with a BA in English and American Literature, Rory undertook an MA in Eighteenth-Century Studies at King’s College London. He joined ITPro in 2022 as a graduate, following four years in student journalism. You can contact Rory at rory.bathgate@futurenet.com or on LinkedIn.

