OpenAI launches ChatGPT API for businesses at competitive price


OpenAI has made its API for ChatGPT generally available, based on a cheaper model that allows developers to easily call on the powerful generative AI for in-app usage.

The ChatGPT API accesses a model known as ‘gpt-3.5-turbo’, the same model used for the ChatGPT web product. Developers can interact with it via a simple endpoint and use it for any of the tasks that ChatGPT is capable of undertaking, from within their website or application.
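Calling the endpoint amounts to a single authenticated HTTP POST. A minimal sketch using only Python's standard library (the URL and request shape follow OpenAI's published chat completions API; `build_chat_request` is an illustrative helper, not part of any SDK):

```python
import json
import urllib.request

API_URL = "https://api.openai.com/v1/chat/completions"

def build_chat_request(api_key: str, user_message: str) -> urllib.request.Request:
    """Build an HTTP request against the ChatGPT API's chat completions endpoint."""
    payload = {
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": user_message}],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )

# Sending the request requires a real API key and network access:
# response = urllib.request.urlopen(build_chat_request("sk-...", "Hello!"))
# print(json.loads(response.read())["choices"][0]["message"]["content"])
```

The `messages` list is the key difference from older completion-style endpoints: conversation history is passed as a sequence of role-tagged messages rather than a single prompt string.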

Thanks to OpenAI’s agreement with Microsoft, the model runs on Azure compute infrastructure which connects to user endpoints. This allows it to run independently of server load on the ChatGPT website, offering businesses a dedicated lane for AI processing.

The API will cost firms $0.002 (£0.0017) per 1,000 tokens, roughly 750 words, making it ten times cheaper than OpenAI’s existing GPT-3.5 models.
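At that rate, estimating a workload’s cost is simple arithmetic; a quick sketch using the per-token price and the rough 750-words-per-1,000-tokens ratio quoted in the announcement:

```python
PRICE_PER_1K_TOKENS = 0.002  # USD, gpt-3.5-turbo API pricing
WORDS_PER_1K_TOKENS = 750    # rough approximation

def api_cost_usd(tokens: int) -> float:
    """Estimated pay-as-you-go cost in USD for a given number of tokens."""
    return tokens / 1000 * PRICE_PER_1K_TOKENS

def approx_words(tokens: int) -> float:
    """Rough word count represented by a given number of tokens."""
    return tokens / 1000 * WORDS_PER_1K_TOKENS

# A ~750-word exchange (~1,000 tokens) costs about $0.002;
# a million tokens (~750,000 words) costs about $2.
print(api_cost_usd(1_000), api_cost_usd(1_000_000))
```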

OpenAI stated that this was made possible by a 90% cost reduction in ChatGPT since December, without going into details on how this was achieved.

Enterprise customers seeking reliable access to the model can also purchase dedicated instances, in an agreement in which OpenAI will allocate Azure compute infrastructure solely for the customers’ use.

OpenAI has stated that this arrangement may be the most economical for developers expecting requests in excess of 450 million tokens per day, and can be agreed via direct contact with the company.
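That threshold is easy to sanity-check: at $0.002 per 1,000 tokens, 450 million tokens per day works out to around $900 per day in pay-as-you-go spend, the point at which a dedicated instance may become the cheaper option.

```python
DAILY_TOKENS = 450_000_000   # OpenAI's stated break-even threshold
PRICE_PER_1K = 0.002         # USD per 1,000 tokens

daily_payg_cost = DAILY_TOKENS / 1000 * PRICE_PER_1K
print(f"${daily_payg_cost:,.0f} per day")  # $900 per day
```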

OpenAI admitted that it has not met its own targets for delivering a stable service since December, but that it is committed to achieving this over time.

“For the past two months our uptime has not met our own expectations nor that of our users,” read the blog post.

“Our engineering team’s top priority is now stability of production use cases - we know that ensuring AI benefits all of humanity requires being a reliable service provider. Please hold us accountable for improved uptime over the upcoming months!”

Data processed through the API is not used for model training or other service improvements unless the organisation chooses to opt in. OpenAI has also shelved its ‘pre-launch review’ policy, which had required developers to flag what they used the model for before it could be integrated within their app.

“Data submitted to the OpenAI API is not used for training, and we have a new 30-day retention policy and are open to less on a case-by-case basis,” tweeted Sam Altman, CEO at OpenAI.


“We've also removed our pre-launch review and made our terms of service and usage policies more developer-friendly.”

The latest model, gpt-3.5-turbo-0301, will be supported until June, and a new stable release of gpt-3.5-turbo is expected in April.

Developers will be given the choice to adopt stable models or specific models according to their needs.

Popular apps already using the ChatGPT API include Shop, Shopify’s commerce app, which has implemented it to provide more accurate in-app searches and personalised product suggestions for users.

Whisper, OpenAI’s speech recognition system launched in September 2022, was also made available via API at $0.006 (£0.005) per minute of transcribed audio. The open-source model is capable of both transcribing audio and translating speech into English, and can process a number of common audio and video file types.
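As with the ChatGPT API, Whisper’s per-minute pricing makes cost estimation straightforward; a quick sketch of the arithmetic (the helper is illustrative):

```python
WHISPER_PRICE_PER_MINUTE = 0.006  # USD per minute of transcribed audio

def transcription_cost_usd(audio_seconds: float) -> float:
    """Estimated Whisper API cost in USD for a clip of the given length."""
    return audio_seconds / 60 * WHISPER_PRICE_PER_MINUTE

# One hour of audio costs about $0.36 to transcribe.
print(transcription_cost_usd(3600))
```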

Through the API, developers can leverage the new ‘large-v2’ model for Whisper, which brings speed and quality improvements to its output.

The API offering represents another step towards the full monetisation of ChatGPT, a vital task in the wake of reports that OpenAI’s models are too expensive to run without sizeable income. Altman himself described the firm’s costs as “eye-watering” in December 2022.

Much of this funding may have now been secured through Microsoft’s $10 billion investment in OpenAI, which cemented the dominance of both firms within the AI market. ChatGPT has an ever-expanding presence on Azure and the Redmond giant’s decision to integrate GPT-3.5 into Bing and Edge puts it on competitive footing against Google.

Separate from its influential investors, February saw OpenAI launch its paid ChatGPT Plus tier in the US, offering subscribers faster response times, stable access, and priority updates for $20 (£16) per month.

Some have questioned the cost of ChatGPT Plus in the wake of the API announcement, particularly given that the API allows access to the GPT-3.5 large language model (LLM) at a fraction of that price.

“I hope this pricing impacts ChatGPT+,” wrote a user on the Y Combinator forums.

“$20 is equivalent to what, 10,000,000 tokens? At ~750 words/1k tokens, that’s 7.5 million words per month, or roughly 250,000 words per day, 10,416 words per hour, 173 words per minute, every minute, 24/7. I do not have that big of a utilisation need. It’s kind of weird to vastly overpay.”

Rory Bathgate
Features and Multimedia Editor

Rory Bathgate is Features and Multimedia Editor at ITPro, overseeing all in-depth content and case studies. He can also be found co-hosting the ITPro Podcast with Jane McCallion, swapping a keyboard for a microphone to discuss the latest learnings with thought leaders from across the tech sector.

In his free time, Rory enjoys photography, video editing, and good science fiction. After graduating from the University of Kent with a BA in English and American Literature, Rory undertook an MA in Eighteenth-Century Studies at King’s College London. He joined ITPro in 2022 as a graduate, following four years in student journalism. You can contact Rory on LinkedIn.