ChatGPT has cut through to consumers and businesspeople alike: most could answer "what is ChatGPT" on some level, though under the surface there are many more questions about ChatGPT that may need answering. Since its release on November 30, 2022, ChatGPT has become a byword for generative AI, and its developer, OpenAI, has increasingly pitched it as relevant for business use.
With the generative AI market set to expand to $109 billion (approximately £91.4 billion) by 2030, according to Grand View Research, these tools are becoming increasingly important in the enterprise landscape. Usage of the tool remains strong with no signs of slowing, even as competitors such as Amazon Q, Google Bard, and Anthropic’s Claude 2.1 have entered the market.
What is ChatGPT?
ChatGPT is OpenAI’s large language model (LLM) chatbot. It is powered by the model GPT-3.5 for the free tier, while paid ChatGPT subscribers can use a version powered by OpenAI’s flagship model GPT-4. The company calls ChatGPT a “sibling” of its InstructGPT model, which is the default model for OpenAI’s API.
ChatGPT’s base model was trained on vast quantities of data scraped from across the web, including entire web pages and books. This was then fine-tuned using reinforcement learning from human feedback (RLHF). That process, from the organization’s perspective, was about ensuring responses were as relevant as possible, reducing bias, and cutting down on untruthful output, otherwise known as AI ‘hallucinations’.
“We needed to collect comparison data, which consisted of two or more model responses ranked by quality,” OpenAI says. “To collect this data, we took conversations that AI trainers had with the chatbot. We randomly selected a model-written message, sampled several alternative completions, and had AI trainers rank them. Using these reward models, we can fine-tune the model using Proximal Policy Optimization.”
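The comparison data OpenAI describes can be pictured as rankings of alternative completions that get expanded into pairwise preferences for reward-model training. The prompt and completions below are invented for illustration; OpenAI has not published its actual training data or code, so this is only a sketch of the data format the quote implies.

```python
from itertools import combinations

# Hypothetical example: several model completions for one prompt,
# ordered best-first by a human trainer. (Invented for illustration.)
prompt = "Explain photosynthesis."
ranked = ["detailed, accurate answer", "short answer", "off-topic answer"]

def to_preference_pairs(ranked_completions):
    """Expand a best-first ranking into (preferred, rejected) pairs,
    the form of comparison data a reward model can be trained on."""
    return list(combinations(ranked_completions, 2))

pairs = to_preference_pairs(ranked)
# Three ranked completions yield three pairwise preferences.
```

A reward model trained on such pairs scores new completions, and that score is what an algorithm like Proximal Policy Optimization then maximizes during fine-tuning.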
What does ChatGPT stand for?
While the ‘Chat’ part of the tool’s name is self-explanatory, ‘GPT’ stands for ‘generative pre-trained transformer’. This is the specific neural network framework used for generative AI models that conform to the transformer architecture.
GPT models break natural language prompts down into numerical representations of the words known as vectors. These are used to process the context of the sentence and map the connections between the words it contains, which in turn inform the output. In simple terms, the model predicts the most likely words to come next in the sentence based on its training.
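That core idea of next-word prediction can be illustrated with a toy model. The sketch below is a drastic simplification: it counts which word follows each word in a tiny "training" corpus, whereas GPT models use learned transformer weights over token vectors. It only demonstrates the prediction principle, not how ChatGPT actually works.

```python
from collections import Counter, defaultdict

# Tiny "training corpus" for the toy model.
corpus = "the cat sat on the mat and the cat slept".split()

# Count which word follows each word.
follow_counts = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    follow_counts[word][nxt] += 1

def predict_next(word: str) -> str:
    """Return the word most often seen after `word` in the corpus."""
    return follow_counts[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" ("cat" follows "the" twice, "mat" once)
```

A real LLM replaces these raw counts with probabilities computed by a neural network over the entire preceding context, but the output step is the same: pick a likely next token, append it, repeat.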
What are the potential use cases for ChatGPT?
ChatGPT is capable of producing detailed text output based on a prompt, as well as creating and modifying code, answering queries using the internet, and summarizing long content. ITPro used it to produce some astoundingly bad Christmas cards in 2022, but the models behind it have been gradually improved since then, and ChatGPT has found its way into business in a major way.
Since its launch, OpenAI has released more versions of ChatGPT aimed at business and enterprise use. ChatGPT Enterprise is a plan for the tool that comes with AES-256 encryption for data passed to it, as well as more advanced data analysis capabilities.
For one interesting example of an area in which ChatGPT can flourish, we can look at research coming out of Drexel University. A study released in December 2022 found the same natural language processing (NLP) techniques ChatGPT uses can be deployed to identify Alzheimer’s patients.
How much does ChatGPT cost to run?
Although OpenAI hasn’t publicly revealed its cost per query, OpenAI CEO Sam Altman (who left OpenAI for Microsoft in November 2023 only to return days later) previously stated in a post on X that “the compute costs are eye-watering”.
In January 2023, Microsoft announced a $10 billion investment in OpenAI, and the hyperscaler has since made OpenAI’s GPT-4 a core part of its Copilot range of AI assistants, including Copilot for Microsoft 365 and Copilot Studio.
For OpenAI’s part, the company has released detailed pricing outlines for its models. Those wishing to run inference on GPT-3.5 Turbo can expect to pay $0.0010 per 1,000 input tokens, while those who want to access the power of GPT-4 will pay the higher price of $0.03 per 1,000 input tokens.
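To get a feel for what per-token pricing means in practice, here is a minimal cost calculation, assuming GPT-3.5 Turbo at $0.0010 and GPT-4 at $0.03 per 1,000 input tokens. OpenAI's rates change frequently, so treat these figures as illustrative and check the official pricing page for current numbers.

```python
# Assumed per-1,000-input-token rates in USD (illustrative only;
# OpenAI revises its pricing regularly).
RATES_PER_1K_INPUT = {
    "gpt-3.5-turbo": 0.0010,
    "gpt-4": 0.03,
}

def input_cost(model: str, tokens: int) -> float:
    """Cost in USD for `tokens` input tokens on `model`."""
    return RATES_PER_1K_INPUT[model] / 1000 * tokens

# Example: a 5,000-token prompt.
print(round(input_cost("gpt-3.5-turbo", 5000), 4))  # 0.005
print(round(input_cost("gpt-4", 5000), 2))          # 0.15
```

At these rates a single long prompt costs fractions of a cent on GPT-3.5 Turbo but thirty times more on GPT-4, which is why model choice matters at enterprise scale.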
Why is ChatGPT so controversial?
Alarm bells are ringing across the world, from the creative sector to academia, given the capacity for tools like ChatGPT and DALL-E to mimic human creativity. Academics and researchers, in particular, are worried about the prospect of plagiarism. But there are more controversies beyond this.
There is every indication that AI could be a legal nightmare for years to come, and as the model at the forefront of public attention, ChatGPT has already been subject to criticism and concern. Italian regulators temporarily banned ChatGPT over demands that OpenAI include a ‘right to be forgotten’ for users of the tool, and the EU AI Act aims to protect intellectual property from AI in a way that could threaten the web-scraping basis of ChatGPT altogether.
Corporate objections to ChatGPT have also been raised. In May 2023, Apple banned ChatGPT within its offices, citing concerns that employees could input sensitive data that could then be leaked, and 80% of C-suite executives share these concerns.
Clearly there are still bugs to work out. Research from Purdue University found that more than 50% of ChatGPT’s programming answers were incorrect and warned against overreliance on the chatbot when it came to assessing code.
ChatGPT has an incredibly compelling sales pitch, but it’s still only the beginning for generative AI. Questions around ethical AI will continue to dominate the conversation, especially as the underlying technology evolves and AI-linked job cuts become a risk. A recent study by Boston Consulting Group (BCG) suggested that overreliance on ChatGPT could harm worker performance and these concerns will only become more apparent and relevant as the technology beds in.
John Loeppky is a British-Canadian disabled freelance writer based in Regina, Saskatchewan. His work has appeared for the CBC, FiveThirtyEight, Defector, and a multitude of others. John most often writes about disability, sport, media, technology, and art. His goal in life is to have an entertaining obituary to read.