
What is GPT-4 and what does it mean for businesses?

The next generation of the OpenAI framework - GPT-4 - might change the face of language modelling

The scale and power of artificial intelligence (AI) is improving exponentially, with businesses increasingly able to access an array of cutting-edge tools to implement across their organisations. OpenAI, the AI research lab, is at the heart of much of this innovation, and the recently released DALL-E 2 image generation platform shows just how powerful AI tools are becoming. For years, OpenAI has also been working on text generation in the form of the Generative Pre-trained Transformer (GPT), an autoregressive language model that uses deep learning to produce human-like text, and its latest iteration, GPT-4, is on the verge of launch.

It’s been two years since GPT-3 was launched, with this neural network applying machine learning to vast streams of internet data to generate almost any type of text, on cue. Fascinatingly, only a small amount of input text is needed to create reams of pertinent, high-quality machine-generated text. With OpenAI making its GPT-3 AI model available to everyone towards the end of last year, thoughts have now turned to its successor. The spotlight is now on the next generation of the language model, known as GPT-4, and the massive potential it presents to both businesses and the wider community.

The model draws on over 175 billion machine learning parameters, which operate as weightings; they are the parts of the model learned from the training data fed into it. This dwarfs GPT-2, which uses 1.5 billion. According to Towards Data Science, GPT-4 will have a monstrous 100 trillion parameters. Compare that with the human brain, which has roughly 86 billion neurons, and you get a sense of the sheer scale of the model. While we’re not suggesting GPT-4 will be as powerful as the human brain, it could have uses beyond GPT-3’s approach. For example, Oliver Fokerd, a back-end developer at Hallam, says in addition to these trillions of parameters, the input will allow “more symbols (roughly counted as words), so that much larger bodies of text will be consumed and generated”.
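To make the idea of a “parameter” concrete: each one is simply a learned number. As a minimal, purely illustrative sketch (this is a toy fully connected network, not GPT’s actual architecture), the parameter count of a small model can be tallied like this:

```python
# Illustrative only: a "parameter" is just a learned number (a weight
# or a bias). Large language models stack billions of such numbers.

def dense_layer_param_count(n_inputs: int, n_outputs: int) -> int:
    """Weights (n_inputs * n_outputs) plus one bias per output."""
    return n_inputs * n_outputs + n_outputs

# A toy two-layer network mapping 4 inputs -> 64 hidden units -> 3 outputs
total = dense_layer_param_count(4, 64) + dense_layer_param_count(64, 3)
print(total)  # 515 parameters; GPT-3 has roughly 175 billion of these
```

Scaling the same counting exercise up by many orders of magnitude is, in essence, what separates GPT-2, GPT-3 and the rumoured GPT-4.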

Meanwhile, according to an interview with OpenAI’s chief scientist, Ilya Sutskever, such language models will start to become aware of the visual world. “Text alone can express a great deal of information about the world, but it is incomplete, because we live in a visual world as well. The next generation of models will be capable of editing and generating images in response to text input, and hopefully they’ll understand text better because of the many images they’ve seen,” he says.

What GPT-4 means for language modelling

OpenAI has never preannounced launch dates, and has always been quite tight-lipped about features or new releases of GPT-X. According to Peter van der Putten, director of Pegasystems’ AI Lab, and assistant professor in AI at Leiden University, language models have been growing bigger and bigger. Although GPT-3 has around 175 billion parameters, newer dense models such as Megatron Turing NLG and Google’s PaLM have more than 500 billion parameters. “Larger is not always better," he continues, "and GPT-4 might be more focussed on making better use of resources and providing better functionality."

OpenAI has already released improved versions of GPT-3. For example, the InstructGPT models do a better job of understanding user intent – the task the user wants to perform – and of following explicit instructions. It has also released Codex, the GPT-based model that generates source code, along with new functions to edit or insert content in code or text.
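The shift from pattern-continuation prompting to instruction-following can be sketched roughly as follows. The `openai` package and the model name are assumptions based on the public API at the time of writing, and the prompts themselves are invented for illustration:

```python
# A plain completion model is steered by showing it a pattern to continue
# (few-shot prompting): the model is expected to carry on the pattern.
few_shot_prompt = (
    "English: cheese\nFrench: fromage\n"
    "English: bread\nFrench: pain\n"
    "English: apple\nFrench:"
)

# An instruction-tuned model (e.g. InstructGPT) can simply be told what to do.
instruction_prompt = "Translate 'apple' into French."

# Either prompt would be submitted the same way, for example:
# import openai
# openai.Completion.create(model="text-davinci-002",
#                          prompt=instruction_prompt, max_tokens=10)
print(few_shot_prompt)
print(instruction_prompt)
```

The API call is left commented out since it requires an account and API key; the point is the contrast between the two prompt styles.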

“Also, investments have been made into AI safety, for example by better flagging and generation of potentially toxic content," van der Putten continues. "So, I would actually expect the developments of core technology to be more along these lines than just publishing yet another larger language model."

GPT-4 vs GPT-3

The most immediate and astonishing difference between GPT-4 and its predecessor is expected to be scale, with reports suggesting it could use 100 trillion machine learning parameters versus the 175 billion used in the current model.

While GPT-4 may have far more parameters than GPT-3, the industry is also moving away from the notion that bigger is better. Finastra’s head of artificial intelligence and machine learning, Adam Lieberman, says he hopes to see a less sizeable increase in parameters and model size in future.


“We do expect an increase in compute but hope to see enhanced multi-tasking from few-shot learning. With GPT-3, the community saw a lot of success with solid prompting, and we hope to see GPT-4 have more robustness for human-made errors in prompting,” he adds.

Fokerd agrees, saying that while GPT-3 enabled users to input natural language, it still took a degree of skill to craft a prompt in a way that would give good results. “GPT-4 will be much better at inferring users’ intentions,” he adds.

OpenAI will also be hoping that many of the shortcomings of GPT-3 will be ironed out in the next generation of its model. Its predecessor, GPT-2, was initially considered too dangerous to release to the public after it was found to generate convincing fake news stories, while GPT-3 itself has been accused of exhibiting bias against specific religions and genders.

A more recent version, released earlier this year, uses reinforcement learning from human feedback (RLHF), in which human helpers known as labellers guide the AI’s learning. Such techniques, and more, are almost certainly being used in the development of GPT-4 to ensure it avoids the criticisms levelled at its predecessors.
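RLHF itself is far more involved than can be shown here, but the core idea – score candidate outputs with a reward model trained on labeller preferences, then favour the highest-scoring candidate – can be caricatured in a few lines. Everything below, including the keyword-based “reward model”, is a stand-in for illustration only:

```python
# Heavily simplified sketch of the RLHF idea: labellers rank candidate
# outputs, a "reward model" learns to imitate those rankings, and the
# language model is steered towards outputs the reward model prefers.

def reward_model(text: str) -> float:
    """Stand-in for a learned model of labeller preferences."""
    score = 0.0
    if "please" in text.lower():
        score += 1.0   # labellers preferred polite phrasing
    if "idiot" in text.lower():
        score -= 5.0   # labellers penalised toxic phrasing
    return score

def pick_best(candidates: list[str]) -> str:
    """Best-of-n selection: keep the candidate the reward model prefers."""
    return max(candidates, key=reward_model)

candidates = [
    "You idiot, read the manual.",
    "Please see the manual for setup steps.",
]
print(pick_best(candidates))  # the polite, non-toxic candidate wins
```

In the real technique the reward model is itself a neural network trained on human comparisons, and its scores drive fine-tuning of the language model rather than simple best-of-n selection.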

What are the business benefits of GPT-4? 

Lieberman says from code completion to finding tax deductions, GPT-3 showed the community it meant business. The advent of GPT-4 will feed into the growing understanding that AI is becoming less clunky and more humanised.

“With a new and improved version of our GPT language model, we expect to see enhanced use cases across many different domains leveraging the power of language modelling. Use cases where GPT-3 performed sub-optimally have a second shot at the free throw line and we are excited to see all the new use cases that will emerge,” he adds.

For Fokerd, internet users will more likely see a lot more AI-generated content with the advent of GPT-4. “This already happens, but there will likely be an explosion of its usage, enabled by better results. Cyber criminals will inevitably start to make use of the technology, too, making it more difficult to differentiate certain communications.”

For businesses, the benefits will be seen in less time spent on day-to-day content creation, plus the ability to produce copy – such as essays and full articles – that was previously very difficult or impossible to automate.

“The plethora of writing-aid apps available will be able to take even more of the burden away from writers but the flipside to this is that plagiarism will be harder to spot or to prove: with all the automated copy flying around, it could become a more common job to be a proofreader than a copywriter,” adds Fokerd.

Van der Putten says the initial high expectations of an autonomous creative AI may give way to assisted, augmented intelligence, much like how AI systems are being used in cyber security. Such a system would work alongside the human, not just helping them write code faster or craft better emails, but also helping them arrive at better ideas by providing suggestions.

“This will be a much better fit than using the output of GPT-3 in some automated fashion; we are keeping the human in the loop. It will make our life easier and lower our effort, both factors that have been proven to be crucial in the adoption of any new technology,” he adds.
