What is an NPU and what can they do for your business?
What are NPUs, how do they work, and why do 2024's business laptops have them?


Although we call the processors inside laptops and PCs "CPUs", that's far too simplistic. Inside each central processing unit sits a complex web of silicon comprising multiple chips, each with a set job to do. Now, a new type of chip has entered the frame: the neural processing unit, or NPU.
Well, a new-ish type of chip. The NPU has actually been around for some years. So what is an NPU? Why are they suddenly big news? And what difference could they make to business laptops in 2024?
What is an NPU?
Neural processing units are designed for one task: to run AI-related jobs locally on your computer. Take the simple example of blurring the background in video calls. It's perfectly possible for the main processor (or the graphics chip) in your computer to detect the edges of your face, hair, and shoulders, and then blur the background. But this is a task best handled by an NPU.
The obvious follow-up question: why is an NPU better at this job? The main reason is that it's designed to run multiple threads in parallel and it thrives on large amounts of data. Video fits snugly into that description. So rather than a general-purpose processor – that is, your CPU – needing to solve the problem using algorithms and sheer brute force, it can offload the task to the NPU.
And because the NPU, with its lower power demands, is taking care of this task, it both frees up the CPU for other tasks and cuts down on the energy required.
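To make that concrete, here is a minimal sketch (not taken from this article) of how an application might offload an inference task such as background segmentation to an NPU using ONNX Runtime's execution providers, falling back to the CPU when no NPU is present. The model file name, input shape, and choice of providers are assumptions for illustration only.

```python
import numpy as np
import onnxruntime as ort

# Ask ONNX Runtime which execution providers this machine supports, then
# prefer an NPU-backed one (AMD Ryzen AI exposes Vitis AI, Intel's NPU is
# reachable via OpenVINO) and fall back to the CPU if neither is present.
available = ort.get_available_providers()
preferred = [p for p in ("VitisAIExecutionProvider", "OpenVINOExecutionProvider")
             if p in available] + ["CPUExecutionProvider"]

# "person_segmentation.onnx" is a placeholder for any model that outputs a
# foreground mask; the 1x3x256x256 input shape is an assumption.
session = ort.InferenceSession("person_segmentation.onnx", providers=preferred)

frame = np.zeros((1, 3, 256, 256), dtype=np.float32)  # a single video frame
input_name = session.get_inputs()[0].name
mask = session.run(None, {input_name: frame})[0]       # foreground/background mask
print("Ran on:", session.get_providers()[0])
```

The fallback is the point: software vendors can ship one build that quietly uses an NPU when it finds one and the CPU when it doesn't.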
What tasks can NPUs tackle?
While NPUs can tackle many tasks, the big one is generative AI. If you've used a service such as ChatGPT or Midjourney, then all the processing work happens in the cloud. AMD and Intel are betting big on what they're calling "AI PCs", where the workload shifts to the notebook or desktop PC instead. This isn't merely theory: it's perfectly possible to create AI art and ChatGPT-style conversations directly on a PC right now.
You don't even need a computer with an NPU inside to perform local generative AI tasks. For example, you can download Meta's Llama 2 open-source large language model today and run it on your computer. However, Llama 2 will be noticeably quicker and require less power on a PC with an NPU inside.
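As a rough illustration (again, not from the article), the open-source llama-cpp-python bindings can run a quantised Llama 2 model entirely on your own machine. The model file name below is an assumption – point it at whichever GGUF build you have downloaded – and without a hardware-accelerated build the work runs on the CPU rather than an NPU.

```python
from llama_cpp import Llama  # pip install llama-cpp-python

# Load a locally downloaded, quantised Llama 2 chat model (GGUF format).
# The file name is an assumption – use whichever build you have.
llm = Llama(model_path="llama-2-7b-chat.Q4_K_M.gguf", n_ctx=2048)

# Ask a question and print the completion – no cloud service involved.
response = llm("Q: What does an NPU do? A:", max_tokens=128, stop=["Q:"])
print(response["choices"][0]["text"].strip())
```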
Why are NPUs suddenly hitting the news?
NPUs are not new. We've seen them in phones since 2017, when the Huawei Mate 10 used its AI-assisted Kirin 970 processor to perform automatic language translation when users pointed the phone's camera at text. The following year, Apple baked an NPU into its A12 processor, as found in the iPhone XR, XS, and XS Max (and every iPhone since).
In fact, if you bought a premium phone in the past five years, chances are that it includes an NPU, helping with everything from adding bokeh effects and analyzing songs to speech recognition.
The reason everyone is suddenly shouting about NPUs is that both AMD and Intel have released notebook processors with NPUs built in. You'll find them in most of AMD's Ryzen 7040 series and 8040 series chips (look out for "Ryzen AI"), while Intel announced in December that its Core Ultra processors also include NPUs.
There's a certain amount of marketing spin going on here, but both AMD and Intel are now pushing the concept of "AI PCs": that is, computers – laptops in particular – that feature chips with NPUs inside.
What are AI PCs and when will they arrive?
AI PCs look identical to normal PCs. They're laptops, they're desktop machines, they're gigantic workstations. (Or at least they will be, once the chips are mainstream.) The only difference is that the processors inside include neural processors.
You can buy AI PCs right now, but you will have to hunt for them. For instance, the latest versions of Lenovo's Yoga Pro 7 14in and IdeaPad 5 Pro (14in and 16in) all feature AMD's Ryzen 7040 series chips with Ryzen AI inside, as do the most recent Acer Swift X 16 and Asus Vivobook Pro OLED models.
And you will find AMD's most powerful Ryzen 9 7940HS inside many high-end gaming laptops released in the second half of 2023. Laptops with Intel's Core Ultra chips will go on sale imminently, with many set to be announced at CES next week. Neither Intel nor AMD has yet announced desktop processors that feature NPUs, but we expect this to change during 2024.
How can NPUs help your business?
The biggest benefit of moving generative AI from the cloud to a local PC is privacy. Do you really want to upload your company's sensitive data to a nameless server? Microsoft has already announced that a version of its Copilot technology will run on local PCs in 2024, although details remain sparse.
There are latency advantages too, as is often the case when a task can be completed on a local system. Then there's the money factor. Generative AI services often require a monthly subscription; switch to your own processor, rather than one running in a data center, and that cost may well drop to zero.
Of course, all this is meaningless without services to choose from. This is why both AMD and Intel are wooing the likes of Adobe and DaVinci to offload tasks to NPUs rather than CPUs and graphics cards.
However, we have yet to hear of anything radical and new that can only be done on an NPU. At the moment, the chief benefits center around reducing the load on the main CPU. Perhaps 2024 will see a killer app that requires AI PCs, but we expect this to be – forgive the cliché – more about evolution than revolution.
What are the limitations of NPUs?
Without wishing to end on a downbeat note, it's important that business leaders are realistic about what can be achieved on standalone devices. For the moment, large language models running on AI PCs will feature fewer parameters than the LLMs we see in the cloud, because local machines simply don't have the processing heft.
The biggest hurdle, however, is availability. Until NPUs become ubiquitous in laptops and desktop PCs, we don't expect to see an explosion in software and services built to take advantage of them.
Tim Danton is editor-in-chief of PC Pro, the UK's biggest selling IT monthly magazine. He specialises in reviews of laptops, desktop PCs and monitors, and is also author of a book called The Computers That Made Britain.
You can contact Tim directly at editor@pcpro.co.uk.