How edge AI can boost the bottom-line – and how to get ready

A smart edge AI computing strategy can help businesses more efficiently process AI workloads and eliminate processing delays – and selecting the right devices is a critical step


As small to midsize businesses plot strategies to effectively employ artificial intelligence, they may quickly determine that investing in an edge AI computing strategy is just as important as choosing the best cloud-based AI offerings. The right choice can future-proof your AI strategy, reduce operational costs, and provide the kind of security that is increasingly important in the AI age.

Edge AI enables the execution of powerful AI workloads locally on a user's own laptop, while delegating some tasks to the cloud. “It’s a highly efficient way to put AI to work, because it allows models to operate swiftly and autonomously, delivering immediate insights even without reliable cloud connectivity,” says Greg Furlong, associate director of strategy and innovation at Datacom, an Australian IT solution provider.

A new generation of AI PCs makes edge AI possible. With powerful neural processing units (NPUs) that offload AI tasks from CPUs and GPUs, AI PCs can handle numerous AI applications locally. Such tasks include real-time transcription and translation, document summarization, AI-assisted image and video processing, security, and more. Local processing makes employees more productive while saving bandwidth and energy, in part thanks to the power-saving design of AI PCs.

Gartner expects AI PCs to account for 31% of the global PC market in 2025, with 77 million units shipped, and a 55% share in 2026. “By 2029, AI PCs will become the norm,” Gartner said.

Benefits of edge AI

An ecosystem of solutions to enable edge AI is also emerging. Microsoft Foundry Local, for example, is a version of Azure AI Foundry that enables execution of large language models (LLMs) on Windows devices. Those devices may include Microsoft Surface laptops built on an ARM-based architecture and optimized for Windows 11.

That means companies can deploy AI applications that perform numerous functions locally and communicate with cloud-based Foundry resources as needed.
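The local-first pattern described above can be sketched in a few lines. This is a hypothetical illustration, not Foundry Local's actual API: the function names (`run_local_model`, `run_cloud_model`) and the size-based routing rule are placeholders standing in for an on-device runtime and a cloud endpoint.

```python
# Hypothetical sketch of local-first inference: try the on-device model,
# delegate to the cloud only when the request exceeds local capability or
# the local runtime is unavailable. All names here are illustrative.

LOCAL_MAX_CHARS = 2048  # assumed capability limit of the on-device model

def run_local_model(prompt: str) -> str:
    # Stand-in for an on-device LLM call (e.g. via an NPU-accelerated runtime).
    return f"[local] summary of {len(prompt.split())} words"

def run_cloud_model(prompt: str) -> str:
    # Stand-in for a call to a cloud-hosted model for heavier workloads.
    return f"[cloud] summary of {len(prompt.split())} words"

def infer(prompt: str) -> str:
    """Route small requests to the device; delegate large ones to the cloud."""
    if len(prompt) <= LOCAL_MAX_CHARS:
        try:
            return run_local_model(prompt)
        except RuntimeError:
            pass  # local runtime unavailable; fall through to the cloud
    return run_cloud_model(prompt)

print(infer("Summarise this short contract clause."))  # handled on-device
print(infer("x " * 5000))                              # delegated to the cloud
```

In a real deployment the routing decision would weigh data sensitivity and connectivity as well as request size, but the shape — local by default, cloud as needed — is the same.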

Such a strategy delivers several benefits, including:

  • Lower latency: Making calls to the cloud for every step required in an AI (or any) application introduces latency. AI PCs make it possible to perform numerous tasks locally, in milliseconds. Besides increasing productivity and lowering the costs associated with wide-area bandwidth, such performance is critical for time-sensitive workloads, such as financial transactions and safety-critical systems.
  • Increased security and privacy: Relying solely on cloud-based AI offerings means sending potentially sensitive data outside of your organization to a cloud provider. By enabling more local processing, edge AI helps alleviate that concern. "When your most sensitive data doesn't have to leave your device, that greatly reduces the chances of it being stolen or used inappropriately," Furlong says.
  • Addressing shadow AI: Edge AI also helps address the problem of "shadow AI," where employees use AI applications without IT's knowledge, which can have unintended consequences. "Users may be unknowingly uploading or exposing sensitive data to these AI platforms [or] training AI models with corporate information," says David Stafford-Gaffney, Datacom's associate director of cybersecurity.
  • Improved productivity and resiliency: The ability to run AI workloads such as inference locally means some key functions can continue to operate even when an internet connection is unavailable. That's a boon for road warriors who don't always have reliable connectivity, as well as for weathering outages in internet connectivity.
  • Lower costs: In addition to reducing bandwidth requirements, the more AI processing companies can perform locally, the more they save on any usage-based charges for cloud-based AI services.
  • Energy efficiency: Devices with NPUs are optimized not only for AI inference but also for energy efficiency. Microsoft Surface laptops, for example, are based on an ARM architecture that improves efficiency vs. relying on CPUs and GPUs. They even deliver day-long battery life while using AI capabilities.

A sample of edge AI applications

The greatest benefit, of course, comes from the applications that edge AI enables.

Datacom’s research found that 50% of surveyed employees use AI features at work, and 91% of employers say they encourage employees to use AI for regular work tasks. The business case for using AI is clear, with 74% of AI users citing time-saving benefits and 56% noting increased productivity.

Examples include AI assistants that can perform all sorts of functions, including retrieval-augmented generation (RAG), which gives AI models access to a company’s own data or relevant third-party data sources. Think of a ChatGPT-style assistant that draws on your own customer database rather than the open internet.
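The RAG flow boils down to two steps: retrieve the company data relevant to a question, then hand that data to the model alongside the question. The sketch below is a toy, assuming a keyword-overlap retriever in place of the vector embeddings a production system would use, and a placeholder where the LLM call would go; the sample data is invented.

```python
# Minimal retrieval-augmented generation (RAG) sketch. Keyword overlap
# stands in for embedding-based retrieval, and a placeholder stands in
# for the LLM, purely to show the retrieve-then-prompt flow.

CUSTOMER_NOTES = [
    "Acme Ltd renewed their support contract in March.",
    "Globex reported a billing issue with invoice #1042.",
    "Initech upgraded to the premium tier last quarter.",
]

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Rank documents by word overlap with the query (toy retriever)."""
    q = set(query.lower().split())
    return sorted(docs, key=lambda d: -len(q & set(d.lower().split())))[:k]

def answer(query: str) -> str:
    context = "\n".join(retrieve(query, CUSTOMER_NOTES))
    prompt = f"Context:\n{context}\n\nQuestion: {query}"
    # A local LLM would consume `prompt` here; we return it for inspection.
    return prompt

print(answer("What was the billing issue for Globex?"))
```

The key point for edge AI is that both the retrieval index and the model can live on the device, so the customer data never has to leave it.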

Edge AI enables document summarization and audio/video transcription applications to be performed locally, on user devices. Microsoft Copilot, for example, can summarize long business documents, such as contracts and email threads. It can also generate meeting transcripts and summaries, complete with key highlights and action items.

AI applications like Copilot can also assist users in drafting personalized emails, reports, proposals, and presentations.

For healthcare providers, AI PCs can perform imaging applications, such as finding anomalies in MRI and CT scans. That enables clinicians to more rapidly treat clear-cut cases while sending scans that require more intensive analysis to the cloud.

In industrial environments, edge AI can play a key role in predictive maintenance applications, running real-time anomaly detection and predictive algorithms locally while syncing as needed with their cloud-based counterparts.
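A minimal sketch of that edge-side pattern: flag sensor readings that deviate sharply from a rolling baseline, locally and in real time, so only flagged windows need to be synced to the cloud. The window size, threshold, and sample data are assumptions for illustration, not tuned values.

```python
# Illustrative edge-side anomaly detector for predictive maintenance:
# flag readings more than `threshold` standard deviations from the
# rolling mean of the previous `window` values.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=10, threshold=3.0):
    """Yield (index, value) for readings that deviate sharply from
    the rolling baseline of the preceding `window` values."""
    history = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                yield i, value
        history.append(value)

vibration = [1.0, 1.1, 0.9, 1.0, 1.2, 1.0, 0.9, 1.1, 1.0, 1.1, 9.5, 1.0]
print(list(detect_anomalies(vibration)))  # the 9.5 spike is flagged
```

A production system would likely use a trained model rather than a z-score, but the division of labor is the same: cheap detection runs continuously at the edge, and only anomalies travel to the cloud for deeper analysis.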

Security from chip to cloud

Security is another important application. AI PCs, including Microsoft Surface, offer security features built into hardware to defend against physical tampering, validate firmware and drivers, and more. Hardware-based encryption and secure identity features protect data even if the device is stolen or compromised by attackers.

Such features complement the security features offered by Microsoft Defender, such as the antivirus capabilities built into Windows 11 and various Microsoft Defender cloud-based offerings, providing security “from chip to cloud,” as Microsoft says.

As SMBs look to refresh PCs in coming months and years, they would do well to consider AI PCs. As Datacom’s Furlong says, “If you believe, as I do, that we’ll be running AI at the edge extensively in the next three to four years, it makes sense to future-proof your hardware now.”

Learn how Datacom can help you map out your AI future by visiting the Datacom partner page.

ITPro

ITPro is a global business technology website providing the latest news, analysis, and business insight for IT decision-makers. Whether it's cyber security, cloud computing, IT infrastructure, or business strategy, we aim to equip leaders with the data they need to make informed IT investments.
