Dell Technologies World 2025: All in on AI

From the cloud to the edge, enterprise architecture for AI must be carefully considered

(Image credit: Jane McCallion/Future)

Dell Technologies World has just come to a close in Las Vegas, and there is no doubt that the company is, in its own words, "all in on AI". From laptops to services, data center infrastructure, and partnerships, everything is being led by AI.

What this means in practice for IT decision makers and business leaders can sometimes be hard to divine, however. This week, Jane sits down with John Roese, chief technology officer and chief AI officer at Dell Technologies, to dig into the practical effects of this strategy, from how businesses will think about endpoints and devices to the potential end of HCI (hyperconverged infrastructure).

Highlights

“[B]ecause we had a cloud infrastructure, we assumed that, well, maybe we'd use that for AI. And what we learned over the last two years is they are so different. They are so architecturally different, they have such a different relationship with data, that the idea that five years ago, with no knowledge of AI, the decision you made as an enterprise to describe your IT infrastructure would be a perfect fit for a technology that emerged five years later that will transform the world, is insane.”

“You're going to need edge compute nodes and we've been doing edge for a long time, but in the last year and a half, I will tell you the dominant use of edge compute is AI now. There’s very few people running other stuff on edge compute, you're running computer vision models, interpretation models. It's not necessarily generative AI, but you're doing AI there in factories and hospitals and transportation networks.”

“Training of AI models requires, actually, surprisingly less information than you'd think. But it requires the most incredible amount of compute you could imagine. Inference, on the other hand, funnily enough if you're doing RAG, is actually a lot lighter on compute but the transactions into the vector databases and the data systems are quite high. And so you end up with this kind of asymmetric behavior all over the place.”


Jane McCallion
Managing Editor

Jane McCallion is Managing Editor of ITPro and ChannelPro, specializing in data centers, enterprise IT infrastructure, and cybersecurity. Before becoming Managing Editor, she held the role of Deputy Editor and, prior to that, Features Editor, managing a pool of freelance and internal writers while continuing to specialize in enterprise IT infrastructure and business strategy.

Prior to joining ITPro, Jane was a freelance business journalist writing as both Jane McCallion and Jane Bordenave for titles such as European CEO, World Finance, and Business Excellence Magazine.