AWS targets easier AI adoption at AWS Summit London

Attendees walk through an expo hall during AWS re:Invent 2022.
(Image credit: Getty Images)

AWS is shoring up its position in the AI landscape by setting its sights on making AI adoption easier for enterprises, acting as both a supplier and a developer of enterprise AI tools.

The keynote talks at AWS Summit London made this mission clear, with executives from the company taking to the stage to tout AWS’ impressive AI-ready infrastructure.

At just an hour long, the keynote address was short but sweet, and the firm illuminated some of the ways it's speeding up the process of AI adoption.

“All these tools are about accelerating adoption of technology,” Tanuja Randery, VP and managing director of EMEA at AWS, tells the event’s crowded auditorium.

How AWS is preparing itself to accelerate AI adoption can be understood through two of its key offerings: the foundational support of its AI model platform, Amazon Bedrock, and its enterprise-grade chatbot, Amazon Q.

While the firm positions itself as the industry middleman with Bedrock, granting customers access to big-name models, it also displays its independence by showing off a powerful homegrown solution.

While there was little in the way of a fresh release at AWS Summit London, the audience certainly left with a better sense of the AWS products already making headway in the market.

With its executives speaking to a packed auditorium – so packed that many attendees weren’t even able to make it inside for the keynote – AWS makes a subtle but effective show of strength, leveraging its immense infrastructure to show how readily it can capitalize on the tangible business interest in generative AI.

AWS wants to be the industry middleman 

As the largest public cloud provider on the planet and a firm well-equipped to offer its customers powerful solutions, AWS could rely on simply reminding those in attendance of its own cloud prestige.

“We were a pioneer of the cloud, turning technologies like networking and storage, databases, and computing into programmable resources,” Francessca Vasquez, VP of professional services and the generative AI Innovation Center, reminds the audience. 

It’s in this vein that Vasquez begins to talk about Amazon Bedrock, AWS’ managed service platform for AI which comes equipped with foundation models from the likes of Mistral and Meta.

The attraction of Bedrock is obvious. It makes the process of using generative AI far simpler for enterprises, as companies gain access to a huge range of AI models to choose from, depending on the use case.


“It is the easiest way to build and scale generative AI applications with large language models (LLMs) and other foundation models,” Vasquez says.

“Customers in virtually every single industry are using Amazon Bedrock to reinvent their user experiences, products, and processes,” she adds.

AWS has put a lot of effort into making Amazon Bedrock as appealing a package as possible. Since the platform was first unveiled, AWS has emphasized customer choice and stressed the importance of providing a wide range of third-party LLMs, including several high-profile open LLMs.

Amazon Bedrock includes models from the likes of Anthropic, in which AWS has invested $4 billion, including its new ChatGPT-challenging flagship Claude 3. Just before AWS Summit London, the firm announced that Meta’s Llama 3, one of the strongest open models on the market, had been added to Bedrock.

“We at AWS believe no one model will rule them all - we're still in the early days of generative AI, and these models will continue to evolve at unprecedented speed,” Vasquez says.

“That's why customers need the flexibility to use different models at different times,” she says.

Here, flexibility gives AWS the edge, allowing both the firm and its customers to hedge their bets in the generative AI race. Neither AWS nor the enterprises using it need worry about committing to any single AI startup.

AWS has also tapped into one of the fundamental pain points of generative AI in the enterprise, namely that leaders often don’t know how to implement it or which AI tools would work best for them. That’s why Bedrock is so attractive: it offers a solution to that anxiety.

“In 2023, I saw a lot of companies really grappling with generative AI,” Vasquez says. “They could see the potential and began experimenting. But it's hard. It's hard to move experimental proof of concepts into production.”

“We can build services so that you can make this leap,” she adds. “We are making it easy for you to be able to build and scale generative AI.”

AWS shores up its own AI offerings 

It’s important to remember that AWS was also very much showcasing its own generative AI chatbot, Amazon Q, which it celebrated in tandem with the offerings of Bedrock. 

AWS’ approach, Vasquez explains, is to look at the three key layers of the generative AI stack, which comprises the bottom layer of training infrastructure, the middle layer of Amazon Bedrock, and a top application layer.

“At the top layer, we build applications by leveraging foundation models and LLMs so that you can take advantage of generative AI quickly and without any specialized knowledge,” Vasquez says.

This absence of any need for “specialized knowledge” is key, as the firm commits itself to breaking down this barrier to AI adoption. AWS’ customers want to reap the rewards of generative AI “quickly and easily”, Vasquez says, but “without the need for any machine learning (ML) expertise”.

Offering aid in “every single step of the development lifecycle”, Amazon Q is the firm’s answer to enterprise AI. It combines data analysis and natural language input in a single source of information that suits any level of AI expertise.

After illuminating some of the tool's possible use cases, Vasquez is also eager to mention just how well Amazon Q works with Bedrock, which allows generative AI to be added to any application through its unified application programming interface (API).
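As a hypothetical sketch of what that unified API looks like in practice – the model IDs and the boto3 `converse` call below are illustrative, and the live call requires AWS credentials – the same request shape can target models from different providers, with only the model ID changing:

```python
# Illustrative sketch of Bedrock's provider-agnostic Converse API.
# The request structure stays identical across model vendors; only
# the model ID changes. The actual network call (commented out) needs
# configured AWS credentials and is an assumption of this sketch.

def build_converse_request(model_id: str, prompt: str) -> dict:
    """Build a Converse-style request that any Bedrock model can accept."""
    return {
        "modelId": model_id,
        "messages": [
            {"role": "user", "content": [{"text": prompt}]},
        ],
        "inferenceConfig": {"maxTokens": 256, "temperature": 0.5},
    }

prompt = "Summarise our Q1 sales figures."

# One request shape, two different providers:
claude_req = build_converse_request(
    "anthropic.claude-3-sonnet-20240229-v1:0", prompt
)
llama_req = build_converse_request(
    "meta.llama3-70b-instruct-v1:0", prompt
)

# With credentials in place, the call itself would look like:
# import boto3
# client = boto3.client("bedrock-runtime", region_name="us-east-1")
# response = client.converse(**claude_req)

# Everything except the model ID is shared between the two requests:
print(claude_req["messages"] == llama_req["messages"])  # → True
```

Swapping models is then a one-line change, which is the flexibility Vasquez describes: applications aren’t rewritten when a stronger model arrives on the platform.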

In the first instance, AWS is making AI more accessible through Bedrock. In the second, it is addressing accessibility at the application level through Amazon Q.

How successful this steady approach to AI proves to be will rest entirely with its customers. Landing somewhere between the focused, OpenAI-reliant approach of Microsoft and the ‘AI everywhere’ approach of Google Cloud, AWS has fallen back on its established cloud dominance and wide model garden.

George Fitzmaurice
Staff Writer

George Fitzmaurice is a staff writer at ITPro, ChannelPro, and CloudPro, with a particular interest in AI regulation, data legislation, and market development. After graduating from the University of Oxford with a degree in English Language and Literature, he undertook an internship at the New Statesman before starting at ITPro. Outside of the office, George is both an aspiring musician and an avid reader.