Amazon Q's foundations separate it from ‘monolithic’ competitor AI tools


With ChatGPT having celebrated its first birthday this week, industry big hitters such as AWS have proven that AI assistants are coming of age and primed to supercharge operational efficiency across a host of business functions. 

AWS announced the launch of its ‘Amazon Q’ enterprise-grade chatbot earlier this week during its annual re:Invent conference.

The launch of the AI assistant marks the cloud giant’s first major foray into the enterprise AI chatbot space, and could set the firm up for a battle with Microsoft-backed OpenAI and Google in the coming months.

Similar to Microsoft Copilot or Google Bard, the AI assistant will provide natural language-based responses to user prompts.

In the day-two keynote at re:Invent, AWS VP for data and AI, Swami Sivasubramanian, showed attendees practical examples of the chatbot in action. 

This included the ability to support code generation, automated summarization capabilities, and business intelligence insights.

Amazon Q already shows signs of maturity

The cloud giant has been keen to impress that this is a mature, enterprise-ready AI assistant capable of providing support across a wide array of business functions, from HR and IT operations to software development and sales. 

IDC analyst Neil Ward-Dutton told ITPro the launch of Amazon Q represents both a foray into uncharted waters for AWS, as well as an opportunity for the hyperscaler to appeal to new prospective customers.

“I think this is interesting because it takes AWS into a completely different territory,” he said. “This is not their core heritage. Their audience is builders, developers, and IT pros.”

“What they highlighted with Q is that it’s about being almost like an architect adviser, so this is clearly going to make sense to devs and architects and give them that advisory capacity.

“They also say this is for people in HR, or people in customer services or finance or procurement. That kind of internal intelligent assistant thing is completely aimed at people they’ve not really focused on at all.”

He added that AWS has only once before made a comparable move into new territory aimed at new user demographics: the launch of its Chime video conferencing platform.

Given the new ground AWS is treading here, Ward-Dutton believes that AWS will need to actively convince prospective customers to take a chance on the platform compared to industry counterparts who have established reputations in providing such tools.

However, the firm’s long-term position as the leading hyperscaler stands it in good stead. 

“This really takes them into a completely different territory,” he said. “If they want to be successful, they’ve got to try and convince people they don’t usually engage with to take them seriously.”

“I think it's going to take time but I'm not going to count it out because AWS has got a track record of being very successful.”

Building on Bedrock

Part of the potential allure here could be the foundations upon which Amazon Q has been built, Ward-Dutton added.  

The chatbot is underpinned by Amazon Bedrock, the firm’s managed generative AI service, which became generally available in September. Bedrock offers customers a choice of multiple in-house and third-party LLMs, such as Anthropic’s Claude 2.1 or Meta’s open source Llama 2 model.

Leveraging the Bedrock framework positions Amazon Q as a unique offering within the broader AI assistant space at present, Ward-Dutton believes. This is because customers adopting the tool will have a variety of model choices that cater to unique business needs.
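In practice, that model choice surfaces in Bedrock's API: each provider's model expects its own request payload, and callers select a model by ID per invocation. As a rough illustration only (the model IDs and payload fields below reflect Bedrock's documented formats at the time of writing, and `build_request` is a hypothetical helper, not part of any AWS SDK), a thin adapter might map a common prompt onto per-model request bodies:

```python
import json

# Hypothetical adapter: maps a plain prompt to each model's expected body.
# Payload shapes are assumptions based on Bedrock's published request formats;
# verify against current AWS documentation before use.
MODEL_BODIES = {
    # Anthropic Claude 2.1 (text completion format)
    "anthropic.claude-v2:1": lambda prompt: {
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": 512,
    },
    # Meta Llama 2 chat
    "meta.llama2-13b-chat-v1": lambda prompt: {
        "prompt": prompt,
        "max_gen_len": 512,
    },
}

def build_request(model_id: str, prompt: str) -> dict:
    """Build keyword arguments for a bedrock-runtime invoke_model call."""
    body = MODEL_BODIES[model_id](prompt)
    return {
        "modelId": model_id,
        "contentType": "application/json",
        "body": json.dumps(body),
    }

# With AWS credentials configured, the actual call would look like:
#   import boto3
#   client = boto3.client("bedrock-runtime")
#   response = client.invoke_model(**build_request("anthropic.claude-v2:1", "Hello"))
```

Swapping models then becomes a one-line change to the model ID, which is the flexibility Ward-Dutton contrasts with single-model competitors.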

“How AWS is different here is that its competitors, or some of its competitors, are focused very much on using the one technology,” he explained. “Microsoft is the key example, as they built everything with their partner, OpenAI.”

“But with Bedrock, AWS is explicitly and quite deliberately saying ‘we’re not like that, what we’re going to do is bring lots of kinds of models’. That’s where what AWS is doing is quite different. With Q built on top of that, it leverages multiple different models.



“It’s not a monolithic thing under the covers.”

The flexibility of choice afforded by Bedrock has been one of the most frequently highlighted benefits of the platform, both by customers speaking to ITPro during re:Invent and by figures at AWS itself.

Atul Deo, general manager for Amazon Bedrock, told ITPro that providing this level of flexibility makes sense given the nascent stage of AI development among customers.

Many are in the formative stages of their AI journey, meaning they will likely want to tinker and experiment with different models. 

Integration of certain models within Amazon Q will enable customers using the assistant to fine-tune and optimize it based on their unique circumstances.

“We want to give customers access to the most capable models for various combinations of the requirements,” he said.

“But it is also about a lot of tools that they’ll require in conjunction with these models to deliver or build compelling production apps.”

Ross Kelly
News and Analysis Editor

Ross Kelly is ITPro's News & Analysis Editor, responsible for leading the brand's news output and in-depth reporting on the latest stories from across the business technology landscape. Ross was previously a Staff Writer, during which time he developed a keen interest in cyber security, business leadership, and emerging technologies.

He graduated from Edinburgh Napier University in 2016 with a BA (Hons) in Journalism, and joined ITPro in 2022 after four years working in technology conference research.

For news pitches, you can contact Ross on Twitter and LinkedIn.