AWS Bedrock distances firm from Microsoft, Google in generative AI race

AWS is making its largest foray into the generative AI space yet with the launch of Amazon Bedrock, a managed service offering customers the “easiest way to build and scale enterprise-ready generative AI applications”.

The hyperscaler unveiled Bedrock on Thursday alongside the launch of two new ‘Amazon Titan’ foundation models (FMs) as it seeks to provide developers with access to “some of the most cutting-edge FMs available today”.

Bedrock will grant customers access to a range of foundation models from third parties including AI21 Labs, Anthropic, and Stability AI. 

Among these is the Jurassic-2 family of multilingual LLMs from AI21, which generate text in Spanish, French, Portuguese, Italian, and Dutch. Alongside this offering is Stability AI’s range of text-to-image foundation models, which includes Stable Diffusion. 

The two Amazon Titan large language models (LLMs) are AWS' own in-house FMs.

The first of these, AWS revealed, is a generative LLM for tasks such as summarization, text generation - to create a blog post, for example - classification, open-ended Q&A, and information extraction. 

The second is an embeddings LLM that will translate text inputs into numerical representations that “contain the semantic meaning of the text”, AWS added.
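For a rough sense of how developers might consume these models, the sketch below shows what invoking a Titan-style text model and an embeddings model through Bedrock's API could look like using Python and the AWS SDK (boto3). The client name, the model IDs, and the request and response formats are assumptions made for illustration rather than details drawn from AWS' announcement, and the preview API may differ.

```python
# Illustrative sketch only: the "bedrock-runtime" client, model IDs, and
# request/response formats below are assumptions, not details confirmed
# in AWS' announcement.
import json

import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")


def generate_text(prompt: str) -> str:
    """Ask a Titan-style text model for a completion (e.g. a blog post draft)."""
    response = client.invoke_model(
        modelId="amazon.titan-text-express-v1",  # assumed model ID
        contentType="application/json",
        accept="application/json",
        body=json.dumps({
            "inputText": prompt,
            "textGenerationConfig": {"maxTokenCount": 256, "temperature": 0.7},
        }),
    )
    payload = json.loads(response["body"].read())
    return payload["results"][0]["outputText"]


def embed_text(text: str) -> list[float]:
    """Turn text into the numerical representation produced by the embeddings model."""
    response = client.invoke_model(
        modelId="amazon.titan-embed-text-v1",  # assumed model ID
        contentType="application/json",
        accept="application/json",
        body=json.dumps({"inputText": text}),
    )
    payload = json.loads(response["body"].read())
    return payload["embedding"]


if __name__ == "__main__":
    print(generate_text("Write a two-sentence introduction to managed generative AI services."))
    print(len(embed_text("Amazon Bedrock is a managed generative AI service.")))
```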

“Today we are excited to announce Amazon Bedrock, a new service that makes FMs from AI21 Labs, Anthropic, Stability AI, and Amazon accessible via an API,” said Swami Sivasubramanian, VP for machine learning at AWS.

“Bedrock is the easiest way for customers to build and scale generative AI-based applications using FMs, democratizing access for all builders.” 

AWS enters the generative AI ring

The launch of Amazon Bedrock marks AWS' most aggressive move in the generative AI space so far, and takes a different approach from those of competitors such as Google and Microsoft.

Microsoft has fostered closer ties with OpenAI in recent months and accelerated the integration of platforms such as ChatGPT within core product offerings. 

In January, the firm launched its Azure OpenAI Service, which provides cloud customers with access to OpenAI models.

The move was hailed as a key differentiator for the tech giant in gaining ground on AWS in the broader cloud market.

Similarly, Google has made significant strides in developing and rolling out its own internal generative AI system, Bard.

With Bedrock, AWS looks primed to take a different approach from competitors in the space, capitalizing on existing relationships with third-party model providers to augment its generative AI capabilities and product offering.

In the long term, this could provide a more developer-friendly approach to generative AI and diversify AWS' offering to prospective customers.

Competition in the generative AI market continues to heat up, and AWS’ quiet approach in this space has been noted in recent months amid the flurry of announcements from Google and Microsoft.

Sivasubramanian suggested that a key differentiator for AWS moving forward will be the ease of access and use, and the sheer diversity of models available to developers. 

“Bedrock makes the power of FMs accessible to companies of all sizes so that they can accelerate the use of ML across their organizations and build their own generative AI applications because it will be easy for all developers,” Sivasubramanian said. 

He added that Bedrock will be a “massive step” toward democratizing FMs and opening up access to a broader range of models for developers within its own ecosystem. 

Ross Kelly
News and Analysis Editor
