AWS just launched an AI Center of Excellence to accelerate partner AI adoption

(Image: an illuminated AWS sign hanging from the ceiling of a conference center. Credit: Getty Images)

AWS has unveiled a new generative AI Center of Excellence (CoE) aimed at widening the availability of AI tools and resources for partner organizations. 

The center, which is open to members of the AWS Partner Network, will offer access to an “AI playbook” that includes educational and training resources to help drive adoption and unlock value from the technology. 

The CoE will host both technical and non-technical content that AWS said “showcases the latest developments and insights around generative AI”. 

Resources include industry-focused training, thought leadership insights on the use of the technology, and use-case-specific best practices. 

“The Generative AI Center of Excellence for AWS Partners includes guided, optimized learning paths, along with curated and interactive curricula,” AWS said in a statement. 

“The CoE will be complemented by forums that facilitate AWS and partner knowledge sharing and development of joint thought leadership, collectively advancing the thinking and applications of generative AI for our joint customers.”

As part of the move, the CoE will include contributions from a host of generative AI solutions providers, including Anthropic, Cohere, and Nvidia. 

The center will also provide support for partners through collaboration with consultancies including McKinsey and Boston Consulting Group (BCG). 

Julia Chen, vice president, partner core at AWS, said the launch of the CoE comes in direct response to heightened interest in the use of generative AI tools among AWS partners. 

However, the rapid acceleration of generative AI in recent months has presented challenges with regard to adoption, with many businesses struggling to integrate tools at pace.

“We’ve all seen the far-reaching industry and functional applications of generative artificial intelligence that has captured widespread attention and an urgency for customers to transform their businesses,” she said. 

“Generative AI’s nascency and pace of innovation means there is an increasing demand for specialized capabilities at even the most mature enterprises. The need to address specialized requirements has prevented many businesses from quickly implementing and realizing business outcomes.”

AWS draws on partner AI expertise

Through the CoE, AWS partners will contribute use cases and tangible examples of generative AI being deployed effectively, the company said. 

This will include training and insights on the use of third-party foundation models, applications, and developer tools. AWS said the scheme will help develop a more detailed understanding of how to maximize the use of foundation models. 

Educational resources from Anthropic and Cohere will cover the core features, capabilities, and best practices for leveraging foundation models (FMs) through the Amazon Bedrock service.  

The Bedrock service provides AWS customers with access to a host of third-party FMs, as well as Amazon’s own models, such as Titan. 
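
To make that workflow concrete, the sketch below shows how a partner or customer might invoke a third-party FM on Bedrock from Python using boto3. It is a minimal illustration rather than guidance from AWS or this article: the region, the model ID (anthropic.claude-v2), and the request payload follow the Claude text-completion format on Bedrock, but exact identifiers and schemas vary by model and region and should be checked against current documentation.

```python
import json
import boto3

# Bedrock inference goes through the "bedrock-runtime" client; the region is an assumption.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Claude v2's text-completion format expects a Human/Assistant prompt wrapper.
prompt = (
    "\n\nHuman: In two sentences, what value can a generative AI "
    "Center of Excellence offer partner organizations?\n\nAssistant:"
)

response = client.invoke_model(
    modelId="anthropic.claude-v2",      # a third-party FM exposed via Bedrock (illustrative choice)
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "prompt": prompt,
        "max_tokens_to_sample": 300,    # cap on the length of the generated completion
    }),
)

# The response body is a streaming blob of JSON; Claude v2 returns the text in "completion".
print(json.loads(response["body"].read())["completion"])
```

Swapping in a different provider, such as a Cohere or Titan model, means changing the model ID and matching that model’s request and response schema, which is the kind of model-specific detail the CoE’s training materials are intended to cover.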


Having launched in April 2023, the service has proved highly popular among customers and partners. In July, AWS’ Swami Sivasubramanian revealed Bedrock had attracted “thousands of customers” since its launch.  

“We are truly at an exciting inflection point in the widespread adoption of machine learning. We are still in the early days, with a need for continuous investment in and development of generative AI talent and organizational capabilities,” said Vasi Philomin, VP of Generative AI at AWS. 

“Through the CoE, we are deepening the integration between AWS and partner teams by bringing our internal-facing content directly to partners in a timely manner, which is critical in the fast-paced, highly evolving domain of generative AI.”

The launch of the CoE follows speculation this week that Amazon is working on a new two-trillion-parameter large language model, dubbed ‘Olympus’, as the company looks to mount a direct challenge to OpenAI’s GPT-4 model.  

AWS has ramped up investment in generative AI services across 2023. The launch of the CoE could complement the surge of customers flocking to its Bedrock service and provide vital training resources for organizations still in the embryonic stages of their AI adoption journey. 

Ross Kelly
News and Analysis Editor

Ross Kelly is ITPro's News & Analysis Editor, responsible for leading the brand's news output and in-depth reporting on the latest stories from across the business technology landscape. Ross was previously a Staff Writer, during which time he developed a keen interest in cyber security, business leadership, and emerging technologies.

He graduated from Edinburgh Napier University in 2016 with a BA (Hons) in Journalism, and joined ITPro in 2022 after four years working in technology conference research.

For news pitches, you can contact Ross at ross.kelly@futurenet.com, or on Twitter and LinkedIn.