AWS has announced the general availability of its AI platform Amazon Bedrock, allowing businesses to utilize a wide range of foundation models, alongside promises of more powerful models and code customization in the coming weeks.
Amazon Bedrock is AWS’ platform for scaling generative AI. Customers can pick and choose the models best suited to their use cases, whether that means using a single model for a specific task or combining several models in tandem.
Customers can choose from options including AI21 Labs’ complex text processing model Jurassic-2, Stability AI’s image generation model Stable Diffusion, Anthropic’s Claude chatbot, and Amazon’s own family of Titan foundation models.
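In practice, this pick-and-choose model works through a single Bedrock runtime API: each model is addressed by an ID, and only the request body changes per provider. The sketch below is illustrative, assuming AWS credentials and Bedrock model access are configured; the Claude model ID and Human/Assistant prompt format follow AWS's published examples, but check the current documentation before relying on them.

```python
import json


def build_claude_request(prompt: str, max_tokens: int = 300) -> str:
    # Anthropic's Claude on Bedrock expects a Human/Assistant prompt format;
    # other providers (AI21, Stability AI, Titan) use different body schemas.
    return json.dumps({
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": max_tokens,
    })


def invoke(model_id: str, body: str) -> dict:
    # Requires AWS credentials and Bedrock access in your account.
    import boto3  # imported lazily so the sketch loads without AWS set up

    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(modelId=model_id, body=body)
    return json.loads(response["body"].read())


if __name__ == "__main__":
    result = invoke(
        "anthropic.claude-v2",
        build_claude_request("Summarize Amazon Bedrock in one sentence."),
    )
    print(result.get("completion"))
```

Swapping models is then a one-line change of the `modelId` argument, which is the core of Bedrock's pitch.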
AWS has committed to widening the scope of its AI offerings through trusted partners and announced that in the coming weeks, Meta’s powerful open LLM Llama 2 will also be added to Bedrock.
AWS says it will be the first cloud provider to offer Llama 2 via an API, and that it has optimized the model for its infrastructure to reduce response times.
Enterprise customers that use AWS CodeWhisperer, the firm’s generative AI pair programmer, can also expect improvements in the coming weeks. AWS has announced a new enterprise tier for the service, which will allow customers to securely link it to their private code base.
This will let CodeWhisperer draw on a customer’s own code to produce more relevant suggestions, rather than relying on generalized suggestions based only on its training data. It could be a key differentiator against competitors such as GitHub Copilot X and Meta’s free Code Llama.
Alongside the Bedrock platform, AWS has made Amazon Titan Embeddings generally available. This is one of three Amazon Titan foundation models (FMs), AWS’ own pre-trained models, which can collectively perform tasks such as text generation, summarization, and text retrieval.
Titan Embeddings is specifically designed to translate text into ‘embeddings’, a machine learning (ML) term for a numerical vector assigned to a specific data value. Through this process, businesses can securely and efficiently connect FMs to external information such as proprietary data to inform model output.
Titan Embeddings can translate text from over 25 languages in chunks of up to 8,000 tokens at a time, enough to process large swathes of text or documents. The other two Titan FMs, Titan Text Express and Titan Text Lite, remain in preview.
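The retrieval use case described above can be sketched in a few lines: text is converted to a vector, and documents are ranked by how close their vectors sit to a query's vector. This is a minimal illustration assuming AWS credentials are configured; the Titan model ID and request/response fields follow AWS's published naming, but verify them against current documentation.

```python
import json
import math


def embed(text: str) -> list[float]:
    # Calls the Titan Embeddings model through the Bedrock runtime API
    # and returns the numerical vector ("embedding") for the input text.
    import boto3  # imported lazily so the sketch loads without AWS set up

    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(
        modelId="amazon.titan-embed-text-v1",
        body=json.dumps({"inputText": text}),
    )
    return json.loads(response["body"].read())["embedding"]


def cosine_similarity(a: list[float], b: list[float]) -> float:
    # Retrieval systems rank documents by how close their embedding vectors
    # are to the query's vector; cosine similarity is the usual metric.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm
```

Connecting an FM to proprietary data then amounts to embedding the documents once, embedding each incoming query, and passing the highest-scoring documents to the model as context.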
“Over the last year, the proliferation of data, access to scalable compute, and advancements in machine learning have led to a surge of interest in generative AI, sparking new ideas that could transform entire industries and reimagine how work gets done,” said Swami Sivasubramanian, vice president of Data and AI at AWS.
“Today’s announcement is a major milestone that puts generative AI at the fingertips of every business, from startups to enterprises, and every employee, from developers to data analysts.
“With powerful, new innovations AWS is bringing greater security, choice, and performance to customers, while also helping them to tightly align their data strategy across their organization, so they can make the most of the transformative potential of generative AI.”
AWS has already drawn in thousands of new customers through Bedrock, and has doubled down on its collaboration with its multitude of developer partners. On September 25, AWS announced a $4 billion investment in Anthropic, the developer of the safety-focused Claude chatbot, a vote of confidence in the company and the wider Bedrock offering.
Under the deal, which will see AWS invest an initial $1.25 billion in Anthropic, the two companies will work together to improve Anthropic’s offerings using AWS infrastructure.
Rory Bathgate is Features and Multimedia Editor at ITPro, overseeing all in-depth content and case studies. He can also be found co-hosting the ITPro Podcast with Jane McCallion, swapping a keyboard for a microphone to discuss the latest learnings with thought leaders from across the tech sector.
In his free time, Rory enjoys photography, video editing, and good science fiction. After graduating from the University of Kent with a BA in English and American Literature, Rory undertook an MA in Eighteenth-Century Studies at King’s College London. He joined ITPro in 2022 as a graduate, following four years in student journalism. You can contact Rory at email@example.com or on LinkedIn.