AWS and Hugging Face partner to ‘democratise’ ML, AI models


Amazon Web Services (AWS) and machine learning development firm Hugging Face have announced a long-term partnership with the aim of improving access to machine learning (ML) models and lowering the cost of ML for developers.

The pair have signed a non-exclusive agreement that will enable developers to easily build and train next-generation ML models from Hugging Face on the AWS cloud, with the aim of democratising the technology.


AWS will serve as Hugging Face’s preferred cloud provider, giving the firm’s developer community access to AWS’ range of artificial intelligence (AI) tools.

Users will be able to easily move models available for free on Hugging Face to Amazon SageMaker, AWS’ managed machine learning service, through which researchers and developers can build, train, and deploy ML models.

These models are hosted on Amazon Elastic Compute Cloud (EC2), AWS’ scalable computing service, and SageMaker automates much of the lengthy training process using AWS’ sizeable cloud infrastructure and hardware dedicated to training models at low cost.
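
As an illustration of that workflow, the snippet below uses the SageMaker Python SDK’s Hugging Face integration to deploy a model straight from the Hugging Face Hub to a managed endpoint. This is a minimal sketch, not part of the announcement: the model ID, instance type, and framework version strings are illustrative assumptions, and supported combinations should be checked against the SageMaker documentation.

```python
import sagemaker
from sagemaker.huggingface import HuggingFaceModel

# IAM role SageMaker assumes; this resolves automatically inside a SageMaker
# notebook, otherwise pass a role ARN with SageMaker permissions instead.
role = sagemaker.get_execution_role()

# Point the serving container at a free model on the Hugging Face Hub
# (model ID and task are examples, not specifics from the announcement)
hub_env = {
    "HF_MODEL_ID": "distilbert-base-uncased-finetuned-sst-2-english",
    "HF_TASK": "text-classification",
}

# Framework versions are assumptions; pick a combination the SDK supports
model = HuggingFaceModel(
    env=hub_env,
    role=role,
    transformers_version="4.26",
    pytorch_version="1.13",
    py_version="py39",
)

# Deploy to a real-time endpoint and run a quick test prediction
predictor = model.deploy(initial_instance_count=1, instance_type="ml.m5.xlarge")
print(predictor.predict({"inputs": "AWS and Hugging Face announce a partnership."}))

# Delete the endpoint when finished to avoid ongoing charges
predictor.delete_endpoint()
```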

Hugging Face identified AWS Trainium, the second-generation chip designed by Amazon for deep learning training, and AWS Inferentia, its accelerator for high-performance inference, as two tools that Amazon can uniquely offer its community.
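
On the training side, the same SDK exposes a HuggingFace estimator, and in principle the accelerator is chosen simply by selecting an instance type. The sketch below assumes a Trainium-backed `ml.trn1.2xlarge` instance and illustrative framework versions; Trainium training additionally requires a Neuron-compatible container, so treat the specifics as assumptions rather than the partners’ documented setup.

```python
from sagemaker.huggingface import HuggingFace

# Illustrative sketch: fine-tune a model with a user-supplied training script.
# The role ARN, S3 path, instance type, and version strings are placeholders.
estimator = HuggingFace(
    entry_point="train.py",            # your own training script
    source_dir="./scripts",            # directory containing train.py
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder role
    instance_type="ml.trn1.2xlarge",   # hypothetical Trainium-backed instance
    instance_count=1,
    transformers_version="4.26",       # assumed supported combination
    pytorch_version="1.13",
    py_version="py39",
    hyperparameters={"epochs": 1, "model_name_or_path": "distilbert-base-uncased"},
)

# Launch the managed training job against data staged in S3
estimator.fit({"train": "s3://my-bucket/train"})
```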

In extending the pair’s existing partnership and working even more closely on training ML models at scale, Hugging Face has expressed hope that its open source collection of pre-trained natural language processing (NLP) models will become easier to scale and deploy.

The firm also spoke to the current inaccessibility of generative AI, which it intends to address by putting the large language model (LLM) technology that powers tools such as ChatGPT into the hands of its developer community, with the backing of AWS.

In a blog post on the announcement, Hugging Face stated that it and AWS will “contribute next-generation models to the global AI community and democratise machine learning.”

Hugging Face previously released BLOOM, its own open source LLM trained on 46 natural languages and 13 programming languages, with 176 billion parameters making it similar in scale to OpenAI’s GPT-3.

“The future of AI is here, but it’s not evenly distributed,” said Clement Delangue, CEO at Hugging Face.

“Accessibility and transparency are the keys to sharing progress and creating tools to use these new capabilities wisely and responsibly. Amazon SageMaker and AWS-designed chips will enable our team and the larger machine learning community to convert the latest research into openly reproducible models that anyone can build on.”

Hugging Face hosts more than 100,000 free ML models, which are downloaded more than 1 million times daily, and AWS is already a popular platform for Hugging Face developers.

What is AWS’ place in the AI landscape?

AWS has a long history of supporting development in the AI space. In 2021, Meta chose AWS to help expand its AI services, and the firm has focused great attention on custom chips such as its Inferentia silicon to help customers run ML models more efficiently.

But in recent months, Amazon has stayed largely out of the discussion around generative AI, even as competitors have publicly embraced the technology.

In addition to adding ChatGPT to Azure OpenAI, Microsoft has pinned its hopes on integrating the popular chatbot within Bing to deliver an improved search experience.

Google has also bet big on Bard, its answer to ChatGPT, which will be implemented within the search giant’s own platform in the near future.

“Everyone wants to make sure they aren’t behind Microsoft, which demonstrated what can be achieved and the level of interest,” Bola Rotibi, chief of enterprise research at CCS Insight, told ITPro.

“There was always going to be an element of competitive jostling in this space, and we’ve seen what has come out of Microsoft’s investment in OpenAI. The strength of working on AWS’ platform, tightening it up in terms of providing and improving the engine power is significant. It means not just working on the cloud, but also working directly with the engineering teams at these companies to get the horsepower which can only be achieved through this partnership.”

“From AWS’ point of view, it’s also a smart and pragmatic move. ‘Accessible AI’ is a good narrative in terms of opening up to the wider community to train these models. As with anything, the devil is in the details; for AWS, having these large models run on their machines allows them to fine-tune them.

“The fact that Hugging Face has announced AWS as the preferred provider, similar to OpenAI’s relationship with Microsoft, can only offer benefits for everybody.”

Rory Bathgate
Features and Multimedia Editor

Rory Bathgate is Features and Multimedia Editor at ITPro, overseeing all in-depth content and case studies. He can also be found co-hosting the ITPro Podcast with Jane McCallion, swapping a keyboard for a microphone to discuss the latest learnings with thought leaders from across the tech sector.

In his free time, Rory enjoys photography, video editing, and good science fiction. After graduating from the University of Kent with a BA in English and American Literature, Rory undertook an MA in Eighteenth-Century Studies at King’s College London. He joined ITPro in 2022 as a graduate, following four years in student journalism. You can contact Rory at rory.bathgate@futurenet.com or on LinkedIn.