Google’s Hugging Face partnership shows the future of generative AI rests on open source collaboration
Closer ties between Google and Hugging Face show the tech giant is increasingly bullish on open source AI development
Google’s new partnership with Hugging Face marks another significant nod of approval from big tech on the potential of open source AI development, industry experts have told ITPro.
The tech giant recently announced a deal that will see the New York-based startup host its AI development services on Google Cloud.
Hugging Face, which describes the move as part of an effort to “democratize good machine learning”, will now collaborate with Google on open source projects using the tech giant’s cloud services.
With such a sizable number of models already available on Hugging Face, the attraction on Google’s end is clear, Gartner VP analyst Arun Chandrasekaran told ITPro. The deal is another example of a major industry player fostering closer ties with high-growth AI startups.
“Hugging Face is the largest hosting provider of open source models on the planet - it's basically GitHub for AI,” he said.
“This is why big firms like Google want to partner with it, because it can leverage the huge range of models available on the platform,” he added.
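To give a concrete sense of that “GitHub for AI” role, the snippet below is a minimal, illustrative sketch, not taken from the announcement, of how a developer pulls an openly licensed model from the Hugging Face Hub with the transformers library. The choice of “gpt2” is simply a small, permissively licensed example; any open model ID on the Hub would work the same way.

```python
# Illustrative sketch: downloading and running an open model from the
# Hugging Face Hub with the transformers library. "gpt2" is just a small,
# permissively licensed example model, not one named in the article.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("Open source AI is", max_new_tokens=20)
print(result[0]["generated_text"])
```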
For Hugging Face, the deal will offer speed and efficiency for its users through Google’s cloud services.
Many Hugging Face users already use Google Cloud, the firm said, and this collaboration will grant them a greater level of access to AI training and deployment through Google Kubernetes Engine (GKE) and Vertex AI.
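As a rough illustration of what that deployment path can look like, the sketch below uses Google’s google-cloud-aiplatform Python SDK to upload and serve a Hugging Face model on Vertex AI. It is a hedged example rather than a description of the new integration itself: the project ID, region, serving container URI, environment variables, and machine type are placeholders and assumptions, not details from the announcement.

```python
# Hedged sketch of serving a Hugging Face model on Vertex AI with the
# google-cloud-aiplatform SDK. Project, region, container URI, environment
# variables, and machine type are illustrative placeholders, not details
# taken from the Google/Hugging Face announcement.
from google.cloud import aiplatform

aiplatform.init(project="my-gcp-project", location="us-central1")

model = aiplatform.Model.upload(
    display_name="hf-open-model",
    # Placeholder: a prebuilt inference container able to serve the model.
    serving_container_image_uri="<inference-container-image-uri>",
    # Assumed convention: many Hugging Face inference containers read the
    # model ID and task from environment variables like these.
    serving_container_environment_variables={
        "HF_MODEL_ID": "gpt2",
        "HF_TASK": "text-generation",
    },
)

# Deploy to a managed endpoint; the machine type is an arbitrary example.
endpoint = model.deploy(machine_type="n1-standard-8")

# Send a test prediction request to the deployed endpoint.
response = endpoint.predict(instances=[{"inputs": "Open source AI is"}])
print(response.predictions)
```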
Google is also nipping at the heels of AWS with this partnership. Amazon’s cloud subsidiary announced a similar agreement with Hugging Face to accelerate the training of LLMs and vision models in February 2023.
Again, the framing was one of AI altruism. Both Hugging Face and AWS cited a desire to make it “easier for developers to access AWS services and deploy Hugging Face models specifically for generative AI applications”.
Big tech firms have sharpened their focus on open source AI development over the last year, and there are a number of contributing factors to this, according to Chandrasekaran, especially with regard to driving broader adoption of generative AI.
“Rewind back nine months, and the AI landscape was dominated by closed source LLMs - think OpenAI or, to a lesser extent, Google,” he said.
“Since then, open source models have increased in volume and quality. Increasingly, they’re being used in the enterprise because open source content is often licensed for commercial use,” he added. “At the same time, the quality of the actual models is improving.”
Open source models aren’t licensed for commercial use by definition, but they are more likely to be than closed models. This makes them an attractive option for enterprise use, as businesses know they can roll out these models freely within their own organizations or on the open market.
Meta, for example, was among the first major tech firms to make a big statement on open source AI development last year with the launch of Llama 2.
Google eyes open source as the ticket to overcoming 'AI obstacles'
With issues of cost and scalability to contend with, and increasing regulation around the corner, open source appears to answer many of the big questions hanging over AI.
“Why is everyone interested in open source AI? The same reason they’re interested in open source in general,” Chandrasekaran said.
“It allows customizability, ease of use, and adaptability in the face of regulation,” he added.
Matt Barker, global head of cloud native services at Venafi, told ITPro there’s already a symbiotic relationship between open source and AI.
“Open source is already inherent in the foundations of AI,” he said. “Open source is foundational to how they run. Kubernetes, for example, underpins OpenAI,” he added.
“Open source code itself is also getting an efficiency boost thanks to the application of AI to help build and optimize it. Just look at the power of applying Copilot by GitHub to coding, or K8sGPT to Kubernetes clusters. This is just the beginning.”

George Fitzmaurice is a former Staff Writer at ITPro and ChannelPro, with a particular interest in AI regulation, data legislation, and market development. After graduating from the University of Oxford with a degree in English Language and Literature, he undertook an internship at the New Statesman before starting at ITPro. Outside of the office, George is both an aspiring musician and an avid reader.