Presented by AMD
Open source: Why open ecosystems matter
Driving success with AI will require an industry-wide collaborative approach spanning the software and hardware domains


Open source is now very much mainstream. No longer confined to the fringes of the global technology industry, open source software, tools, and solutions have been embraced by enterprises of all sizes over the last decade.
Moreover, with the advent of the generative AI ‘boom’ in late 2022, the open source ecosystem has never been more important. While the industry trend was initially sparked by businesses pursuing a proprietary approach to the technology, open source options have become increasingly popular.
Analysis from OpenUK, a trade body for the open source ecosystem, shows that around 20,000 businesses have used open source AI in the past year, with a significant increase in the last three months.
Take Meta’s Llama model range, for example. While enterprises initially flocked to tools such as OpenAI’s ChatGPT, backed by big investment from Microsoft, Meta’s open source models have surged in popularity.
Indeed, March this year saw the tech giant surpass the one billion download mark for its Llama models, which now count an array of global businesses, such as Spotify, among their key users.
Meta isn’t alone in this strategy, either. On the other side of the Atlantic, French-based startup Mistral has rapidly positioned itself as a key player in the global generative AI space, again by pursuing an open source approach.
A number of factors have contributed to the popularity of open source in the AI era, particularly flexibility and cost.
The reality is that investment in AI is costly and raises serious issues for enterprises pursuing adoption strategies. The cost of proprietary solutions can be prohibitive and simply not viable for many enterprises, especially smaller ones. Similarly, the investment required to modernize underlying infrastructure and storage capabilities proves equally daunting.
Combine this with internal skills gaps, and what enterprises face is a confluence of challenges that limit their ability to fully capitalize on the technology and unlock tangible business benefits.
In contrast, open source has proven highly beneficial for businesses pursuing this approach. Analysis from IBM and Morning Consult, for example, found that companies harnessing open source AI tools are more likely to see a positive return on investment (ROI) than those using proprietary systems.
Indeed, the survey found two in five respondents not already using open source options plan to embrace this approach in the next year.
The infrastructure underpinning AI is crucial
Of course, open source has its benefits from a software and tooling perspective, but the hardware side is equally important. It’s the foundation upon which any successful AI strategy is built.
Leading providers in this domain, such as AMD and Nvidia, have made significant headway with regard to open source options and ecosystem commitments.
Nvidia, for example, contributes to a raft of open source projects such as the Linux kernel and PyTorch. The GPU provider has also committed to supporting a range of open source development projects in recent years.
AMD has made equally important contributions and commitments on this front, such as the Radeon Open Compute platform (ROCm), an open source software platform for high-performance computing (HPC) and machine learning workloads on AMD GPUs.
Notably, AMD has a dedicated open ecosystem strategy. It’s an approach ITPro noted ahead of attending the company’s Advancing AI event in San Francisco in late 2024.
The company has focused heavily on an ‘open standards’ approach to AI in recent years as part of a strategy spearheaded by chief executive Lisa Su. This has seen the company make acquisitions with open source in mind, including a deal for AI software company Nod.ai in October 2023.
At its most recent Advancing AI event in June 2025, the company once again doubled down on this commitment to open standards with open rack-scale infrastructure improvements.
Indeed, during her keynote speech at the event, AMD’s CEO Lisa Su made several references to the importance of maintaining a focus on openness.
Su told attendees that the tech giant is “investing heavily in an open, developer-first ecosystem,” which in practice means it is “really supporting every major framework, every library, and every model to bring the industry together in open standards so that everyone can contribute to AI innovation”.
Why open ecosystems matter
Hardware providers focusing on open ecosystem development will be vital in the coming years as the industry becomes increasingly interwoven, especially with recent developments around the Model Context Protocol (MCP).
This open source framework, first introduced by Anthropic in late 2024, aims to standardize how large language models (LLMs) and AI systems integrate with both data sources and external tools.
It marks a step change in how enterprise AI customers interact with the technology and providers, allowing for greater freedom of choice and flexibility, both key tenets of the open source community.
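MCP is built on JSON-RPC 2.0: a client (typically an AI application) sends structured requests to a server that exposes tools and data. As a rough illustration, a tool invocation under the protocol looks something like the sketch below. The tool name and arguments here are hypothetical, chosen purely to show the message shape, not drawn from any real MCP server.

```python
import json

# MCP exchanges JSON-RPC 2.0 messages between an AI client and a tool server.
# "tools/call" is the MCP method for invoking a tool the server has exposed.
# The tool name and arguments below are hypothetical, for illustration only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_documents",  # hypothetical tool on the server
        "arguments": {"query": "Q3 sales figures"},
    },
}

# A conforming server replies with a result carrying the same request id,
# so the client can match responses to in-flight calls.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "content": [{"type": "text", "text": "Found 3 matching documents."}],
    },
}

print(json.dumps(request, indent=2))
```

Because the message format is standardized, any model or application that speaks MCP can, in principle, call any MCP server's tools — which is precisely the vendor-neutral flexibility the open source community prizes.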
With AI adoption rates expected to keep rising, the collaborative nature of the open source ecosystem and open standards approaches to the technology will be vital in supporting development.
Juan José López Murphy, head of data science and AI at Globant, told ITPro the ecosystem’s long-standing reputation as a collaborative environment is a major draw.
“The open source ecosystem is central to this momentum,” he said. “It’s not just a collection of tools, it’s a living, collaborative network. People are sharing best practices, contributing improvements, and helping each other move faster.”
“This community-driven innovation is what allows open source AI to evolve so quickly,” Murphy added.
“That’s what keeps the ecosystem dynamic and ensures it continues to reflect values like openness, transparency, and shared progress even as AI becomes more tightly integrated into the commercial world.”
David Weinstein, co-founder and CEO at KayOS, echoed Murphy’s comments on this front, describing the open source community as a “force multiplier”.
“Open ecosystems foster collaborative thinking,” he said. “They normalize public iteration, good-faith disagreement, and building upon each other’s breakthroughs.”

Ross Kelly is ITPro's News & Analysis Editor, responsible for leading the brand's news output and in-depth reporting on the latest stories from across the business technology landscape. Ross was previously a Staff Writer, during which time he developed a keen interest in cyber security, business leadership, and emerging technologies.
He graduated from Edinburgh Napier University in 2016 with a BA (Hons) in Journalism, and joined ITPro in 2022 after four years working in technology conference research.
For news pitches, you can contact Ross at ross.kelly@futurenet.com, or on Twitter and LinkedIn.