‘Awesome for the community’: DeepSeek open sourced its code repositories, and experts think it could give competitors a scare
The Chinese AI firm has gone one further than its competitors, though there are limits to what it’s revealed

Challenger AI startup DeepSeek has open-sourced some of its code repositories in a move that experts told ITPro puts the firm ahead of the competition on model transparency.
In a post to X late last month, DeepSeek said it would be open sourcing five of its code repositories in a bid to share what it called its “small but sincere progress with full transparency.”
“These humble building blocks in our online service have been documented, deployed, and battle-tested in production,” DeepSeek wrote.
“As part of the open-source community, we believe that every line shared becomes collective momentum that accelerates the journey,” the firm added.
DeepSeek caused ripples of panic earlier this year when the sudden release of its models raised questions about the value of previously unchallenged US competitors - its models are competitive with the likes of OpenAI despite costing only a fraction of the price to build.
Its openness also stood in stark contrast to some large proprietary US models, industry analysts noted at the time. Now, DeepSeek has gone even further by promising to open-source the code behind its models. Its only competition in this regard is the likes of Meta's Llama, which has only open-sourced the weights of its models.
“To be clear, Llama has open weights, not open code - you can't see the training code or the actual training datasets. DeepSeek has gone a step further by open sourcing a lot of the code they use, which is awesome for the community,” Alistair Pullen, co-founder and CEO of Cosine, told ITPro.
“I think DeepSeek can probably feel comfortable giving their competitors a scare by doing stuff others won't do - it does diminish their edge, but they're not wholly a model company,” Pullen added.
How open is DeepSeek?
DeepSeek is out ahead when it comes to open source AI, though that doesn’t mean it’s fully open in the traditional sense of the term. The firm hasn’t open-sourced the entirety of its code, nor key aspects of its model development such as training datasets.
The firm could gain an edge if it decides to share more of its code, according to Peter Schneider, senior product manager at Qt Group.
Code transparency is a major sticking point for security, and greater openness could drive more community engagement, Schneider told ITPro.
“If they wanted to go the extra mile differentiating themselves, releasing their full training data and methodologies would certainly set a new standard for transparency in the AI race," Schneider added.
Industry positive about open source move
Experts have been largely positive about DeepSeek’s decision, with Pullen saying the move gives users a greater level of control and access. He pointed to the ‘reinforcement learning algorithm’ DeepSeek has released, a far less memory-intensive approach than comparable methods.
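For readers curious why the approach saves memory, the sketch below illustrates the core idea behind DeepSeek’s published GRPO algorithm (assuming this is the reinforcement learning work Pullen refers to): each sampled answer is scored against the average of its own group, so no separate value (critic) network has to be trained and held in memory, as a PPO-style setup would require.

```python
# Minimal sketch of group-relative advantage estimation, the idea behind
# DeepSeek's published GRPO algorithm (an assumption about which
# "reinforcement learning algorithm" is meant here). Unlike PPO, there is
# no separate value/critic model, which is where the memory saving comes from.
import numpy as np

def group_relative_advantages(rewards: np.ndarray) -> np.ndarray:
    """Score each sampled completion against the mean of its own group.

    rewards: shape (num_prompts, group_size), one scalar reward per
             sampled completion for each prompt.
    """
    mean = rewards.mean(axis=1, keepdims=True)
    std = rewards.std(axis=1, keepdims=True) + 1e-8  # avoid divide-by-zero
    return (rewards - mean) / std  # advantages computed without a value model

# Example: 2 prompts, 4 sampled completions each
rewards = np.array([[1.0, 0.0, 0.5, 0.0],
                    [0.2, 0.8, 0.8, 0.2]])
print(group_relative_advantages(rewards))
```

This is a simplified illustration rather than DeepSeek’s production code; the open-sourced repositories themselves cover infrastructure components such as kernels and parallelism libraries.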
“Open-source AI models appeal to users because they offer greater flexibility, fine-tuning capabilities, and fewer vendor restrictions. But beyond that, the real advantage comes from the collective intelligence of the global open-source community,” Dirk Alshuth, cloud evangelist at emma, told ITPro.
“The number of contributors can grow to thousands, which ultimately leads to more robust models, innovative use cases, and applications built on top,” Alshuth said.
Continued transparency on this front will boost community engagement and give DeepSeek a unique selling point when compared to its largely closed-source competitors, Alshuth added.
“DeepSeek’s decision to share some of its AI model code is a welcome step toward greater openness in AI development,” Schneider said.
Open source AI is a tough nut to crack
DeepSeek has pushed the definition of open source AI further, though this is just the latest in an ongoing conversation about how open source is defined in the AI arena when the technology is so fundamentally different from what’s gone before.
Speaking to ITPro at the time of Llama 3’s release in 2024, Open UK CEO Amanda Brock said that current conversations around open source AI are stretching historic definitions of open source to their limits.
While elements of an AI application may be made open under a typical open source license, Brock said, other elements of an AI application may not lend themselves as easily to this definition.
Gradients, or “shades of openness,” could be the solution, Brock added, whereby different elements of an AI application are assigned different licenses.
George Fitzmaurice is a staff writer at ITPro, ChannelPro, and CloudPro, with a particular interest in AI regulation, data legislation, and market development. After graduating from the University of Oxford with a degree in English Language and Literature, he undertook an internship at the New Statesman before starting at ITPro. Outside of the office, George is both an aspiring musician and an avid reader.