The CVE system isn’t working – what's next?
With 2025's funding scare underlining key weaknesses in the CVE system, what should businesses be doing to source intelligence about security vulnerabilities?
In April 2025, it emerged that US federal funding for the MITRE Corporation's Common Vulnerabilities and Exposures (CVE) database and related programs was being cut. The move created widespread panic throughout the industry, but at the last minute the program was handed a reprieve.
The episode sparked discussion about the risks of relying on a single source of vulnerability intelligence – concerns that remain valid nearly a year on.
It comes as research from Sonatype reveals the National Vulnerability Database (NVD) – the index run by the US National Institute of Standards and Technology (NIST) to provide additional detail about flaws – is not up to scratch.
Scores are often inaccurate and arrive too late, according to the data. Of the 1,552 open source vulnerabilities disclosed in 2025, 64% lacked severity scores from the NVD, the study found. Over the past year, there was a mean delay of more than six weeks between disclosure and NVD scoring, with some advisories taking up to 50 weeks.
As attackers continue to take advantage of holes in software, detailed and up-to-date vulnerability data is crucial. It’s clear the current system isn’t working, so what’s next?
A wake-up call
Experts say the issues in April 2025 highlighted the problems of relying on one source for vulnerability information. The MITRE disruption was “a wake-up call”, says Joe Brinkley, head of offensive security at Cobalt. “It made us question whether we really want the entire industry depending on one single point of failure for vulnerability intelligence.”
Even after the funding issues were resolved, the bigger questions remained, he says: “Is the current model sustainable? Do we need to move toward something more community-driven that doesn't require one central authority to approve everything before we can act?”
The biggest issue with the CVE/NVD program has been “the lack of structural resilience” – mainly its reliance on US government funding, says João Oliveira, security researcher with Checkmarx Zero. At the same time, the growing backlog of vulnerabilities has continued to impact security teams, says Oliveira. “While this problem did not start in 2025, it has become more apparent than ever with the increase in the number of reported and published vulnerabilities – many of which are of poor quality when initially reported, requiring additional analysis time.”
The NVD CVE Enrichment process hasn’t been able to keep up with the volume of CVEs, and critical vulnerabilities can take days to receive complete analysis, says Oliveira. For security teams that depend on timely CVSS scoring, proper understanding of the affected software and relevant insights into the vulnerability details, delays “create operational risk and slow down remediation when speed matters most”, he says.
Despite its flaws, experts say the CVE program is still important for vulnerability management. “The brief threat of a shutdown in April 2025 showed just how much chaos its loss would create,” says Shane Fry, CTO of RunSafe Security. “Vulnerability management is already difficult. Without the CVE program, response times would slow even further than what we’re seeing now. If defenses slow, global risk rises, and that’s something we can’t afford.”
Funding and governance change
It won’t happen immediately, but many industry experts believe there is a need for funding and governance change. This is the vision being promoted by The CVE Foundation – a coalition of longtime, active CVE Board members – aiming to evolve the program toward a more independent and globally representative model.
At the same time, the Exploit Prediction Scoring System (EPSS) has emerged to measure the real-world impact of vulnerabilities. Developed by a volunteer group of researchers, practitioners, academics and government personnel, this is now “widely accepted as a standard metric for risk-based prioritization”, says Oliveira.
“Unlike KEV, which only catalogs vulnerabilities once exploitation is confirmed, EPSS predicts the likelihood that exploitation activity will be observed related to a particular CVE record, providing a proactive approach to assessing real-world risk.”
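To illustrate what this looks like in practice, here is a minimal sketch (in Python, using the requests library) of pulling an EPSS score for a single CVE from the public FIRST.org API. The endpoint and field names reflect the published API, but they should be verified against current documentation before use; this is an illustration, not production code.

```python
# Minimal sketch: query the public FIRST.org EPSS API for a CVE's
# predicted exploitation probability and percentile.
import requests

def get_epss_score(cve_id: str) -> dict | None:
    """Return the EPSS score and percentile for a CVE, or None if absent."""
    resp = requests.get(
        "https://api.first.org/data/v1/epss",
        params={"cve": cve_id},
        timeout=10,
    )
    resp.raise_for_status()
    records = resp.json().get("data", [])
    if not records:
        return None
    record = records[0]
    return {
        "cve": record["cve"],
        # EPSS expresses the probability of observed exploitation activity
        # in the next 30 days
        "epss": float(record["epss"]),
        "percentile": float(record["percentile"]),
    }

if __name__ == "__main__":
    print(get_epss_score("CVE-2021-44228"))  # e.g. Log4Shell
```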
However, EPSS relies on machine learning and is inherently probabilistic, which can lead to lower-quality decisions by security teams related to vulnerabilities that have already been observed in the wild, says Oliveira.
Meanwhile, he points out, the CVE program and the NVD have failed to integrate EPSS into CVE records. “Instead, they continue to rely heavily on CVSS base scores alone – which has been shown to poorly reflect real-world risk – and on indicators like KEV. These are not comprehensive enough for the backlog of vulnerabilities, only become useful once exploitation is already confirmed, and are not updated to reflect ongoing changes and trends in risk.”
Grant Robertson, manager at Black Duck, highlights the benefits of a “global unified vulnerability database”. At the moment, alternative sources of vulnerability data include the EU Vulnerability Database, the Japan National Vulnerability Database, the China National Vulnerability Database, Google’s Open Source Vulnerabilities (OSV) database and the GitHub Advisory Database.
OSV addresses several limitations of the CVE program, particularly around software identification, but it has shortcomings of its own. It is therefore a supplement, rather than a replacement, for the CVE and NVD programs, says Oliveira.
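As a rough sketch of how such a supplementary source can be queried, the Python snippet below checks a single package version against OSV's public query endpoint. The endpoint and payload format follow OSV's documented API, and the lodash example is purely illustrative; confirm the details against the current OSV documentation.

```python
# Minimal sketch: check one open source package version against
# Google's OSV database via its public query endpoint.
import requests

def osv_vulnerabilities(ecosystem: str, name: str, version: str) -> list[str]:
    """Return IDs of known OSV advisories affecting the given package version."""
    resp = requests.post(
        "https://api.osv.dev/v1/query",
        json={
            "package": {"ecosystem": ecosystem, "name": name},
            "version": version,
        },
        timeout=10,
    )
    resp.raise_for_status()
    return [vuln["id"] for vuln in resp.json().get("vulns", [])]

if __name__ == "__main__":
    # Example: an old lodash release with known advisories
    print(osv_vulnerabilities("npm", "lodash", "4.17.15"))
```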
Another, the Global CVE Allocation System (GCVE), is an EU-led program established with the explicit aim of providing organizations with a decentralized vulnerability database and reducing reliance on US-based systems.
What’s next for CVE?
The system may change, but for now, firms need to use multiple data sources for vulnerability management to ensure they are up to date. Relying solely on the CVE program for vulnerability data is “not a sound strategy”, says Fry. “Instead, organizations that use a diverse set of vulnerability data sources will have more reliable insight into the flaws they’re affected by.”
Effective vulnerability management today requires a risk-based approach, Oliveira advises. “Metrics such as EPSS and KEV are key for understanding real-world risk, while a timely and comprehensive assessment of vulnerabilities is essential to keep the supply chain secure and uninterrupted.”
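As a rough illustration of what such a risk-based pass might look like, the Python sketch below flags anything already listed in CISA's Known Exploited Vulnerabilities (KEV) catalog and ranks the remainder by EPSS score. The KEV feed URL and JSON field names are assumptions based on the published catalog and may change, so treat this as a starting point rather than a finished tool.

```python
# Minimal sketch of risk-based triage: prioritize CVEs confirmed as
# exploited (KEV), then rank the rest by predicted exploitation
# likelihood (EPSS). Feed URL and field names are assumptions.
import requests

KEV_URL = "https://www.cisa.gov/sites/default/files/feeds/known_exploited_vulnerabilities.json"
EPSS_URL = "https://api.first.org/data/v1/epss"

def prioritize(cve_ids: list[str]) -> list[tuple[str, str, float]]:
    """Return (cve, reason, score) tuples, highest priority first."""
    # CVEs with confirmed exploitation, per CISA's KEV catalog
    kev_ids = {
        item["cveID"]
        for item in requests.get(KEV_URL, timeout=30).json()["vulnerabilities"]
    }
    # EPSS scores for the CVEs of interest (comma-separated batch query)
    epss_rows = requests.get(
        EPSS_URL, params={"cve": ",".join(cve_ids)}, timeout=30
    ).json().get("data", [])
    scores = {row["cve"]: float(row["epss"]) for row in epss_rows}

    ranked = []
    for cve in cve_ids:
        if cve in kev_ids:
            ranked.append((cve, "known exploited (KEV)", 1.0))
        else:
            ranked.append((cve, "predicted risk (EPSS)", scores.get(cve, 0.0)))
    return sorted(ranked, key=lambda r: r[2], reverse=True)

if __name__ == "__main__":
    print(prioritize(["CVE-2021-44228", "CVE-2023-12345"]))
```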
The best solution for most enterprises is a vendor-enriched database of vulnerabilities, says Oliveira. Organizations seeking a free and open source solution can leverage an open source vulnerability database such as OSV, he says.
But at the same time, companies need to recognize that vulnerability visibility is imperfect and likely to stay that way, says Fry. One area to invest in is software protection that doesn’t require perfect knowledge of every vulnerability, he suggests. “You have to assume unknown vulnerabilities exist in your software, then deploy security protections that prevent exploitation before patches or CVE scores are available.”
Kate O'Flaherty is a freelance journalist with well over a decade's experience covering cyber security and privacy for publications including Wired, Forbes, the Guardian, the Observer, Infosecurity Magazine and the Times. Within cyber security and privacy, her specialist areas include critical national infrastructure security, cyber warfare, application security and regulation in the UK and the US amid increasing data collection by big tech firms such as Facebook and Google. You can follow Kate on Twitter.


