Choosing processors that complete more work with fewer physical servers could help data centers offset their overall power consumption and, ultimately, their carbon footprints.
That’s according to the Vanguard report from 451 Research, which also concluded that businesses stand a better chance of offsetting their carbon footprints by scrutinizing both their on-premises and cloud technologies.
Data centers measure energy efficiency with the power usage effectiveness (PUE) metric, which compares the total energy entering a facility, including cooling, server room design overhead, and lighting, against the energy consumed by the IT equipment itself.
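The ratio behind the metric can be sketched in a few lines; the figures below are illustrative only, not drawn from the report.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy / IT equipment energy.

    1.0 is the theoretical ideal (every watt goes to IT gear); real
    facilities sit above that because of cooling, lighting, and other
    overhead.
    """
    return total_facility_kwh / it_equipment_kwh

# Hypothetical example: a facility drawing 1,500 kWh overall while its
# servers, storage, and networking consume 1,000 kWh has a PUE of 1.5,
# i.e. 0.5 kWh of overhead for every kWh of useful IT work.
print(pue(1_500, 1_000))  # 1.5
```

The lower the PUE, the less energy the facility spends on anything other than computation.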
In 2019, US data centers consumed approximately 268TWh of energy, representing 6.3% of America’s total energy usage, according to 451 Research, more than the entire country of Mexico consumed.
These numbers are continuing to climb, despite new governmental regulations and the corporate introduction of ‘green’ initiatives.
IT operations are a growing contributor to climate change, and many in the IT industry are now concerned about the environmental impact of their tech stacks.
451 Research’s Digital Pulse user survey, for example, found that nearly half of the IT decision-makers it polled said that their IT operations now account for most or all of their environmental impact.
However, the survey also found that modernizing core IT infrastructure and the adoption of public cloud services were the strategies that customers were most commonly adopting to meet their environmental objectives.
John Abbott, principal research analyst at 451 Research, noted in the report: “Simply driving up power consumption to deliver higher performance in new generations of CPUs is unsustainable”.
A potential solution here can be found within processors. Chip manufacturers are breaking new ground with smaller, more powerful, and more intelligent processors, expanding on the traditional x86-64 architecture.
These new workload-specific accelerator chips are used as part of a heterogeneous computing architecture to complement the capabilities of general-purpose CPUs, offering new levels of performance and control over power draw.
AMD is one of these companies; in 2020, the chipmaker said its goal was to deliver a 30x increase in energy efficiency by 2025 for AI training and high-performance computing applications running on accelerated compute nodes.
If that goal is achieved, billions of kilowatt-hours of electricity could be saved by 2025, according to 451 Research.
“While it takes some effort to leverage these CPU and workload accelerator combinations, the cost savings offered can make it worthwhile for enterprises to adopt them from a purely budgetary standpoint,” Abbott noted in the report.
Considering what technology sits at the heart of the data center – the processor – should be the logical first step, according to AMD.
Choosing a processor that can meet the required level of computational work, but with fewer physical servers, can immediately reduce a data center’s carbon footprint and help reduce the facility's overall power consumption.
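The arithmetic behind that claim is straightforward. As a back-of-envelope sketch, with every figure hypothetical rather than taken from the report, consolidating onto fewer, more capable servers compounds with the facility's PUE overhead:

```python
def annual_kwh(server_count: int, avg_watts: float, pue: float = 1.5) -> float:
    """Facility-level annual energy for a fleet of servers.

    Multiplying the IT draw by PUE folds in cooling, lighting, and
    other facility overhead on top of what the servers themselves use.
    """
    hours_per_year = 24 * 365
    return server_count * (avg_watts / 1000) * hours_per_year * pue

# Hypothetical consolidation: 100 older 400 W servers replaced by
# 40 newer 500 W servers delivering the same computational work.
before = annual_kwh(100, 400)  # 525,600 kWh/year
after = annual_kwh(40, 500)    # 262,800 kWh/year
print(f"Estimated saving: {before - after:,.0f} kWh/year")  # 262,800
```

Even though each newer server draws more power individually, the smaller fleet roughly halves the facility's annual consumption in this example.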