
How much is your data really worth?

Data is incredibly valuable, but it’s not worth anywhere near as much without the analytical infrastructure to make full use of it


Over the last few years, it has become clear that data is an incredibly valuable asset, particularly now that the arrival of GDPR has focused attention on how much non-compliance with personal data protection regulations could cost companies. Client databases have become the lifeblood of successful sales strategies, user behaviour data has allegedly been employed successfully to interfere with election outcomes, and just-in-time manufacturing revolves around comprehensive feedback on every stage of the supply chain.

Most of the biggest new companies of the last decade (Google, Facebook, Uber, Airbnb, Palantir, Didi Chuxing) base their businesses on the value of their data and how they use it. But unlike physical assets such as raw commodities or manufacturing equipment, it's very hard to put a definitive value on the data itself. Whilst the idea that "data is currency" has become something of a catchphrase over the last few years, and blockchain technologies have attempted to weld money and information together algorithmically, the true denomination of this data-currency remains opaque.

Normally, the value of data is not separated from the overall value of a company and its business. The data aspect has generally only been separated out when it has been lost or compromised in some way. For example, in 2017 the consumer credit agency Equifax suffered a data breach affecting 143 million consumers. The resulting class action could cost the company up to $70 billion. But that's only one way of valuing data, revolving around how much losing personal information might damage the user. It doesn't say very much about how much that data might be worth to Equifax's business. With 2.5 quintillion bytes of data being produced every day globally, according to the World Economic Forum, getting a handle on the monetary value of information has become essential.

The problem, however, is that the value of data isn't just a function of quantity or volume, but of how it can be used, and that depends on how the data is structured and analysed, amongst other things. Time is even more influential here than it is with hardware, which depreciates as it becomes obsolete. Not only does data need to be fresh and up to date; it also has to be delivered at exactly the right time, and it may even be collected and used in real time. Internet of Things sensor readings controlling an environmental system must be delivered and processed promptly to maintain settings, whilst data about the locations of taxis, the customers wanting to travel, and where they want to go is only useful while it's still true. There's no point sending a taxi when the traveller left 20 minutes ago.
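One way to make this time dimension concrete is a simple decay model, in which a piece of data loses half its value over a fixed staleness interval. This is a minimal illustrative sketch, not a formula from the article; the function name and the notion of a "half-life" for data are assumptions chosen for clarity:

```python
def timely_value(base_value: float, age_seconds: float,
                 half_life_seconds: float) -> float:
    """Illustrative model: data's value halves every half-life as it goes stale.

    A taxi location feed might have a half-life of a minute or two;
    a customer address database might have one measured in months.
    """
    return base_value * 0.5 ** (age_seconds / half_life_seconds)

# Fresh data keeps its full value; two half-lives later it has a quarter left.
print(timely_value(100.0, 0.0, 60.0))    # fresh
print(timely_value(100.0, 120.0, 60.0))  # two minutes old, one-minute half-life
```

The exponential shape captures the article's point about the 20-minute-old taxi request: past a few half-lives, the data is effectively worthless regardless of how much it cost to collect.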

That data is valuable is obvious, but the method for working out just how valuable is obscure, so there have been serious attempts to quantify it for insurance purposes, albeit through a very complex process. Gartner analyst Doug Laney has defined six different ways of valuing data. With financial methods of estimation, data can be valued according to how much it would cost to replace, how much it would fetch if sold, or how much it contributes to everyday revenue. With non-financial methods of estimation, data can be valued depending on its intrinsic worth to the company, its usefulness for business purposes, or how it drives other aspects of the business.

Since Laney's system is aimed at the insurance business, it involves complicated mathematics, marketplace valuation, and revenue modelling to provide as much precision as possible. There are complex formulae revolving around the accuracy of the data, how complete the set is, how easy it is to access, and whether it's a scarce resource nobody else has. Relevance is key, as is how much the data contributes to business objectives and key performance indicators.
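Laney's actual formulae are part of his Gartner work and aren't reproduced here, but the general shape of such a valuation can be sketched: score the data set on the quality dimensions mentioned above (accuracy, completeness, accessibility, scarcity) and use the result to scale a financial baseline such as the cost to replace the data. Every name, dimension choice, and weighting below is an illustrative assumption, not Laney's method:

```python
from dataclasses import dataclass

@dataclass
class DataQuality:
    """Quality dimensions, each scored from 0.0 (worthless) to 1.0 (ideal)."""
    accuracy: float       # fraction of records known to be correct
    completeness: float   # fraction of required fields populated
    accessibility: float  # how easily analysts can actually reach the data
    scarcity: float       # 1.0 = unique to us, 0.0 = freely available

def quality_score(q: DataQuality) -> float:
    """Geometric mean of the dimensions, so a zero anywhere zeroes the score.

    This reflects the article's point: a huge but inaccessible or
    inaccurate data set is worth little, whatever its volume.
    """
    dims = [q.accuracy, q.completeness, q.accessibility, q.scarcity]
    product = 1.0
    for d in dims:
        product *= d
    return product ** (1 / len(dims))

def adjusted_value(replacement_cost: float, q: DataQuality) -> float:
    """Scale a cost-to-replace baseline by the quality score."""
    return replacement_cost * quality_score(q)

# A customer database costing 1m to rebuild, accurate and complete
# but only moderately scarce and somewhat awkward to access:
db = DataQuality(accuracy=0.9, completeness=0.8,
                 accessibility=0.7, scarcity=0.5)
print(round(adjusted_value(1_000_000, db)))
```

The geometric mean is a deliberate design choice: unlike a simple average, it punishes a weak dimension severely, matching the intuition that relevance and accessibility are gating factors rather than optional extras.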

This is all quite esoteric, but it highlights the fact that one of the most important underlying factors in data's value comes from the speed and accuracy of its analysis. This is the point where the static ones and zeros of the data itself blend into the hardware infrastructure that stores, delivers, and performs useful operations on that data. The faster data can be delivered and processed, the more relevant it will be to the person accessing it. So, a rapid, reliable hardware platform for your data is an integral part of realising its value.

Technological improvement can enhance the true value of data at every level. The general rule of thumb is that the more data you have, the better, but that leads to massive storage requirements. The cost of solid-state drives (SSDs) continues to fall, making all-flash storage arrays increasingly affordable. These also provide much faster random access than traditional hard disks, though this can come at the expense of hardware longevity. Recently, however, 3D XPoint memory, as used in Intel Optane SSDs, has arrived boasting even faster transactional speed and much better durability than traditional SSDs. This makes it ideal for keeping data safe and readily accessible, enhancing its value.

You also need to ensure that once data has been loaded into an application server, it is processed as quickly as possible. The faster your analytics can be performed, the more valuable that data will be to your business. This necessitates a hardware platform with high core density, high processing power per core, and room to scale as your requirements grow. Intel's Xeon Scalable architecture provides all of these features.

Few would deny that data is becoming one of the most valuable assets many companies have. But on its own, without the hardware platforms to do it justice, data's true value will be reduced or even wasted. There's no point spending a fortune amassing a wealth of information without also spending adequately on the hardware to take full advantage of it. That's the truly wise way to spend the currency you have created with your data.

Is your business ready for IT Transformation? Discover more from Intel here.
