How much is your data really worth?
Data is incredibly valuable, but it’s not worth anywhere near as much without the analytical infrastructure to make full use of it
Over the last few years, it has become clear that data is an incredibly valuable asset, particularly now that the arrival of GDPR has put a spotlight on how much it could cost companies that fail to comply with regulations on the protection of personal data. Client databases have become the lifeblood of successful sales strategies, user behaviour data has allegedly been employed to interfere with election outcomes, and just-in-time manufacturing revolves around comprehensive feedback on every stage of the supply chain.
Most of the biggest new companies of the last decade (Google, Facebook, Uber, Airbnb, Palantir, Didi Chuxing) base their businesses on the value of their data and how they use it. But unlike physical assets such as raw commodities or manufacturing equipment, it's very hard to put a definitive value on the data itself. Whilst the idea that "data is currency" has become something of a catchphrase over the last few years, and blockchain technologies have attempted to weld money and information together algorithmically, the true denomination of this data-currency remains opaque.
Normally, the value of data is not separated from the overall value of a company and its business. The data aspect has generally only been separated out when it has been lost or compromised in some way. For example, in 2017 the consumer credit agency Equifax suffered a data breach involving 143 million users, and the resulting class action could cost the company up to $70 billion. But that's only one way of valuing data, one that revolves around how much losing personal information might damage the user. It says very little about how much that data might be worth to Equifax's business. With 2.5 quintillion bytes of data being produced every day globally, according to the World Economic Forum, getting a handle on the monetary value of information has become essential.
The problem, however, is that the value of data isn't just a factor of quantity or volume, but how it can be used, and that's based on how the data is structured and analysed, amongst other things. Time is even more influential than it is with hardware that depreciates as it becomes obsolete. Not only does data need to be fresh and up to date; it also has to be delivered at exactly the right time. It may even be collected and used in real time. Internet of Things sensor information controlling an environmental system must be delivered and processed in a timely fashion to maintain settings, whilst data about the locations of taxis, customers wanting to travel, and where they want to go is only useful when it's still true. There's no point sending a taxi when the traveller left 20 minutes ago.
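The taxi example above can be sketched in a few lines of code. This is a minimal illustration, not any real dispatch system's logic: the two-minute staleness limit and the `LocationUpdate` structure are assumptions made purely for the example.

```python
import time
from dataclasses import dataclass

# Assumption for illustration: location data older than 2 minutes
# no longer reflects reality and should be discarded.
STALENESS_LIMIT_S = 120

@dataclass
class LocationUpdate:
    taxi_id: str
    lat: float
    lon: float
    timestamp: float  # Unix epoch seconds when the position was reported

def fresh_updates(updates, now=None):
    """Keep only updates recent enough to still be true."""
    now = time.time() if now is None else now
    return [u for u in updates if now - u.timestamp <= STALENESS_LIMIT_S]

# Usage: a 30-second-old position is still useful; a 20-minute-old
# one is worthless, just like the traveller who left 20 minutes ago.
now = 1_700_000_000.0
updates = [
    LocationUpdate("taxi-1", 51.50, -0.12, now - 30),    # kept
    LocationUpdate("taxi-2", 51.51, -0.13, now - 1200),  # dropped
]
print([u.taxi_id for u in fresh_updates(updates, now)])  # → ['taxi-1']
```

The point of the sketch is that the same bytes go from valuable to worthless purely as a function of time, which is what makes delivery and processing speed part of the data's value.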
That data is valuable is obvious, but the method for working out just how valuable is obscure. There have nonetheless been serious attempts to quantify it for insurance purposes, albeit via a very complex process. Gartner analyst Doug Laney has defined six different ways of valuing data. With financial methods of estimation, data can be valued according to how much it would cost to replace, how much it would fetch if sold, or how much it contributes to everyday revenue. With non-financial methods of estimation, data can be valued depending on its intrinsic worth to the company, its usefulness for business purposes, or how it drives other aspects of the business.
Since Laney's system is aimed at the insurance business, it involves complicated mathematics, marketplace valuation, and revenue modelling to provide as much precision as possible. There are complex formulae revolving around the accuracy of the data, how complete the set is, how easy it is to access, and whether it's a scarce resource nobody else has. Relevance is key, as is how much the data contributes to business objectives and key performance indicators.
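To make the idea concrete, here is a toy scoring function in the spirit of the dimensions mentioned above (accuracy, completeness, accessibility, scarcity, relevance). The dimensions are taken from the text, but the weights and the formula itself are illustrative assumptions, not Laney's actual models.

```python
# Illustrative weights only: these are assumptions for the sketch,
# not Gartner's published formulae.
QUALITY_WEIGHTS = {
    "accuracy": 0.30,
    "completeness": 0.25,
    "accessibility": 0.15,
    "scarcity": 0.15,
    "relevance": 0.15,
}

def estimated_data_value(scores, revenue_contribution):
    """Scale an estimated revenue contribution by a composite quality score.

    scores: dict rating each dimension from 0.0 (worthless) to 1.0 (ideal).
    revenue_contribution: estimated annual revenue the data set drives.
    """
    quality = sum(QUALITY_WEIGHTS[d] * scores[d] for d in QUALITY_WEIGHTS)
    return quality * revenue_contribution

# Usage: an accurate, fairly complete data set assumed to drive
# £2m of revenue is discounted by its composite quality score.
scores = {"accuracy": 0.9, "completeness": 0.8, "accessibility": 0.7,
          "scarcity": 0.4, "relevance": 0.85}
print(round(estimated_data_value(scores, 2_000_000)))  # → 1525000
```

Even a crude model like this shows why no single number like "volume of data held" can capture value: the same revenue estimate is sharply discounted when accuracy or relevance is poor.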
This is all quite esoteric, but it highlights the fact that one of the most important underlying factors in data's value comes from the speed and accuracy of its analysis. This is the point where the static ones and zeros of the data itself blend into the hardware infrastructure that stores, delivers, and performs useful operations on that data. The faster data can be delivered and processed, the more relevant it will be to the person accessing it. So, a rapid, reliable hardware platform for your data is an integral part of realising its value.
Technological improvement can enhance the true value of data at every level. The general rule of thumb is that the more data you have, the better, but that leads to massive storage requirements. The cost of solid-state drives (SSDs) continues to fall, making all-Flash storage arrays increasingly affordable. These also provide much faster random access than traditional hard disks, although this can come at the expense of hardware longevity. Recently, however, 3D XPoint memory, as used in Intel Optane SSDs, has arrived boasting even faster transactional speed and much better durability than traditional SSDs. This makes it ideal for keeping data safe and readily accessible, enhancing its value.
You also need to ensure that once data has been loaded onto an application server, it is operated on as quickly as possible. The faster your analytics can be performed, the more valuable that data will be to your business. This necessitates a hardware platform with high core density, high processing power per core, and room to scale when your requirements grow. Intel's Xeon Scalable architecture provides all of these features.
Few would deny that data is becoming one of the most valuable assets many companies have. But on its own, without the hardware platforms to do it justice, data's true value will be reduced or even wasted. There's no point spending a fortune amassing a wealth of information without also spending adequately on the hardware to take full advantage of it. That's the truly wise way to spend the currency you have created with your data.