What is cache memory?
We explain the different categories of cache memory and how it differs from RAM
Memory is a core element of all computer systems, and there are many different types that work together in order to keep a machine running smoothly. While some memory types are tuned for short-term tasks and others for longer-term retention, all are critical for the operation of software and hardware.
Although the term 'memory' is usually associated with information storage, there are memory components that operate beyond this remit. This includes the encoding and retrieval of data, a critical function of cache memory.
On its own, cache memory is not especially useful; it's only when interacting with other components that it becomes a critical part of any system.
Cache memory holds recently accessed data in place for repeated use, so the processor doesn't have to fetch the same data over and over from slower main memory. The larger the capacity of the cache, the more data it can hold close at hand, and the more often the processor finds what it needs without a slower trip to main memory.
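The principle can be sketched with a simple software analogy: check a small, fast store before doing the expensive work again. Note this is only an illustration of the idea, as real CPU caches are managed in hardware, not application code, and `compute` and its inputs here are hypothetical stand-ins.

```python
# A minimal software analogy of cache behaviour (illustrative only:
# real CPU caches are managed in hardware, not in application code).

cache = {}

def compute(x):
    # Stand-in for an expensive operation the processor would
    # otherwise have to repeat on every access.
    return x * x

def cached_compute(x):
    if x in cache:           # cache hit: reuse the stored result
        return cache[x]
    result = compute(x)      # cache miss: do the work once...
    cache[x] = result        # ...and keep it for repeated use
    return result

cached_compute(12)  # miss: computed and stored
cached_compute(12)  # hit: returned straight from the cache
```

The second call never touches `compute` at all, which is exactly the saving a hardware cache provides on a much smaller timescale.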
Cache memory vs RAM
You might be thinking that the storage of temporary data for operations sounds a lot like random-access memory (RAM). While the two are similar in principle, there are some notable differences.
The first is purpose: cache memory stores data that's likely to be requested again, so those requests can be served straight away. RAM, however, holds application and operating system data that's currently in use.
The second difference is speed. Cache memory sits much closer to the central processing unit (CPU) compared to RAM, which is important as it means cache memory is generally much faster to access.
Also, cache memory only stores commonly used data that the CPU relies on for future operations. This means the maximum size of cache memory is generally much smaller than RAM – with cache measured in kilobytes or megabytes rather than gigabytes.
Cache memory types
Cache memory can be complicated: not only is it different from the standard DRAM that most people are familiar with, but there are also several different kinds of cache memory.
Cache memory generally operates in one of three configurations: direct mapping, fully associative mapping, and set associative mapping.
Direct mapping features blocks of memory mapped to specific locations within the cache, while fully associative mapping lets any cache location be used to map a block, rather than requiring the location to be pre-set. Set associative mapping acts as a halfway-house between the two, in that every block is mapped to a smaller subset of locations within the cache.
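The three schemes above differ only in how many cache locations a given memory block is allowed to occupy. A rough sketch, using arbitrary assumed sizes (8 cache lines, 2-way sets) purely for illustration:

```python
# Which cache lines can hold a given memory block under each scheme?
# Sizes are arbitrary assumptions for illustration.
NUM_LINES = 8                  # total cache lines
WAYS = 2                       # associativity for the set-associative case
NUM_SETS = NUM_LINES // WAYS   # 4 sets of 2 lines each

def direct_mapped_line(block):
    # Direct mapping: each block can live in exactly one line.
    return block % NUM_LINES

def set_associative_lines(block):
    # Set associative: the block maps to one set, and may occupy
    # any of the lines (ways) within that set.
    s = block % NUM_SETS
    return [s * WAYS + w for w in range(WAYS)]

def fully_associative_lines(block):
    # Fully associative: any line can hold any block.
    return list(range(NUM_LINES))

# Memory block 13: one fixed line, a small set of lines, or anywhere.
direct_mapped_line(13)      # one candidate line
set_associative_lines(13)   # a small set of candidate lines
fully_associative_lines(13) # every line is a candidate
```

This is why set associative mapping is a halfway house: it keeps lookups cheap (only a few lines to check) while avoiding the conflicts that arise when two hot blocks compete for the same single direct-mapped line.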
Cache memory grading
There are three different categories, graded in levels: L1, L2 and L3. L1 cache is generally built into the processor chip and is the smallest in size, ranging from 8KB to 64KB. However, it's also the fastest type of memory for the CPU to read. Multi-core CPUs will generally have a separate L1 cache for each core.
L2 and L3 caches are larger than L1, but take longer to access. L2 cache is typically built into the processor on modern chips, though on older systems it was sometimes a separate chip sitting between the CPU and the RAM.
Graphics processing units (GPUs) often have a separate cache memory to the CPU, which ensures that the GPU can still speedily complete complex rendering operations without relying on the relatively high-latency system RAM.