
Cache Algorithms

A. LRU, LFU and Random
LRU, LFU, and random are three common cache replacement policies.

LRU (Least Recently Used)
LRU stands for Least Recently Used and is one of the most widely used algorithms. It orders objects by how recently they were accessed, keeping the most recently used objects at the front, and when the cache reaches its maximum capacity it evicts the object that has gone unused for the longest time.
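
As an illustration, here is a minimal LRU cache sketch in Python. It uses collections.OrderedDict to keep entries ordered by recency; the class name and capacity parameter are illustrative, not taken from any particular library.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache: evicts the least recently used key when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()  # keys ordered from least to most recently used

    def get(self, key):
        if key not in self.entries:
            return None
        self.entries.move_to_end(key)  # mark as most recently used
        return self.entries[key]

    def put(self, key, value):
        if key in self.entries:
            self.entries.move_to_end(key)
        self.entries[key] = value
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)  # evict the least recently used entry
```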



LFU (Least Frequently Used)
LFU stands for Least Frequently Used and tracks how often each object is accessed. Each object is associated with a counter that records how many times it has been accessed. When the cache reaches its maximum capacity, the object with the lowest counter is evicted.
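
A minimal LFU sketch in Python might look like the following: it keeps a per-key access counter and evicts the key with the smallest count, breaking ties arbitrarily. The class name is illustrative, and the linear scan at eviction keeps the sketch simple rather than being the constant-time variant used in practice.

```python
class LFUCache:
    """Minimal LFU cache: evicts the least frequently used key when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.values = {}
        self.counts = {}  # per-key access counters

    def get(self, key):
        if key not in self.values:
            return None
        self.counts[key] += 1  # record another access
        return self.values[key]

    def put(self, key, value):
        if key not in self.values and len(self.values) >= self.capacity:
            # Evict the key with the smallest access count (ties broken arbitrarily).
            victim = min(self.counts, key=self.counts.get)
            del self.values[victim]
            del self.counts[victim]
        self.values[key] = value
        self.counts[key] = self.counts.get(key, 0) + 1
```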



Random
The random replacement algorithm is the simplest of the three. When the cache reaches its maximum capacity, it selects an object at random and evicts it, so every cache entry has the same probability of being replaced.
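
A random replacement cache can be sketched in a few lines of Python; random.choice picks the victim, so every entry is equally likely to be evicted. Again, the class name and structure are only for illustration.

```python
import random

class RandomCache:
    """Minimal random-replacement cache: evicts a uniformly random key when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = {}

    def get(self, key):
        return self.entries.get(key)

    def put(self, key, value):
        if key not in self.entries and len(self.entries) >= self.capacity:
            victim = random.choice(list(self.entries))  # every entry equally likely
            del self.entries[victim]
        self.entries[key] = value
```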



Each algorithm has its own advantages and disadvantages, and the choice between them should depend on the specific needs of the system.

B. Impact of cache algorithms on cache hit rates
Cache algorithms have a direct impact on cache hit rates. The replacement policy determines which object is evicted when a new object is inserted into a full cache. LRU and LFU tend to improve the chance of cache hits because they retain recently accessed and frequently used data respectively, while random replacement generally yields lower hit rates because eviction ignores access patterns.
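
One way to see this effect is a small simulation, sketched below under simple assumptions: a synthetic Zipf-like request stream where a few hot keys dominate, a small cache, and straightforward LRU and random eviction. The function name, workload shape, and parameters are all illustrative; on skewed workloads like this, LRU usually reports a noticeably higher hit rate than random eviction.

```python
import random
from collections import OrderedDict

def hit_rate(policy, requests, capacity):
    """Replay a request stream against a small cache and return the hit rate."""
    cache = OrderedDict()
    hits = 0
    for key in requests:
        if key in cache:
            hits += 1
            if policy == "lru":
                cache.move_to_end(key)  # refresh recency on a hit
        else:
            if len(cache) >= capacity:
                if policy == "lru":
                    cache.popitem(last=False)              # evict least recently used
                else:
                    del cache[random.choice(list(cache))]  # evict a random entry
            cache[key] = True
    return hits / len(requests)

# Skewed workload: a few hot keys are requested far more often than the rest.
random.seed(0)
keys = list(range(100))
weights = [1 / (i + 1) for i in keys]  # Zipf-like popularity
requests = random.choices(keys, weights=weights, k=10_000)

for policy in ("lru", "random"):
    print(policy, round(hit_rate(policy, requests, capacity=10), 3))
```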


By Ng Wing Hei (56612889)

