LinkedHashMap Example: LRU Cache Implementation

An LRU cache can be implemented fairly easily using LinkedHashMap. In this article we examine what an LRU cache is and how to implement one in Java.

What is LRU Cache?

LRU stands for least recently used. Before we examine the LRU policy, let’s take a moment to see what a cache is.

What is a cache?

In computer science a cache is a limited storage facility where items are stored for faster access.

A cache may be implemented in memory or on storage devices. For the purposes of this discussion, we will confine ourselves to in-memory caches. There may be caches in hardware, or you may use a portion of main memory as a cache.

An in-memory cache is like any other memory area, with two differences –

  1. A cache is limited in size; a notion of maximum capacity is always attached to it.
  2. A cache offers some benefit, usually faster access.

Some people refer to the process of storing something in memory as caching. Although it is not technically incorrect, it goes against the common usage of the term.

In computer science a software cache is typically implemented using the table ADT (abstract data type). The hash table is the most popular choice; the second most popular data structure for cache implementation is the array. In Java, a cache is implemented using a Map, or sometimes its first cousin, Set.

What is an LRU Cache?

Since a cache is always limited in size, it is essential to formulate a policy of action when the number of elements in the cache is about to exceed its capacity.

For example, if you have an LRU cache with a capacity of 10 elements, it would be pretty unwise to throw away the 11th (next) element we encounter. After all, the next element is the most recent one, and it implicitly carries recency information (what is trending at the moment).

Accordingly, there are a number of popular cache eviction policies – Least Frequently Used, Most Recently Used and Least Recently Used.

The LRU cache evicts the element that has been least recently used. This caching policy is based on the idea that an element which has not been used for a while is unlikely to be used again soon.

For example, suppose you want to implement a data structure that preserves the browsing history (i.e. the URLs) of your browser. An LRU cache is well suited to this task because people tend to revisit recent websites again and again and forget the websites that they haven’t visited in a while.

Java Implementation of LRU Cache

We can use a LinkedHashMap with access order as an LRU cache. An access-ordered LinkedHashMap keeps its entries ordered by the time they were last accessed. This access-time ordering gives us the least-recently-used part of the LRU policy.
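As a quick illustration (the class and variable names here are my own, not from the original listing), passing accessOrder = true to the three-argument LinkedHashMap constructor switches iteration from insertion order to access order:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class AccessOrderDemo {
    public static void main(String[] args) {
        // The third constructor argument (accessOrder = true) switches the
        // map from insertion order to access order.
        Map<String, Integer> map = new LinkedHashMap<>(16, 0.75f, true);
        map.put("a", 1);
        map.put("b", 2);
        map.put("c", 3);

        map.get("a"); // accessing "a" moves it to the end (most recent)

        System.out.println(map.keySet()); // prints [b, c, a]
    }
}
```

Note that a plain get counts as an access and reorders the map, which is exactly the behavior an LRU cache needs.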

The only other thing left for the LRU cache implementation is to impose a size restriction. We can do this by subclassing LinkedHashMap and overriding the protected method removeEldestEntry. This method is invoked by put and putAll after an entry is inserted, and its boolean return value indicates whether the eldest entry should be removed. The default implementation returns false, because an ordinary Map has no size limit. Since a cache is size restricted, we override this method to return true whenever the size exceeds the capacity.

LRU Cache Java Class

The code below shows the LRU implementation –
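The original listing did not survive extraction, so here is a minimal sketch of such a subclass (the class name and field names are my own choice):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// An LRU cache built on an access-ordered LinkedHashMap.
public class LRUCache<K, V> extends LinkedHashMap<K, V> {

    private final int capacity;

    public LRUCache(int capacity) {
        // accessOrder = true keeps entries ordered by most recent access.
        super(16, 0.75f, true);
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Evict the least recently used entry once capacity is exceeded.
        return size() > capacity;
    }
}
```

Because the eviction check runs inside put, the cache never holds more than capacity entries after any insertion.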

Using the LRU Class

Let’s use our LRU cache to keep a record of websites we visit. For demonstration, we keep the cache size very small (only 2). Therefore, storing a third URL will evict one of the existing URLs from the cache. To illustrate the point, we visit one of the URLs twice so that the other URL is selected for eviction.

The code is shown below –
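Since the original listing is missing, here is one way the scenario might look (the URLs are illustrative; an anonymous subclass of LinkedHashMap inlines the removeEldestEntry override so the snippet is self-contained):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class BrowsingHistoryDemo {
    public static void main(String[] args) {
        final int capacity = 2;
        // Access-ordered LinkedHashMap capped at two entries.
        Map<String, Integer> history =
                new LinkedHashMap<String, Integer>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<String, Integer> eldest) {
                return size() > capacity;
            }
        };

        history.put("https://example.com", 1); // visit first site
        history.put("https://example.org", 1); // visit second site
        history.get("https://example.com");    // revisit the first site
        history.put("https://example.net", 1); // third site triggers eviction

        System.out.println(history.keySet());
        // prints [https://example.com, https://example.net]
    }
}
```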

The output confirms that the least recently used URL – the one we did not revisit – was evicted from the cache.

That’s how simple it is to write an LRU cache in Java!
