Introduction
In modern computing, caching is one of the most important techniques to make applications and systems faster. A cache stores frequently accessed data so future requests can be served quickly without recomputation. Among different caching strategies, the LRU Cache (Least Recently Used Cache) is one of the most popular. It removes the least recently used item whenever the cache reaches maximum capacity.
This article explains how to implement an LRU Cache with O(1) operations (get and put) using detailed diagrams. It covers both the theory and step-by-step examples, so even beginners can understand how it works.
What is an LRU Cache?
An LRU (Least Recently Used) Cache stores only a limited number of items. When it's full and a new item must be added, it removes the item that has not been used for the longest time.
Example: Imagine a fridge that can only store 3 items. If it's already full and you add something new, the food you haven't touched for the longest time gets removed.
This ensures that the most useful and fresh data always stays in memory.
Key Operations in an LRU Cache
get(key): returns the value if the key exists, otherwise returns -1.
put(key, value): inserts or updates a key-value pair. If the cache is full, it removes the least recently used item.
The main challenge: both operations must run in O(1) constant time.
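Before building the data structures by hand, the target behavior can be sketched with Python's built-in collections.OrderedDict, which already offers O(1) move_to_end and popitem. This is only an illustrative shortcut (the class name SimpleLRU is made up here), not the doubly-linked-list implementation developed below:

```python
from collections import OrderedDict

class SimpleLRU:
    """Minimal LRU sketch on top of OrderedDict (keys ordered oldest -> newest)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return -1
        self.data.move_to_end(key)  # mark key as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict the least recently used key
```

This hides the linked-list bookkeeping inside OrderedDict; the rest of the article shows how to build the same behavior explicitly.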
Data Structures Needed
To achieve O(1) time for both get and put, we use:
HashMap (dictionary in Python): for quick O(1) lookups to check whether an item exists.
Doubly Linked List: to maintain the usage order (most recently used at the front, least recently used at the back).
By combining these:
The HashMap stores references to nodes in the Doubly Linked List.
The list tracks the order of usage, so we always know which item is least recently used.
How the LRU Cache Works
1. Get Operation (get(key))
If the key exists: remove its node from the list, re-insert it at the front (marking it as most recently used), and return its value.
If not found: return -1.
Complexity: O(1).
2. Put Operation (put(key, value))
If the key already exists: remove its old node from the list.
Create a new node, insert it at the front, and record it in the HashMap.
If the cache now exceeds capacity: remove the node just before the tail (the least recently used one) and delete its key from the HashMap.
Complexity: O(1).
Example Implementation in Python
```python
class Node:
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = {}  # HashMap: key -> Node
        # Dummy head and tail for the Doubly Linked List
        self.head = Node(0, 0)
        self.tail = Node(0, 0)
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        # Unlink a node from the list in O(1)
        prev, nxt = node.prev, node.next
        prev.next, nxt.prev = nxt, prev

    def _add(self, node):
        # Insert a node right after the head (most recently used position)
        node.prev = self.head
        node.next = self.head.next
        self.head.next.prev = node
        self.head.next = node

    def get(self, key: int) -> int:
        if key in self.cache:
            node = self.cache[key]
            self._remove(node)
            self._add(node)  # move to front: now most recently used
            return node.value
        return -1

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self._remove(self.cache[key])
        node = Node(key, value)
        self._add(node)
        self.cache[key] = node
        if len(self.cache) > self.capacity:
            # Evict the least recently used node (just before the tail)
            lru = self.tail.prev
            self._remove(lru)
            del self.cache[lru.key]
```
Step-by-Step Example with Diagrams (Capacity = 3)
Initial State
[H] <-> [T]
HashMap: {}
Step 1: put(1, 'A')
[H] <-> [1:A] <-> [T]
HashMap: {1}
Step 2: put(2, 'B')
[H] <-> [2:B] <-> [1:A] <-> [T]
HashMap: {1, 2}
Step 3: put(3, 'C')
[H] <-> [3:C] <-> [2:B] <-> [1:A] <-> [T]
HashMap: {1, 2, 3}
Step 4: get(2) → returns 'B', and 2 moves to the front
[H] <-> [2:B] <-> [3:C] <-> [1:A] <-> [T]
HashMap: {1, 2, 3}
Step 5: put(4, 'D') → key 1 (the least recently used) is evicted
[H] <-> [4:D] <-> [2:B] <-> [3:C] <-> [T]
HashMap: {2, 3, 4}
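The trace above can be replayed programmatically. The snippet below is a self-contained sketch that uses collections.OrderedDict as a compact stand-in for the HashMap + Doubly Linked List (here the most recently used key sits at the rightmost end), and checks the outcomes of Steps 4 and 5:

```python
from collections import OrderedDict

CAPACITY = 3
cache = OrderedDict()  # key -> value, ordered least -> most recently used

def put(key, value):
    if key in cache:
        cache.move_to_end(key)
    cache[key] = value
    if len(cache) > CAPACITY:
        cache.popitem(last=False)  # drop the least recently used entry

def get(key):
    if key not in cache:
        return -1
    cache.move_to_end(key)  # reading a key makes it most recently used
    return cache[key]

put(1, 'A'); put(2, 'B'); put(3, 'C')
assert get(2) == 'B'             # Step 4: 2 becomes most recently used
put(4, 'D')                      # Step 5: key 1 (the LRU entry) is evicted
assert sorted(cache) == [2, 3, 4]
assert get(1) == -1
```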
Real-World Applications of LRU Cache
Operating Systems: managing memory pages.
Web Browsers: storing recently visited pages.
Mobile Apps: keeping frequently accessed data.
Databases: improving query performance.
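Python itself ships an LRU cache for memoizing function calls, functools.lru_cache. As a quick illustration (the fib function is just an example workload):

```python
from functools import lru_cache

@lru_cache(maxsize=128)  # keep results for the 128 most recent argument tuples
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

fib(30)                  # runs fast because subresults are cached
info = fib.cache_info()  # hits/misses/currsize statistics
```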
Summary
The LRU Cache is a powerful technique that ensures only the most recently used items remain in memory. By combining a HashMap and a Doubly Linked List, we can achieve O(1) time complexity for both get and put. This makes LRU Caches widely used in system design, operating systems, databases, mobile apps, and web browsers.