Algorithms in C#  

How to Implement an LRU Cache with O(1) Operations

🌟 Introduction

In modern computing, caching is one of the most important techniques for making applications and systems faster. A cache stores frequently accessed data so that future requests can be served quickly without recomputation. Among the different caching strategies, the LRU Cache (Least Recently Used Cache) is one of the most popular: it evicts the least recently used item whenever the cache reaches its maximum capacity.

This article explains how to implement an LRU Cache with O(1) operations (get and put) using detailed diagrams. It covers both the theory and step-by-step examples, so even beginners can understand how it works.

🧩 What is an LRU Cache?

An LRU (Least Recently Used) Cache stores only a limited number of items. When it’s full and a new item must be added, it removes the item that has not been used for the longest time.

  • Example: Imagine a fridge that can only store 3 items. If it’s already full and you add something new, the food you haven’t touched for the longest time gets removed.

  • This ensures that the most useful and fresh data always stays in memory.

🔑 Key Operations in LRU Cache

  1. get(key) → Returns the value if the key exists, otherwise returns -1.

  2. put(key, value) → Inserts or updates a key-value pair. If the cache is full, it removes the least recently used item.

👉 The main challenge: both operations must run in O(1) constant time.

βš™οΈ Data Structures Needed

To achieve O(1) time for both get and put, we use:

  1. HashMap (Dictionary in Python) β†’ For quick O(1) lookups to check if an item exists.

  2. Doubly Linked List β†’ To maintain the usage order (most recently used at the front, least recently used at the back).

By combining these:

  • The HashMap stores references to nodes in the Doubly Linked List.

  • The list tracks the order of usage, so we always know which item is least recently used.
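
This pairing can be illustrated with a minimal sketch (the `Node` and `index` names here are illustrative, not part of the final implementation): the dictionary maps each key to its list node, and because a node carries its own prev/next pointers, it can be found and unlinked in O(1) without scanning the list:

```python
class Node:
    def __init__(self, key, value):
        self.key, self.value = key, value
        self.prev = self.next = None

# Build a tiny two-node list: a <-> b
a, b = Node(1, 'A'), Node(2, 'B')
a.next, b.prev = b, a

# The HashMap maps each key to its node, so lookup is O(1)...
index = {1: a, 2: b}
node = index[2]
print(node.value)  # prints B

# ...and the node's own pointers let us splice it out in O(1).
# (The guards disappear once dummy head/tail nodes are used, as below.)
if node.prev:
    node.prev.next = node.next
if node.next:
    node.next.prev = node.prev
```

This is exactly why the full implementation stores nodes (not raw values) in the dictionary: the dictionary answers "where is the key?" and the linked list answers "how recently was it used?", each in constant time.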

πŸ—οΈ How the LRU Cache Works

1. Get Operation (get(key))

  • If the key exists:

    • Move the node to the front of the list (mark as most recently used).

    • Return the value.

  • If not found: return -1.

✅ Complexity: O(1).

2. Put Operation (put(key, value))

  • If the key exists:

    • Update the value.

    • Move the node to the front.

  • If the key does not exist:

    • Create a new node.

    • Insert it at the front.

    • Add to HashMap.

    • If the cache is full, remove the last node (least recently used) from both the list and the HashMap.

✅ Complexity: O(1).

🖥️ Example Implementation in Python

class Node:
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = {}  # HashMap

        # Dummy head and tail for Doubly Linked List
        self.head = Node(0, 0)
        self.tail = Node(0, 0)
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        # Unlink node from the list in O(1) using its own prev/next pointers.
        prev, nxt = node.prev, node.next
        prev.next, nxt.prev = nxt, prev

    def _add(self, node):
        # Insert node right after the dummy head (most recently used position).
        node.prev = self.head
        node.next = self.head.next
        self.head.next.prev = node
        self.head.next = node

    def get(self, key: int) -> int:
        if key in self.cache:
            node = self.cache[key]
            self._remove(node)
            self._add(node)
            return node.value
        return -1

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self._remove(self.cache[key])
        node = Node(key, value)
        self._add(node)
        self.cache[key] = node

        if len(self.cache) > self.capacity:
            # Evict the least recently used node: the one just before the tail.
            lru = self.tail.prev
            self._remove(lru)
            del self.cache[lru.key]

🧭 Step-by-Step Example with Diagrams (Capacity = 3)

Initial State

[H] <-> [T]
HashMap: {}

Step 1 - put(1, 'A')

[H] <-> [1:A] <-> [T]
HashMap: {1}

Step 2 - put(2, 'B')

[H] <-> [2:B] <-> [1:A] <-> [T]
HashMap: {1,2}

Step 3 - put(3, 'C')

[H] <-> [3:C] <-> [2:B] <-> [1:A] <-> [T]
HashMap: {1,2,3}

Step 4 - get(2)

  • Move 2 to the front (most recently used).

[H] <-> [2:B] <-> [3:C] <-> [1:A] <-> [T]
HashMap: {1,2,3}

Step 5 - put(4, 'D')

  • Cache is full → Evict least recently used (key 1).

[H] <-> [4:D] <-> [2:B] <-> [3:C] <-> [T]
HashMap: {2,3,4}
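
The five steps above can be replayed and checked in code. As a compact, self-contained stand-in for the hand-written class (the `OrderedLRU` name is hypothetical), this sketch uses Python's collections.OrderedDict, which internally pairs a hash table with a doubly linked list, so it gives the same O(1) behavior:

```python
from collections import OrderedDict

class OrderedLRU:
    """Stand-in with the same behavior as LRUCache above.
    OrderedDict keeps keys oldest -> newest."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()

    def get(self, key):
        if key not in self.cache:
            return -1
        self.cache.move_to_end(key)          # mark as most recently used
        return self.cache[key]

    def put(self, key, value):
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)   # evict least recently used

lru = OrderedLRU(3)
lru.put(1, 'A'); lru.put(2, 'B'); lru.put(3, 'C')   # Steps 1-3
assert lru.get(2) == 'B'                            # Step 4: key 2 becomes MRU
lru.put(4, 'D')                                     # Step 5: evicts key 1
assert lru.get(1) == -1                             # key 1 is gone
assert list(lru.cache) == [3, 2, 4]                 # LRU -> MRU, matching the diagram
```

The final ordering [3, 2, 4] (least to most recently used) matches the last diagram read right to left. For memoizing pure functions, the standard library also offers functools.lru_cache as a ready-made decorator.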

🌍 Real-World Applications of LRU Cache

  • 🖥️ Operating Systems: Managing memory pages.

  • 🌐 Web Browsers: Storing recently visited pages.

  • 📱 Mobile Apps: Keeping frequently accessed data.

  • 🚀 Databases: Improving query performance.

✅ Summary

The LRU Cache is a powerful technique that ensures only the most recently used items remain in memory. By combining a HashMap and a Doubly Linked List, we can achieve O(1) time complexity for both get and put. This makes LRU Caches widely used in system design, operating systems, databases, mobile apps, and web browsers.