
LRU Cache (LeetCode 146): The Complete Guide to Solving It

Alex Wang
Jan 6, 2026
12 min read
LeetCode · LRU Cache · Data Structures · Hash Map · Linked List
LRU Cache is one of the most frequently asked interview questions at FAANG companies. Here's how to solve it step by step with a Hash Map + Doubly Linked List.

LRU Cache (LeetCode #146) is one of the most frequently asked interview questions at Google, Meta, Amazon, and Microsoft. It combines data structure design with practical caching concepts.

This guide covers everything you need to know to solve LRU Cache confidently in interviews.

TL;DR: Quick Solution

Data Structures: Hash Map + Doubly Linked List

Time Complexity: O(1) for both get and put

Key Insight: The hash map gives O(1) lookup. The doubly linked list gives O(1) insertion/deletion at both ends.

The Problem

Design a data structure that follows the constraints of a Least Recently Used (LRU) cache.

Implement the LRUCache class:

  • LRUCache(int capacity) — Initialize with positive capacity
  • int get(int key) — Return value if exists, else -1
  • void put(int key, int value) — Update or insert. Evict LRU if at capacity.

Constraint: Both operations must run in O(1) average time.
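
Before diving in, here is the interface in action. This is a usage sketch; the call sequence is illustrative and uses the LRUCache class built later in this guide:

python
# Usage sketch (assumes the LRUCache class implemented later in this guide)
cache = LRUCache(2)     # capacity = 2
cache.put(1, 1)
cache.put(2, 2)
cache.get(1)            # returns 1; key 1 becomes most recently used
cache.put(3, 3)         # at capacity, so the LRU key (2) is evicted
cache.get(2)            # returns -1; key 2 was evicted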

Why This Problem is Tricky

The challenge is achieving O(1) for both operations:

Operation              | Challenge
-----------------------|-----------------------------------
get                    | Need O(1) key lookup
put (insert)           | Need O(1) insert at the front
put (evict)            | Need O(1) removal of the LRU item
Update "recently used" | Need O(1) move to the front

A single data structure won't work:

  • Hash Map: O(1) lookup, but no ordering
  • Linked List: O(1) insert/delete, but O(n) lookup
  • Array: O(n) for removal

Solution: Combine both!

The Key Insight

Use a Hash Map + Doubly Linked List together:

Structure          | Purpose
-------------------|-----------------------------------------
Hash Map           | Key → Node (O(1) lookup)
Doubly Linked List | Tracks recency order (O(1) move/remove)

How It Works:

  1. Most recently used → Front of list
  2. Least recently used → Back of list
  3. Hash Map → Direct access to any node

Step-by-Step Solution

Step 1: Define the Node

python
class Node:
    def __init__(self, key, val):
        self.key = key
        self.val = val
        self.prev = None
        self.next = None

Each node stores:

  • key — needed for eviction (to remove from hash map)
  • val — the actual value
  • prev, next — doubly linked list pointers

Step 2: Initialize the Cache

python
class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = {}  # key -> Node
        
        # Dummy head and tail (simplifies edge cases)
        self.head = Node(0, 0)
        self.tail = Node(0, 0)
        self.head.next = self.tail
        self.tail.prev = self.head

Why dummy nodes?

  • No null checks needed
  • Insert/remove code is cleaner
  • head.next = most recent node
  • tail.prev = least recent node
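
As a quick illustration, here is a minimal sketch (using the Node class from Step 1) showing that insertion and removal are the same two pointer updates even when the list holds a single real node, with no None checks anywhere:

python
# Minimal sketch: sentinels mean no None checks, even for a one-node list.
head, tail = Node(0, 0), Node(0, 0)
head.next, tail.prev = tail, head

node = Node(1, 1)                  # insert the only real node at the front
head.next, node.prev = node, head
node.next, tail.prev = tail, node

node.prev.next = node.next         # remove it: head.next -> tail
node.next.prev = node.prev         #            tail.prev -> head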

Step 3: Helper Methods

python
def _remove(self, node):
    """Remove node from linked list"""
    prev, nxt = node.prev, node.next
    prev.next = nxt
    nxt.prev = prev

def _insert(self, node):
    """Insert node at front (most recent)"""
    prev, nxt = self.head, self.head.next
    prev.next = node
    nxt.prev = node
    node.prev = prev
    node.next = nxt

Step 4: Implement get()

python
def get(self, key):
    if key in self.cache:
        node = self.cache[key]
        # Move to front (mark as recently used)
        self._remove(node)
        self._insert(node)
        return node.val
    return -1

Logic:

  1. If key exists: move to front, return value
  2. If key doesn't exist: return -1

Step 5: Implement put()

python
def put(self, key, value):
    if key in self.cache:
        # Remove old node
        self._remove(self.cache[key])
    
    # Create and insert new node
    node = Node(key, value)
    self.cache[key] = node
    self._insert(node)
    
    # Evict if over capacity
    if len(self.cache) > self.capacity:
        # Remove LRU (node before tail)
        lru = self.tail.prev
        self._remove(lru)
        del self.cache[lru.key]

Logic:

  1. If key exists: remove old node
  2. Create new node, add to cache and list
  3. If over capacity: evict LRU (tail.prev)

Complete Solution

python
class Node:
    def __init__(self, key, val):
        self.key = key
        self.val = val
        self.prev = None
        self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = {}  # key -> Node
        
        # Dummy head and tail
        self.head = Node(0, 0)
        self.tail = Node(0, 0)
        self.head.next = self.tail
        self.tail.prev = self.head
    
    def _remove(self, node):
        prev, nxt = node.prev, node.next
        prev.next = nxt
        nxt.prev = prev
    
    def _insert(self, node):
        prev, nxt = self.head, self.head.next
        prev.next = node
        nxt.prev = node
        node.prev = prev
        node.next = nxt
    
    def get(self, key):
        if key in self.cache:
            node = self.cache[key]
            self._remove(node)
            self._insert(node)
            return node.val
        return -1
    
    def put(self, key, value):
        if key in self.cache:
            self._remove(self.cache[key])
        
        node = Node(key, value)
        self.cache[key] = node
        self._insert(node)
        
        if len(self.cache) > self.capacity:
            lru = self.tail.prev
            self._remove(lru)
            del self.cache[lru.key]

Complexity Analysis

Operation | Complexity
----------|------------------------------
get       | O(1) time
put       | O(1) time
Overall   | O(1) time, O(capacity) space

Visual Walkthrough

Initial State (capacity = 2):

code
head <-> tail
cache = {}

After put(1, 1):

code
head <-> [1:1] <-> tail
cache = {1: Node(1,1)}

After put(2, 2):

code
head <-> [2:2] <-> [1:1] <-> tail
cache = {1: Node(1,1), 2: Node(2,2)}

After get(1):

code
head <-> [1:1] <-> [2:2] <-> tail  (1 moved to front)

After put(3, 3) — evicts 2:

code
head <-> [3:3] <-> [1:1] <-> tail
cache = {1: Node(1,1), 3: Node(3,3)}  (2 evicted)
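
You can replay this exact sequence against the complete solution above as a quick self-check (a small test sketch, not part of the problem statement):

python
# Replays the walkthrough above against the complete solution.
cache = LRUCache(2)
cache.put(1, 1)                 # head <-> [1:1] <-> tail
cache.put(2, 2)                 # head <-> [2:2] <-> [1:1] <-> tail
assert cache.get(1) == 1        # 1 moves to the front
cache.put(3, 3)                 # evicts key 2 (the LRU entry)
assert cache.get(2) == -1       # 2 is gone
assert cache.get(3) == 3
print("walkthrough verified")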

Common Mistakes

Mistake                         | Fix
--------------------------------|--------------------------------------------
Forgetting to store key in node | The node needs its key for eviction
Using a singly linked list      | Doubly linked is required for O(1) removal
Not updating the cache on put   | Update the value even if the key exists
Off-by-one in capacity check    | Check the size after inserting, then evict

Python Shortcut: OrderedDict

Python's OrderedDict combines hash map + ordering:

python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()
    
    def get(self, key):
        if key in self.cache:
            self.cache.move_to_end(key)
            return self.cache[key]
        return -1
    
    def put(self, key, value):
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)

Note: Know the manual solution first—interviewers may ask you to implement without built-ins.

Interview Tips

When Explaining:

  1. Start with the O(1) requirement
  2. Explain why you need two data structures
  3. Draw the linked list with dummy nodes
  4. Walk through an example

Follow-Up Questions:

  • "How would you handle concurrency?" → Use locks or concurrent data structures
  • "What if capacity is 0?" → Handle as edge case
  • "Can you implement LFU instead?" → LeetCode 460

Why This Problem is Asked

LRU Cache tests:

  • Data structure design — combining structures
  • Time complexity awareness — achieving O(1)
  • Edge case handling — eviction, updates
  • Practical knowledge — caching is everywhere

Real-world uses:

  • Database query caches
  • Web browser caches
  • Operating system page replacement
  • CDN caching

Related Problems

Problem                         | Difficulty | Concept
--------------------------------|------------|----------------------------
LFU Cache (#460)                | Hard       | Frequency-based eviction
Design HashMap (#706)           | Easy       | Hash table basics
Design Linked List (#707)       | Medium     | Linked list implementation
All O'one Data Structure (#432) | Hard       | Max/min tracking

Practice Strategy

  1. Understand the concept — Draw the linked list
  2. Code from scratch — Don't memorize, understand
  3. Test with examples — Walk through operations
  4. Use LeetCopilot — Get hints if stuck

FAQ

Why doubly linked list, not singly?
A singly linked list needs O(n) time to find a node's predecessor before removing it. A doubly linked list stores the prev pointer, so removal is O(1).

Why store key in the node?
When evicting, you need to remove from hash map too. The key is needed for that deletion.

Can I use a deque?
A deque only supports O(1) operations at its two ends; removing an arbitrary element is O(n). The cache needs O(1) removal whenever an existing key is accessed and moved to the front.

Conclusion

LRU Cache is a classic interview problem that tests your ability to combine data structures for optimal performance.

Key takeaways:

  1. Hash Map → O(1) key lookup
  2. Doubly Linked List → O(1) ordering operations
  3. Dummy nodes → Simplify edge cases
  4. Store key in node → Needed for eviction

Practice until you can implement it from scratch in under 15 minutes.

Good luck!

Want to Practice LeetCode Smarter?

LeetCopilot is a free browser extension that enhances your LeetCode practice with AI-powered hints, personalized study notes, and realistic mock interviews — all designed to accelerate your coding interview preparation.
