Introduction
In this article we will learn what an LRU cache is and how to implement one in Python. LRU stands for Least Recently Used. An LRU cache is a powerful data structure that maintains a limited set of items, automatically evicting the least recently accessed one when it runs out of room. It is a perfect fit for scenarios where you want to keep frequently accessed data readily available.
Implementation of an LRU (Least Recently Used) Cache
1. Node Class
```python
class Node:
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None
```
This defines a Node for our doubly linked list.
- Stores a key-value pair
- Has references to previous and next nodes
- These connections allow us to quickly remove and reposition nodes
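As a quick standalone illustration (using example keys and values), two nodes can be linked in both directions:

```python
class Node:
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

# Link two nodes in both directions
a = Node(1, "first")
b = Node(2, "second")
a.next = b
b.prev = a

print(a.next.value)  # second
print(b.prev.value)  # first
```

Because each node knows both neighbors, it can later be unlinked in O(1) time without walking the list.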
2. LRUCache Initialization
```python
def __init__(self, capacity):
    self.capacity = capacity
    self.cache = {}
    # Create dummy head and tail nodes for initialization
    self.head = Node(0, 0)
    self.tail = Node(0, 0)
    self.head.next = self.tail
    self.tail.prev = self.head
```
- Sets the maximum capacity of the cache
- Creates an empty dictionary to map keys to nodes for O(1) search
- Creates dummy head and tail nodes
- Links these dummy nodes together to form an empty list structure
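A small standalone sketch (combining the `Node` class with this constructor) confirms the empty-list invariant: the dummy head and tail point at each other, so every real node inserted later always has a non-None `prev` and `next`:

```python
class Node:
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = {}
        # Dummy head and tail delimit the list, so real nodes
        # always have a prev and a next -- no None checks needed
        self.head = Node(0, 0)
        self.tail = Node(0, 0)
        self.head.next = self.tail
        self.tail.prev = self.head

cache = LRUCache(2)
print(cache.head.next is cache.tail)  # True: the list starts empty
print(cache.tail.prev is cache.head)  # True
```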
3. Node Management Methods
Add Node
```python
def _add_node(self, node):
    # Add node right after head
    node.prev = self.head
    node.next = self.head.next
    self.head.next.prev = node
    self.head.next = node
```
- Adds a new node right after the head (most recently used position)
- Updates connections in a specific order to avoid breaking the list
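To see the four pointer updates in isolation, here is a standalone sketch written as a free function with illustrative values (not the article's class method itself):

```python
class Node:
    def __init__(self, key, value):
        self.key, self.value = key, value
        self.prev = self.next = None

def add_node(head, node):
    # Splice node in right after head, updating all four pointers
    node.prev = head
    node.next = head.next
    head.next.prev = node
    head.next = node

head, tail = Node(0, 0), Node(0, 0)
head.next, tail.prev = tail, head

add_node(head, Node(1, "a"))
add_node(head, Node(2, "b"))
# The most recently added node sits right after head
print(head.next.value)       # b
print(head.next.next.value)  # a
```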
Remove Node
```python
def _remove_node(self, node):
    # Remove an existing node from the linked list
    prev = node.prev
    new = node.next
    prev.next = new
    new.prev = prev
```
- Removes a node by updating the connections of its neighbors
- The node itself becomes disconnected from the list
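The same unlinking step, sketched standalone as a free function on a three-node list (illustrative values):

```python
class Node:
    def __init__(self, key, value):
        self.key, self.value = key, value
        self.prev = self.next = None

def remove_node(node):
    # Bypass node by linking its two neighbors to each other
    node.prev.next = node.next
    node.next.prev = node.prev

# Build head <-> x <-> tail, then remove x
head, x, tail = Node(0, 0), Node(1, "x"), Node(0, 0)
head.next, x.prev = x, head
x.next, tail.prev = tail, x

remove_node(x)
print(head.next is tail)  # True: x is no longer in the list
```

No traversal is needed: removal touches only the node's two neighbors, which is what makes it O(1).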
Move Node
```python
def _move_to_head(self, node):
    # Move a node to the head
    self._remove_node(node)
    self._add_node(node)
```
- Combines removal and addition to move an existing node to the head
- Used when an item is accessed to mark it as most recently used
```python
def _pop_tail(self):
    # Pop the last node
    res = self.tail.prev
    self._remove_node(res)
    return res
```
- Removes and returns the node right before the tail (least recently used)
- Used when we need to evict an item to make room
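Eviction can be sketched standalone the same way: on a list `head <-> a <-> b <-> tail`, the node just before the dummy tail is the least recently used one (illustrative values, free-function form):

```python
class Node:
    def __init__(self, key, value):
        self.key, self.value = key, value
        self.prev = self.next = None

def remove_node(node):
    node.prev.next = node.next
    node.next.prev = node.prev

def pop_tail(tail):
    # The node just before the dummy tail is the least recently used
    lru = tail.prev
    remove_node(lru)
    return lru

# Build head <-> a <-> b <-> tail; b is least recently used
head, a, b, tail = Node(0, 0), Node(1, "a"), Node(2, "b"), Node(0, 0)
head.next, a.prev = a, head
a.next, b.prev = b, a
b.next, tail.prev = tail, b

evicted = pop_tail(tail)
print(evicted.value)    # b
print(tail.prev.value)  # a
```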
4. Cache Operations
```python
def get(self, key):
    node = self.cache.get(key, None)
    if not node:
        return -1
    # Move the accessed node to the head
    self._move_to_head(node)
    return node.value
```
- Looks up the key in our dictionary in O(1) time
- If found, moves the node to the head (marks as recently used), which is the most important part of the algorithm
- Returns the value or -1 if not found
```python
def put(self, key, value):
    node = self.cache.get(key)
    if not node:
        newNode = Node(key, value)
        self.cache[key] = newNode
        self._add_node(newNode)
        if len(self.cache) > self.capacity:
            # Pop the tail
            tail = self._pop_tail()
            del self.cache[tail.key]
    else:
        # Update the value
        node.value = value
        self._move_to_head(node)
```
- If the key exists, updates its value and moves it to the head
- If the key is new:
- Creates a new node
- Adds it to the dictionary
- Adds it to the front of the list
- If over capacity, remove the least recently used item (from the tail)
- Also removes this evicted item from the dictionary
Complete Example of the LRU Cache
```python
class Node:
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = {}
        # Create dummy head and tail nodes
        self.head = Node(0, 0)
        self.tail = Node(0, 0)
        self.head.next = self.tail
        self.tail.prev = self.head

    def _add_node(self, node):
        # Add node right after head
        node.prev = self.head
        node.next = self.head.next
        self.head.next.prev = node
        self.head.next = node

    def _remove_node(self, node):
        # Remove an existing node from the linked list
        prev = node.prev
        new = node.next
        prev.next = new
        new.prev = prev

    def _move_to_head(self, node):
        # Move a node to the head
        self._remove_node(node)
        self._add_node(node)

    def _pop_tail(self):
        # Pop the last node
        res = self.tail.prev
        self._remove_node(res)
        return res

    def get(self, key):
        node = self.cache.get(key, None)
        if not node:
            return -1
        # Move the accessed node to the head
        self._move_to_head(node)
        return node.value

    def put(self, key, value):
        node = self.cache.get(key)
        if not node:
            newNode = Node(key, value)
            self.cache[key] = newNode
            self._add_node(newNode)
            if len(self.cache) > self.capacity:
                # Pop the tail
                tail = self._pop_tail()
                del self.cache[tail.key]
        else:
            # Update the value
            node.value = value
            self._move_to_head(node)

# Create a cache with capacity of 2
cache = LRUCache(2)
cache.put(1, 1)       # Cache: {1=1}
cache.put(2, 2)       # Cache: {1=1, 2=2}
print(cache.get(1))   # Returns 1, Cache: {2=2, 1=1}
cache.put(3, 3)       # Evicts key 2, Cache: {1=1, 3=3}
print(cache.get(2))   # Returns -1 (not found)
cache.put(4, 4)       # Evicts key 1, Cache: {3=3, 4=4}
print(cache.get(1))   # Returns -1 (not found)
print(cache.get(3))   # Returns 3, Cache: {4=4, 3=3}
print(cache.get(4))   # Returns 4, Cache: {3=3, 4=4}
```
Output
1
-1
-1
3
4
Uses of LRU Cache
- Database query caching
- Web server response caching
- Browser history management
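For everyday caching needs in Python you rarely have to hand-roll this structure: the standard library provides `functools.lru_cache`, a decorator that memoizes a function with exactly this eviction policy. A minimal sketch (the squaring function is just a stand-in for an expensive computation):

```python
from functools import lru_cache

@lru_cache(maxsize=2)
def expensive(n):
    # Stand-in for an expensive computation
    return n * n

expensive(1)  # miss
expensive(2)  # miss
expensive(1)  # hit: 1 is still cached
expensive(3)  # miss: evicts 2, the least recently used entry
print(expensive.cache_info())  # CacheInfo(hits=1, misses=3, maxsize=2, currsize=2)
```

The hand-written class is still worth knowing: it is a common interview question, and it is the right tool when you need to cache arbitrary key-value pairs rather than function results.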
Summary
By implementing an LRU cache, you can significantly improve the performance of applications that repeatedly access expensive-to-compute data.