LFU Cache: The Historian
The Story: The Hall of Fame
Imagine a Hall of Fame that can only hold 5 statues.
- Who gets a statue? Not the person who did something cool yesterday (that's LRU).
- The statue goes to the person who has done cool things the most times in history.
If Michael Jordan (Frequency: 10,000) hasn't played a game in 10 years, he stays in the Hall. If a rookie (Frequency: 1) just scored 50 points last night, he still doesn't get in.
This is LFU (Least Frequently Used). It protects the "Classics." It assumes that if a piece of data has been requested 500 times, it is fundamentally important, even if no one asked for it in the last hour.
Why do we need it?
- Static Assets: The Google Logo, the CSS reset file, the "Home" icon. These are accessed constantly. LRU might accidentally evict them during a "Scan" (a one-time burst of traffic). LFU protects them.
- CDN (Content Delivery Networks): Keeping the most popular videos cached at the edge, regardless of the time of day.
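The "scan" scenario can be made concrete with a small side-by-side sketch. Both caches below are toy implementations written just for this comparison (the class names and the access trace are invented for illustration, not production designs):

```python
from collections import OrderedDict

# Tiny LRU (capacity 3): evicts the least recently touched key.
class LRU:
    def __init__(self, cap):
        self.cap, self.d = cap, OrderedDict()
    def access(self, key):
        if key in self.d:
            self.d.move_to_end(key)          # refresh recency
        else:
            if len(self.d) >= self.cap:
                self.d.popitem(last=False)   # drop oldest
            self.d[key] = True

# Tiny LFU (capacity 3): evicts the least frequently accessed key,
# breaking ties by oldest access (naive O(n) scan, for clarity only).
class LFU:
    def __init__(self, cap):
        self.cap, self.freq, self.time, self.clock = cap, {}, {}, 0
    def access(self, key):
        self.clock += 1
        if key not in self.freq and len(self.freq) >= self.cap:
            victim = min(self.freq, key=lambda k: (self.freq[k], self.time[k]))
            del self.freq[victim]; del self.time[victim]
        self.freq[key] = self.freq.get(key, 0) + 1
        self.time[key] = self.clock

lru, lfu = LRU(3), LFU(3)
# "logo" is hot, then a one-time scan of cold keys sweeps through.
trace = ["logo"] * 10 + ["scan1", "scan2", "scan3"]
for k in trace:
    lru.access(k)
    lfu.access(k)

print("logo" in lru.d)     # False: the scan evicted the hot item
print("logo" in lfu.freq)  # True: frequency 10 protected it
```

The scan pushes three cold keys through both caches: LRU forgets the logo because recency is all it knows, while LFU's counter keeps the logo safe.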
How the Algorithm "Thinks"
Implementing LFU in O(1) time is notoriously difficult. You need to:
- Track the frequency of every key.
- When the cache is full, find the key with the minimum frequency.
- If there's a tie (multiple keys with frequency 1), evict the least recently used among them.
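Before building the real structure, these rules can be seen in a toy, non-O(1) form (the entries below are made-up examples; a real cache would not scan every key on eviction):

```python
# Naive LFU victim selection: an O(n) scan, purely to illustrate the rules.
# Each entry tracks how often it was used and when it was last touched.
cache = {
    "logo.png":  {"freq": 500, "last_used": 90},
    "news.html": {"freq": 1,   "last_used": 99},  # tied freq, used recently
    "old.css":   {"freq": 1,   "last_used": 10},  # tied freq, used long ago
}

def pick_victim(cache):
    # Lowest frequency first; ties broken by least-recent use.
    return min(cache, key=lambda k: (cache[k]["freq"], cache[k]["last_used"]))

print(pick_victim(cache))  # old.css
```

`news.html` and `old.css` are tied at frequency 1, so recency decides: `old.css` goes. The structure below exists precisely to answer this question without the scan.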
The solution requires a complex structure: a Map of Doubly Linked Lists.
- Map<Frequency, List<Node>>: "List 1" holds all items with freq 1; "List 5" holds all items with freq 5.
- min_freq: a pointer to the lowest non-empty frequency list.
Access (Get):
- Find the node; its freq becomes freq + 1.
- Remove the node from List[freq].
- Add the node to List[freq + 1].
- Update min_freq if List[min_freq] is now empty.
Insert (Put):
- If the cache is full: go to List[min_freq] and remove the LRU node from that specific list first.
- Then add the new node to List[1], and min_freq becomes 1.
Engineering Context: The Ghost of the Past
LFU has a fatal flaw: Memory Pollution.
- The Problem: Imagine a news article about the "Olympics 2020." It gets 1 million hits in two weeks. Its frequency is huge.
- The Aftermath: Three years later, nobody cares. But because its frequency is 1 million, it sits in the cache forever, blocking new content.
- The Fix: Modern LFU (like Window-TinyLFU) uses "Aging" or "Decay." Every hour, divide all frequencies by 2. This forces old heroes to eventually retire.
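A minimal sketch of the aging idea (the halving schedule and plain-dict counters here are illustrative assumptions, not Window-TinyLFU itself):

```python
# Toy "aging" pass: periodically halve every frequency counter so that
# stale heavyweights lose their advantage unless they keep getting hits.
freqs = {"olympics-2020": 1_000_000, "todays-news": 40}

def decay(freqs):
    for key in freqs:
        freqs[key] //= 2  # integer halving; idle counters drift toward 0

for _ in range(10):  # ten aging cycles
    decay(freqs)

print(freqs)  # {'olympics-2020': 976, 'todays-news': 0}
```

After ten halvings the million-hit article has shrunk by three orders of magnitude; a few more cycles without fresh hits and it is evictable like anything else.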
Implementation (Python)
This is a simplified implementation using the "Map of Lists" strategy.
```python
import collections

class Node:
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.freq = 1

class LFUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.min_freq = 0
        self.key_to_node = {}  # Map<Key, Node>
        self.freq_to_nodes = collections.defaultdict(collections.OrderedDict)

    def get(self, key: int) -> int:
        if key not in self.key_to_node:
            return -1
        node = self.key_to_node[key]
        # 1. Promote frequency
        val = node.value
        self._update_freq(node)
        return val

    def _update_freq(self, node):
        # Remove from current freq list
        freq = node.freq
        del self.freq_to_nodes[freq][node.key]
        # If this list is empty and it was the min, increment min
        if not self.freq_to_nodes[freq] and self.min_freq == freq:
            self.min_freq += 1
        # Add to next freq list
        node.freq += 1
        self.freq_to_nodes[node.freq][node.key] = node

    def put(self, key: int, value: int) -> None:
        if self.capacity == 0:
            return
        if key in self.key_to_node:
            node = self.key_to_node[key]
            node.value = value
            self._update_freq(node)
            return
        if len(self.key_to_node) >= self.capacity:
            # Evict from the min_freq list.
            # popitem(last=False) pops the first item, which is the
            # least recently used among the frequency ties.
            k, v = self.freq_to_nodes[self.min_freq].popitem(last=False)
            del self.key_to_node[k]
        # Add new node
        new_node = Node(key, value)
        self.key_to_node[key] = new_node
        self.freq_to_nodes[1][key] = new_node
        self.min_freq = 1
```

Summary
LFU teaches us that loyalty matters. By valuing long-term popularity over short-term trends, it creates a stable environment for the "Classics." But it also warns us: never let the past weigh down the future. Even legends must eventually fade to make room for the new generation.
