The O(n) Club: LRU Cache—Because Algorithms Have Bouncers

⚡ TL;DR

Stop letting your cache become a digital landfill. The LRU (Least Recently Used) Cache keeps only the freshest, hottest keys in memory. Anything collecting dust? Evicted. Brute-force folks rely on a HashMap and an O(n) eviction search (bless their hearts). But you, superstar, need a combo: a HashMap for O(1) lookup (key to node) plus a doubly-linked list for boss-level, O(1) reordering. Demo time:

// Cache holds at most n entries
LRUCache cache = new LRUCache(n);
cache.put(key, value);       // insert or update; counts as a "use"
int value = cache.get(key);  // returns -1 if absent; also counts as a "use"

🧠 How Devs Usually Mess This Up

If you thought “HashMap is all I need”—surprise, you’ve invented slow eviction. HashMap knows what’s in the club but not who was last seen on the dance floor. So every eviction is a panicked O(n) search. Try a LinkedList for order? Prep for O(n) removal sadness. And don’t get me started on ignoring that both get() and put() mean “recent use.” If your code only bumps on writes… that eviction logic is going to be as detached as your last broken CI build.
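To make the pain concrete, here's a minimal sketch (my illustration, not from the article) of the HashMap-plus-timestamps approach. Lookups are O(1), but every eviction scans all entries to find the oldest:

```java
import java.util.HashMap;
import java.util.Map;

// Naive LRU: values plus a "last used" timestamp per key.
// get/put are O(1) — until eviction, which is an O(n) scan.
class NaiveCache {
    private final int capacity;
    private final Map<Integer, Integer> values = new HashMap<>();
    private final Map<Integer, Long> lastUsed = new HashMap<>();
    private long clock = 0;

    NaiveCache(int capacity) { this.capacity = capacity; }

    int get(int key) {
        if (!values.containsKey(key)) return -1;
        lastUsed.put(key, clock++);  // reads count as "recent use" too!
        return values.get(key);
    }

    void put(int key, int value) {
        if (!values.containsKey(key) && values.size() == capacity) {
            // The O(n) walk of shame: scan every entry for the oldest timestamp
            int lruKey = -1;
            long oldest = Long.MAX_VALUE;
            for (Map.Entry<Integer, Long> e : lastUsed.entrySet()) {
                if (e.getValue() < oldest) { oldest = e.getValue(); lruKey = e.getKey(); }
            }
            values.remove(lruKey);
            lastUsed.remove(lruKey);
        }
        values.put(key, value);
        lastUsed.put(key, clock++);
    }
}
```

Correct behavior, wrong complexity: every eviction costs O(n), which is exactly what the doubly-linked list is about to fix.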

🔍 What’s Actually Going On

Picture your cache as a velvet-rope nightclub: the hash map checks if someone’s on the list (O(1) entry), while the doubly-linked list keeps everyone queueing in “last-used” order. Every get or put? That key struts right to the front—VIP status updated. Run out of room? You don’t just pick someone at random for the bus home; you boot the poor soul at the very back (the LRU node). Slick, fast, and suspiciously like real memory systems. No wonder browsers, databases, and interviewers are obsessed.

🛠️ PseudoCode

  • Define a Node with key, value, prev, next.
    class Node {
      int key, value;
      Node prev, next;
    }
  • Bolt together a HashMap for key-to-node lookup.
    HashMap<Integer, Node> map = new HashMap<>();
  • Add head (MRU) and tail (LRU) dummy nodes for structure.
  • get(key):
    • If not in map? Return -1.
    • If found, yank node to head and return its value.
  • put(key, value):
    • If key exists, update its value and promote it to head.
    • Otherwise, if you’re at capacity, boot the tail.prev node and remove it from the map. Bye, LRU!
    • Build the new node, drop it in the map, and splice it in at head.
  • Doubly-linked means all moves & removals are O(1)—no slow dances.

💻 The Code

import java.util.HashMap;
import java.util.Map;

class LRUCache {
    private class Node {
        int key, value;
        Node prev, next;
        Node(int key, int value) {
            this.key = key;
            this.value = value;
        }
    }
    private final int capacity;
    private Map<Integer, Node> map;
    private Node head, tail;
    public LRUCache(int capacity) {
        this.capacity = capacity;
        map = new HashMap<>();
        head = new Node(0, 0); // dummy head
        tail = new Node(0, 0); // dummy tail
        head.next = tail;
        tail.prev = head;
    }
    public int get(int key) {
        if (!map.containsKey(key)) return -1;
        Node node = map.get(key);
        remove(node);
        promote(node);
        return node.value;
    }
    public void put(int key, int value) {
        if (map.containsKey(key)) {
            Node node = map.get(key);
            node.value = value;
            remove(node);
            promote(node);
        } else {
            if (map.size() == capacity) {
                Node lru = tail.prev;
                remove(lru);
                map.remove(lru.key);
            }
            Node node = new Node(key, value);
            map.put(key, node);
            promote(node);
        }
    }
    private void remove(Node node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }
    private void promote(Node node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
    }
}

⚠️ Pitfalls, Traps & Runtime Smacks

  • Evictions: Never randomly kick nodes. Always remove tail.prev—your bona fide LRU.
  • Bumping: Both get() and put() mean recent use. Don’t snub reads or you’ll get weird bugs.
  • No dups, please: Always update value and move to head, not add new nodes for old keys.
  • Runtime: O(1) for everything! The cache of your boss’s dreams.
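Side note: outside interviews, Java already ships this nightclub. A hedged sketch using LinkedHashMap in access-order mode (QuickLRU is my name for it, not a standard class) — under the hood it is literally a hash table fused with a doubly-linked list:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// LinkedHashMap with accessOrder=true keeps entries in last-used order;
// overriding removeEldestEntry turns it into an LRU cache with O(1) ops.
class QuickLRU<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    QuickLRU(int capacity) {
        super(capacity, 0.75f, true); // true = access order: get() bumps entries
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity;     // evict the LRU entry after each insert
    }
}
```

Great for production; terrible as an interview answer. The interviewer wants to see you build the bouncer, not hire one.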

🧠 Interviewer Brain-Tattoo

“A HashMap alone can’t party—bring a doubly-linked list unless you love O(n) walks of shame.”
Interviewers eat this up like free pizza.
