Leetcode 146: LRU cache

https://leetcode.com/problems/lru-cache/

Difficulty: Hard

Design and implement a data structure for a Least Recently Used (LRU) cache. It should support the following operations: get and set.

get(key) – Get the value (will always be positive) of the key if the key exists in the cache, otherwise return -1.
set(key, value) – Set the value if the key is already present; otherwise insert the key-value pair. When the cache reaches its capacity, it should invalidate the least recently used item before inserting the new item.
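
For concreteness, here is a short hypothetical trace with a capacity of 2, using the `LRUCache` class implemented below (the keys and values are made up for illustration):

cache = LRUCache(2)
cache.set(1, 1)
cache.set(2, 2)
cache.set(3, 3)    # capacity exceeded: key 1, the least recently used, is evicted
cache.get(1)       # returns -1
cache.get(2)       # returns 2
cache.get(3)       # returns 3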

Code:

class Node(object):
    """A doubly linked list node holding one cached key/value pair."""
    def __init__(self, val, key=None):
        self._prev = None
        self._next = None
        self._val = val
        self._key = key
    
    @property
    def val(self):
        return self._val
    
    @val.setter
    def val(self, value):
        self._val = value
        
    @property
    def key(self):
        return self._key

    @property
    def prev(self):
        return self._prev 

    @prev.setter
    def prev(self, node):
        self._prev = node     
   
    @property
    def next(self):
        return self._next

    @next.setter
    def next(self, node):
        self._next = node

class LRUCache(object):

    def __init__(self, capacity):
        """
        :type capacity: int
        """
        self.capacity = capacity
        self.table = {}  # key -> Node
        # Dummy head and tail sentinels; real nodes live between them.
        # Nodes near dummy_head are the least recently used, nodes near
        # dummy_tail are the most recently used.
        self.dummy_head = Node(-1)
        self.dummy_tail = Node(-1)
        self.dummy_head.next = self.dummy_tail
        self.dummy_tail.prev = self.dummy_head

    def get(self, key):
        """
        :rtype: int
        """
        node = self.table.get(key)
        if node:
            # A read counts as a use: move the node to the tail so it
            # becomes the most recently used entry.
            self.del_node(node)
            self.tail_node(node)
            return node.val
        return -1

    def set(self, key, value):
        """
        :type key: int
        :type value: int
        :rtype: nothing
        """
        node = self.table.get(key)
        if node:
            # Existing key: update the value and refresh its recency.
            node.val = value
            self.del_node(node)
            self.tail_node(node)
        else:
            # New key: record it in the table and append it at the tail.
            node = Node(value, key)
            self.table[key] = node
            self.tail_node(node)

            if len(self.table) > self.capacity:
                # Over capacity: evict the least recently used node,
                # which is the one right after dummy_head.
                node_del = self.dummy_head.next
                self.del_node(node_del)
                self.table.pop(node_del.key, None)

    def del_node(self, node):
        # Unlink a node from the doubly linked list in O(1).
        node.prev.next = node.next
        node.next.prev = node.prev

    def tail_node(self, node):
        # Insert a node right before dummy_tail, i.e. at the most
        # recently used position.
        node.prev = self.dummy_tail.prev
        node.next = self.dummy_tail
        self.dummy_tail.prev.next = node
        self.dummy_tail.prev = node

    
        

Idea:

First, we create a doubly linked list to store the cached entries. With this structure we can insert and delete an entry in O(1) time. For convenience, we keep a dummy head and a dummy tail at the two ends of the list; all real entries sit between dummy_head and dummy_tail. Entries closer to dummy_head are less recently used, and entries closer to dummy_tail are more recently used. In `LRUCache`, we also keep a hash table `self.table` that maps each key to its node, so any entry can be found in O(1) time as well.
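
To make that invariant concrete, here is a small debugging helper (hypothetical, not part of the solution) that walks the list from dummy_head to dummy_tail and reports the keys from least to most recently used:

def lru_order(cache):
    # Illustrative only: walk the internal list of an LRUCache instance.
    # Keys come out ordered from least recently used (next to dummy_head)
    # to most recently used (next to dummy_tail).
    keys = []
    node = cache.dummy_head.next
    while node is not cache.dummy_tail:
        keys.append(node.key)
        node = node.next
    return keys

cache = LRUCache(3)
cache.set(1, 10)
cache.set(2, 20)
cache.set(3, 30)
print(lru_order(cache))   # [1, 2, 3]: key 1 is currently the least recently used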

Now, when you set an entry, you first check whether its key is already stored. If it is (the first branch of `set`), we update its value, temporarily remove its node from the linked list, and re-insert it at the tail. If the key is not stored yet, you add it to `self.table` and append its node at the tail. Doing so may push the table past the cache's capacity; to handle that, you remove the node right after dummy_head, which is the least recently used entry (see the sketch below).
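
Here is a short sketch of those two paths, reusing the hypothetical lru_order helper from above to show how the internal order changes:

cache = LRUCache(2)
cache.set(1, 1)
cache.set(2, 2)
print(lru_order(cache))   # [1, 2]
cache.set(1, 100)         # existing key: value updated and node moved to the tail
print(lru_order(cache))   # [2, 1]
cache.set(3, 3)           # over capacity: dummy_head.next (key 2) is evicted
print(lru_order(cache))   # [1, 3]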

A caveat is that the `get` function does not simply return an entry's value: it also has to move that entry's node to the tail of the doubly linked list, because a read counts as a use (this is the del_node/tail_node pair inside `get`).
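
A minimal sketch of that effect, assuming the same `LRUCache` class as above:

cache = LRUCache(2)
cache.set(1, 1)
cache.set(2, 2)
cache.get(1)          # the read moves key 1 next to dummy_tail
cache.set(3, 3)       # so key 2, not key 1, is evicted here
cache.get(1)          # returns 1
cache.get(2)          # returns -1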

 

Reference:

https://leetcode.com/discuss/20139/java-hashtable-double-linked-list-with-touch-of-pseudo-nodes

 
