146. LRU Cache
Design a data structure that follows the constraints of a Least Recently Used (LRU) cache.
Implement the LRUCache class:
- LRUCache(int capacity) Initialize the LRU cache with positive size capacity.
- int get(int key) Return the value of the key if the key exists, otherwise return -1.
- void put(int key, int value) Update the value of the key if the key exists. Otherwise, add the key-value pair to the cache. If the number of keys exceeds the capacity from this operation, evict the least recently used key.
The functions get and put must each run in O(1) average time complexity.
Example 1:
Input
["LRUCache", "put", "put", "get", "put", "get", "put", "get", "get", "get"]
[[2], [1, 1], [2, 2], [1], [3, 3], [2], [4, 4], [1], [3], [4]]
Output
[null, null, null, 1, null, -1, null, -1, 3, 4]
Explanation
LRUCache lRUCache = new LRUCache(2);
lRUCache.put(1, 1); // cache is {1=1}
lRUCache.put(2, 2); // cache is {1=1, 2=2}
lRUCache.get(1); // return 1
lRUCache.put(3, 3); // LRU key was 2, evicts key 2, cache is {1=1, 3=3}
lRUCache.get(2); // returns -1 (not found)
lRUCache.put(4, 4); // LRU key was 1, evicts key 1, cache is {4=4, 3=3}
lRUCache.get(1); // return -1 (not found)
lRUCache.get(3); // return 3
lRUCache.get(4); // return 4
Constraints:
- 1 <= capacity <= 3000
- 0 <= key <= 10^4
- 0 <= value <= 10^5
- At most 2 * 10^5 calls will be made to get and put.
Doubly linked list solution (hash map + ListNode)
class ListNode:
    def __init__(self, key, val):
        self.key = key
        self.val = val
        self.pre = None
        self.next = None


class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.dict = {}  # key -> ListNode
        # Sentinel nodes: head.next is the least recently used node,
        # tail.pre is the most recently used node.
        self.head = ListNode(-1, -1)
        self.tail = ListNode(-1, -1)
        self.head.next = self.tail
        self.tail.pre = self.head

    def get(self, key: int) -> int:
        if key not in self.dict:
            return -1
        # Move the accessed node to the most-recently-used position.
        node = self.dict[key]
        self.remove(node)
        self.add(node)
        return node.val

    def put(self, key: int, value: int) -> None:
        if key in self.dict:
            # Unlink the stale node; it is replaced below.
            old_node = self.dict[key]
            self.remove(old_node)
        node = ListNode(key, value)
        self.dict[key] = node
        self.add(node)
        if len(self.dict) > self.capacity:
            # Evict the least recently used node (right after head).
            node_to_delete = self.head.next
            self.remove(node_to_delete)
            del self.dict[node_to_delete.key]

    def add(self, node: ListNode):
        # Insert node just before the tail sentinel (most-recently-used end).
        previous_end = self.tail.pre
        previous_end.next = node
        node.pre = previous_end
        node.next = self.tail
        self.tail.pre = node

    def remove(self, node: ListNode):
        # Unlink node from the doubly linked list.
        node.pre.next = node.next
        node.next.pre = node.pre


# Your LRUCache object will be instantiated and called as such:
# obj = LRUCache(capacity)
# param_1 = obj.get(key)
# obj.put(key,value)
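For a quick sanity check, here is a minimal driver (not part of the original solution) that replays Example 1 against the class above; it should print 1, -1, -1, 3, 4:

    # Replay Example 1 and print the results of each get call.
    cache = LRUCache(2)
    cache.put(1, 1)        # cache is {1=1}
    cache.put(2, 2)        # cache is {1=1, 2=2}
    print(cache.get(1))    # 1
    cache.put(3, 3)        # evicts key 2
    print(cache.get(2))    # -1
    cache.put(4, 4)        # evicts key 1
    print(cache.get(1))    # -1
    print(cache.get(3))    # 3
    print(cache.get(4))    # 4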
Complexity Analysis
- Time complexity: O(1) for both get and put.
  - For get:
    - Check whether the key is in the hash map. This costs O(1).
    - Get the node associated with the key. This costs O(1).
    - Call remove and add. Both methods cost O(1).
  - For put:
    - Check whether the key is in the hash map. This costs O(1).
    - If it is, get the node associated with the key and call remove. Both cost O(1).
    - Create a new node and insert it into the hash map. This costs O(1).
    - Call add. This method costs O(1).
    - If the capacity is exceeded, call remove and delete the evicted key from the hash map. Both cost O(1).
- Space complexity: O(capacity)
  We use extra space for the hash map and for our linked list. Neither can exceed a size of capacity.
collections.OrderedDict solution
import collections


class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        # OrderedDict keeps keys in insertion/access order:
        # the first key is the least recently used.
        self.dict = collections.OrderedDict()

    def get(self, key: int) -> int:
        if key not in self.dict:
            return -1
        # Mark the key as most recently used.
        self.dict.move_to_end(key)
        return self.dict[key]

    def put(self, key: int, value: int) -> None:
        if key in self.dict:
            self.dict.move_to_end(key)
        self.dict[key] = value
        if len(self.dict) > self.capacity:
            # popitem(last=False) removes the least recently used entry.
            self.dict.popitem(last=False)


# Your LRUCache object will be instantiated and called as such:
# obj = LRUCache(capacity)
# param_1 = obj.get(key)
# obj.put(key,value)
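As a side note, since Python 3.7 a plain dict also preserves insertion order, so the same pattern works without OrderedDict. Below is a minimal sketch under that assumption; the class name LRUCacheDict is only for illustration and is not from the original solution:

    class LRUCacheDict:
        """Same idea with a plain dict (Python 3.7+ keeps insertion order)."""

        def __init__(self, capacity: int):
            self.capacity = capacity
            self.dict = {}

        def get(self, key: int) -> int:
            if key not in self.dict:
                return -1
            # Re-insert the key to move it to the most-recently-used end.
            self.dict[key] = self.dict.pop(key)
            return self.dict[key]

        def put(self, key: int, value: int) -> None:
            if key in self.dict:
                del self.dict[key]
            self.dict[key] = value
            if len(self.dict) > self.capacity:
                # The first key in iteration order is the least recently used.
                del self.dict[next(iter(self.dict))]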