# LRU Cache

**1. Clarify the problem:**
Before diving into solving the problem, let's clarify the requirements:

- We need to design an LRU (Least Recently Used) cache that supports get and put operations.
- The cache should have a fixed capacity.
- The get operation retrieves the value of a key if the key exists in the cache and moves it to the most recently used position.
- The put operation updates the value if the key already exists, or inserts the key-value pair otherwise. If the cache is at capacity, it should evict the least recently used item before inserting a new item.

**2. Analyze the problem:**
To solve this problem, we can use a combination of a hash map and a doubly linked list to efficiently perform the get and put operations with O(1) time complexity.

**3. Design an algorithm:**
Here is the algorithm to design an LRU cache:

- Implement a doubly linked list node class that contains the key, value, and references to the previous and next nodes.
- Implement an LRU cache class that contains a hash map and the necessary operations:
  - Initialize the cache with the given capacity.
  - Implement the get operation:
    - If the key is not present in the cache, return -1.
    - If the key is present, retrieve its value, move the corresponding node to the most recently used position in the doubly linked list, and return the value.
  - Implement the put operation:
    - If the key is already present, update its value, move the corresponding node to the most recently used position in the doubly linked list, and return.
    - If the key is not present and the cache has reached its capacity, first remove the least recently used node from both the doubly linked list and the hash map.
    - Then create a new node with the key and value, add it to the front of the doubly linked list, and insert the key-node pair into the hash map.
- The doubly linked list maintains the order of recently used items, and the hash map provides O(1) access to cache items.

**4. Explain your approach:**
The approach involves using a doubly linked list and a hash map to implement the LRU cache. The doubly linked list will store the key-value pairs, and the hash map will provide efficient access to the nodes in the linked list. The most recently used items will be located at the front of the linked list, and the least recently used items will be at the end. Whenever a get or put operation is performed, the corresponding node will be moved to the front of the linked list to mark it as the most recently used item.

**5. Write clean and readable code:**

```python
class Node:
    """Doubly linked list node holding one cache entry."""

    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = {}  # key -> Node
        # Sentinel head/tail nodes simplify insertion and removal:
        # head.next is the most recently used, tail.prev the least.
        self.head = Node(None, None)
        self.tail = Node(None, None)
        self.head.next = self.tail
        self.tail.prev = self.head

    def get(self, key):
        if key in self.cache:
            node = self.cache[key]
            self._move_to_front(node)
            return node.value
        return -1

    def put(self, key, value):
        if key in self.cache:
            node = self.cache[key]
            node.value = value
            self._move_to_front(node)
        else:
            if len(self.cache) == self.capacity:
                self._remove_least_recently_used()
            node = Node(key, value)
            self.cache[key] = node
            self._add_to_front(node)

    def _move_to_front(self, node):
        self._remove_node(node)
        self._add_to_front(node)

    def _remove_least_recently_used(self):
        node = self.tail.prev
        self._remove_node(node)
        del self.cache[node.key]

    def _remove_node(self, node):
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add_to_front(self, node):
        node.prev = self.head
        node.next = self.head.next
        self.head.next.prev = node
        self.head.next = node
```

**6. Test your code:**
Let's test the code with different test cases, including both basic and edge cases, to ensure its correctness and functionality. We can test scenarios such as adding elements within capacity, adding elements beyond capacity (which will trigger eviction), updating existing elements, and retrieving elements.

```python
# Create a cache with capacity 2
cache = LRUCache(2)

# Test case 1: Add and retrieve elements
cache.put(1, 1)
cache.put(2, 2)
print(cache.get(1))  # Output: 1
print(cache.get(2))  # Output: 2

# Test case 2: Add element beyond capacity, triggering eviction
cache.put(3, 3)
print(cache.get(1))  # Output: -1 (key 1 was evicted)
print(cache.get(2))  # Output: 2
print(cache.get(3))  # Output: 3

# Test case 3: Update existing element
cache.put(2, 4)
print(cache.get(2))  # Output: 4

# Test case 4: Add new elements beyond capacity, evicting older entries
cache.put(4, 5)
cache.put(5, 6)
print(cache.get(2))  # Output: -1 (key 2 was evicted)
print(cache.get(4))  # Output: 5
print(cache.get(5))  # Output: 6
```

**7. Optimize if necessary:**
The provided solution already has an optimal time complexity of O(1) for both get and put operations. No further optimization is required.
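As an aside, Python's standard library offers a more concise route to the same O(1) behavior: `collections.OrderedDict` tracks insertion order and supports constant-time reordering via `move_to_end` and eviction via `popitem(last=False)`. A minimal sketch (the class name is illustrative, not part of the solution above):

```python
from collections import OrderedDict


class OrderedDictLRUCache:
    """LRU cache built on OrderedDict, which remembers insertion
    order and can reorder or pop entries in O(1)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()

    def get(self, key):
        if key not in self.cache:
            return -1
        # Mark the key as most recently used.
        self.cache.move_to_end(key)
        return self.cache[key]

    def put(self, key, value):
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            # popitem(last=False) evicts the oldest (least recently used) entry.
            self.cache.popitem(last=False)
```

The hand-written doubly linked list remains the version interviewers usually want to see, since it demonstrates the underlying mechanics that `OrderedDict` hides.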

**8. Handle error cases:**
The code assumes that the capacity is a positive integer and does not validate it. With a capacity of zero, the first put would still trigger an eviction, attempt to unlink the head sentinel node, and raise an exception rather than silently returning -1, so callers should guard against non-positive capacities.
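One way to make that assumption explicit is to validate the capacity up front so a bad value fails fast. A minimal sketch (the class name is hypothetical; get, put, and the helper methods are elided because they would match the implementation in section 5):

```python
class GuardedLRUCache:
    """Sketch: the LRU cache from section 5, extended only with a
    constructor-time capacity check. Other methods are unchanged."""

    def __init__(self, capacity):
        # Fail fast on a capacity the eviction logic cannot handle.
        if not isinstance(capacity, int) or capacity <= 0:
            raise ValueError("capacity must be a positive integer")
        self.capacity = capacity
        self.cache = {}
```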

**9. Discuss complexity analysis:**

The time and space complexity of the LRU Cache solution can be analyzed as follows:

Time Complexity:

- Get operation: O(1)
  - Retrieving a value involves a hash-map lookup, which takes O(1) time on average.
  - Moving the corresponding node to the front of the doubly linked list also takes constant time.
- Put operation: O(1)
  - Updating the value of an existing key or adding a new key involves a constant-time (average-case) lookup and update in the hash map.
  - Adding a new node to the front of the doubly linked list and removing the least recently used node are both constant-time operations.
- Overall, both the get and put operations run in O(1) time, providing efficient constant-time access to cache items.

Space Complexity:

- The space complexity of the LRU Cache solution is O(capacity) since we store a maximum of 'capacity' nodes in the cache.
- The doubly linked list and the hash map both require additional space to store the nodes and key-value pairs, respectively.
- The maximum space occupied by the cache is proportional to the capacity.

Best-case, Worst-case, and Average-case Scenarios:

- In the best-case scenario, where the cache operations are evenly distributed, the get and put operations will have a time complexity of O(1), providing constant-time access to cache items.
- In the worst-case scenario, where there is a high frequency of evictions and new additions, the cache will constantly reach its capacity, resulting in the eviction of the least recently used item for each put operation. This can lead to a higher number of removals and insertions in the doubly linked list, but the time complexity of the operations will still be O(1).
- In the average-case scenario, the time complexity of the get and put operations will still be O(1) due to the efficient implementation of the cache using a hash map and a doubly linked list.

Trade-offs:

- The chosen data structures, a hash map and a doubly linked list, provide efficient operations for the LRU Cache solution. However, they require additional space to store the nodes and key-value pairs.
- The trade-off made in this solution is between time complexity and space complexity. By using a combination of a hash map and a doubly linked list, we achieve efficient O(1) time complexity for get and put operations at the expense of additional space.
- If optimizing for space usage is a priority, alternative data structures or caching strategies could be considered. However, they may result in a higher time complexity for cache operations.
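When the goal is memoizing function results rather than caching arbitrary key-value pairs, one such alternative is Python's built-in `functools.lru_cache` decorator, which applies the same LRU strategy without any hand-rolled data structures. A small illustration (the function name is made up for the example):

```python
from functools import lru_cache


@lru_cache(maxsize=2)
def expensive_square(n):
    # Placeholder for a costly computation; results are memoized.
    return n * n


expensive_square(2)
expensive_square(3)
expensive_square(4)  # evicts the result for 2, the least recently used
```

The decorator also exposes `cache_info()` for hit/miss statistics and `cache_clear()` to reset the cache, which the hand-written version would have to implement itself.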