1. Introduction

An LRU (Least Recently Used) cache is a fixed-size cache that evicts the least recently used entry when it reaches its capacity. The idea is to keep the most recently accessed items and discard the ones that have gone the longest without being accessed. For example, in a cache of capacity 2 holding keys 1 and 2, inserting key 3 evicts whichever of 1 and 2 was accessed longer ago. In this post, we will implement an LRU cache in Java using a doubly-linked list to track usage order and a hash map for constant-time lookups.

2. Program Steps

1. Define a Node class that stores a key-value pair and holds pointers to the previous and next nodes.

2. Define the LRUCache class.

3. Initialize a doubly linked list and a hash map to manage cache items.

4. Create get and put methods to read and write cache entries.

5. Implement the eviction logic that removes the least recently used item when the cache is full.

3. Code Program

import java.util.HashMap;

class LRUCache {

    class Node {
        int key, value;
        Node prev, next;
        Node(int key, int value) {
            this.key = key;
            this.value = value;
        }
    }

    private HashMap<Integer, Node> map;
    private int capacity, count;
    private Node head, tail;

    public LRUCache(int capacity) {
        this.capacity = capacity;
        this.count = 0;
        map = new HashMap<>();
        head = new Node(0, 0); // dummy head sentinel
        tail = new Node(0, 0); // dummy tail sentinel
        head.next = tail;
        tail.prev = head;
    }

    // Add node to the front of the list
    private void addToHead(Node node) {
        node.next = head.next;
        node.next.prev = node;
        node.prev = head;
        head.next = node;
    }

    // Remove a node from the list
    private void removeNode(Node node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    public int get(int key) {
        Node node = map.get(key);
        if (node != null) {
            // Move the accessed node to the front to mark it as most recently used
            removeNode(node);
            addToHead(node);
            return node.value;
        }
        return -1;
    }

    public void put(int key, int value) {
        Node node = map.get(key);
        if (node != null) {
            // Key already exists: update its value and mark it as most recently used
            node.value = value;
            removeNode(node);
            addToHead(node);
        } else {
            Node newNode = new Node(key, value);
            map.put(key, newNode);
            if (count < capacity) {
                count++;
            } else {
                // Cache is full: evict the least recently used node (the one just before tail)
                map.remove(tail.prev.key);
                removeNode(tail.prev);
            }
            addToHead(newNode);
        }
    }

    public static void main(String[] args) {
        LRUCache cache = new LRUCache(2); // Capacity of 2
        cache.put(1, 10);
        cache.put(2, 20);
        System.out.println("Value for key 1: " + cache.get(1));
        cache.put(3, 30);
        System.out.println("Value for key 2: " + cache.get(2));
        cache.put(4, 40);
        System.out.println("Value for key 1: " + cache.get(1));
        System.out.println("Value for key 3: " + cache.get(3));
        System.out.println("Value for key 4: " + cache.get(4));
    }
}

Output:

Value for key 1: 10
Value for key 2: -1
Value for key 1: -1
Value for key 3: 30
Value for key 4: 40

4. Step-by-Step Explanation

1. The LRUCache class combines a doubly-linked list with a hash map. The list keeps entries in usage order, with the most recently used node right after the head sentinel and the least recently used node right before the tail sentinel, so nodes can be added or removed in constant time; the hash map maps each key to its node, so entries can be found in constant time.

2. A Node class is used to create nodes of the doubly-linked list. Each node has a key, value, and pointers to the next and previous nodes.

3. The addToHead method is used to add a node to the front of the list.

4. The removeNode method removes a node from the list.

5. The get method retrieves a value from the cache by key. If the key exists, the node is moved from its current position to the front of the list to mark it as most recently used and its value is returned; otherwise -1 is returned.

6. The put method either updates an existing node's value if the key is already present or adds a new node if it is not. If the cache is already at capacity when a new key is added, the least recently used node (the one just before the tail sentinel) is evicted first.

7. In the main method, a sequence of get and put operations is performed on the cache to demonstrate how it works, as traced below.
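
Tracing the calls in main (capacity 2, most recently used key listed first), the cache evolves as follows:

put(1, 10) -> [1]
put(2, 20) -> [2, 1]
get(1) returns 10 -> [1, 2]
put(3, 30) evicts key 2 -> [3, 1]
get(2) returns -1 (key 2 was evicted)
put(4, 40) evicts key 1 -> [4, 3]
get(1) returns -1 (key 1 was evicted)
get(3) returns 30 -> [3, 4]
get(4) returns 40 -> [4, 3]

This matches the output shown above.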

Note: This implementation offers O(1) time complexity for both the get and put operations.
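
As a side note, similar behavior can also be obtained from the JDK's LinkedHashMap, which supports access-order iteration and an eviction hook out of the box. The sketch below is only illustrative (the class name SimpleLruCache is made up for this example) and is not the implementation developed in this post.

import java.util.LinkedHashMap;
import java.util.Map;

// Minimal LRU cache sketch built on LinkedHashMap.
// accessOrder = true keeps entries ordered by access, and removeEldestEntry
// evicts the least recently used entry once the size limit is exceeded.
class SimpleLruCache<K, V> extends LinkedHashMap<K, V> {

    private final int capacity;

    SimpleLruCache(int capacity) {
        super(capacity, 0.75f, true); // third argument enables access order
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity;
    }
}

For example, new SimpleLruCache<Integer, Integer>(2) behaves much like the cache above, except that get returns null (rather than -1) for a missing key.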