LRU Cache in Java

This tutorial introduces the characteristics of an LRU cache and explains how to implement one in Java.

Caching is often leveraged to keep data in main memory as key-value pairs, which gives better performance than alternatives such as database or file storage. However, since main memory is always limited, we need to define an eviction policy to ensure that the cache does not grow beyond a certain size.

There are different caching algorithms based on the eviction policies they offer, such as LRU (Least Recently Used), LFU (Least Frequently Used) and MRU (Most Recently Used). In this tutorial, we will focus on the LRU cache and see how to implement it in Java.

Here are the characteristics of an LRU cache that distinguish it from other types of caches:

  1. Maintains a fixed number of elements.
  2. Orders the elements by usage: the most recently used element always stays at the head, while the least recently used element sits at the tail.
  3. Removes the least recently used element when the cache gets full or crosses a threshold ratio (called the load factor).
  4. Updates the element's value if the element to be inserted already exists.
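To make these four characteristics concrete before building anything by hand, here is a minimal sketch (the class name LruBehaviorDemo is ours, purely for illustration) that exhibits all of them using Java's built-in LinkedHashMap in access order. The tutorial below then builds the same behavior by hand with a HashMap and a doubly linked list.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LruBehaviorDemo {

    static Map<Integer, String> demo() {
        final int capacity = 2; // characteristic 1: fixed number of elements
        // accessOrder = true keeps iteration order by usage: least recently
        // used first, most recently used last (characteristic 2)
        Map<Integer, String> cache =
                new LinkedHashMap<Integer, String>(capacity + 1, 1.0f, true) {
                    @Override
                    protected boolean removeEldestEntry(Map.Entry<Integer, String> eldest) {
                        // characteristic 3: evict the least recently used
                        // element once the capacity is exceeded
                        return size() > capacity;
                    }
                };
        cache.put(1, "a");
        cache.put(2, "b");
        cache.get(1);       // key 1 becomes the most recently used
        cache.put(3, "c");  // capacity is 2, so key 2 (least recently used) is evicted
        cache.put(3, "c2"); // characteristic 4: existing key, value updated in place
        return cache;
    }

    public static void main(String[] args) {
        System.out.println(demo()); // {1=a, 3=c2}
    }
}
```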

We will use the following classes to implement the LRU cache in Java:

  1. HashMap — stores the elements as key-value pairs.
  2. Node — represents a node of a doubly linked list. Every element stored in the map is also maintained in the linked list. Each time a new element is added, it is placed at the head; likewise, each time an element is accessed, it is moved to the head.

Here is the implementation of the LRU cache in Java.

    package com.saintech.allprogtutorials.caches;

    import java.util.HashMap;
    import java.util.Map;

    /**
     * This code is distributed under Apache License 2.0.
     *
     * @author Sain Technology Solutions
     */
    public class LRUCache<K, V> {

        private final int capacity;
        private final Map<K, Node<K, V>> cacheElements = new HashMap<>();
        private Node<K, V> head;
        private Node<K, V> tail;

        public LRUCache(int capacity) {
            this.capacity = capacity;
        }

        /**
         * Removes the element at tail from the doubly linked list as well as
         * the cacheElements map.
         */
        private void removeTail() {
            cacheElements.remove(tail.key);
            tail = tail.previous;
            tail.next = null;
        }

        /**
         * Moves the input node to the head of the doubly linked list.
         *
         * @param node Node to be moved
         */
        private void moveToHead(Node<K, V> node) {
            // If node is already at head, do nothing and simply return
            if (node == head) {
                return;
            }
            // Remove the node from the linked list
            node.previous.next = node.next;
            if (node.next != null) {
                node.next.previous = node.previous;
            } else {
                tail = node.previous;
            }
            // Put the node at head
            putAsHead(node);
        }

        /**
         * Puts the input node at the head of the doubly linked list.
         *
         * @param node Node to be put at head
         */
        private void putAsHead(Node<K, V> node) {
            node.next = head;
            node.previous = null;
            if (head != null) {
                head.previous = node;
            }
            head = node;
            if (tail == null) {
                tail = head;
            }
        }

        /**
         * Returns the value corresponding to the input key from the cache map.
         * It also moves this element to the head of the doubly linked list,
         * since it was most recently accessed.
         *
         * @param key Key for the element whose value needs to be returned
         * @return Value of the element with the input key, or null if no such
         *         element exists
         */
        public V get(K key) {
            if (cacheElements.containsKey(key)) {
                final Node<K, V> n = cacheElements.get(key);
                moveToHead(n);
                return n.value;
            }
            return null;
        }

        /**
         * Puts the element with the input key and value in the cache. If the
         * element already exists, it updates its value. This method also
         * removes the least recently accessed element if the cache has reached
         * its capacity.
         *
         * @param key Key of the element
         * @param value Value of the element
         */
        public void put(K key, V value) {
            if (cacheElements.containsKey(key)) {
                final Node<K, V> old = cacheElements.get(key);
                old.value = value;
                moveToHead(old);
            } else {
                final Node<K, V> created = new Node<>(key, value);
                if (cacheElements.size() >= capacity) {
                    removeTail();
                }
                putAsHead(created);
                cacheElements.put(key, created);
            }
        }

        /**
         * Implementation of a node of a doubly linked list.
         *
         * @author Sain Technology Solutions
         */
        private static class Node<K, V> {
            K key;
            V value;
            Node<K, V> next;
            Node<K, V> previous;

            private Node(K key, V value) {
                this.key = key;
                this.value = value;
            }

            @Override
            public String toString() {
                return "Node [key=" + key + ", value=" + value + "]";
            }
        }

        @Override
        public String toString() {
            return cacheElements.toString();
        }

        /**
         * Entry point for testing the LRU cache.
         */
        public static void main(String[] args) {
            // Create the cache with capacity 2
            LRUCache<Integer, Integer> cache = new LRUCache<>(2);

            cache.put(2, 1); // Adds an element with key 2 and value 1
            cache.put(3, 2); // Adds an element with key 3 and value 2

            // Adds an element with key 4 and value 3. Also removes the element
            // with key 2, since it was added least recently and the cache can
            // hold only two elements at a time
            cache.put(4, 3);

            // Since the element with key 2 was removed, this returns null
            System.out.println(cache.get(2));

            // Returns value 2 and moves the element with key 3 to the head.
            // After this point, the element with key 4 is least recently
            // accessed
            System.out.println(cache.get(3));

            // Adds an element with key 5 and value 4. Also removes the element
            // with key 4, since it was accessed least recently
            cache.put(5, 4);

            // Since the element with key 4 was removed, this returns null
            System.out.println(cache.get(4));
        }
    }

Some of the use cases of an LRU cache are as follows:

  1. General caching — An LRU cache can be used to cache objects when we want to avoid repeated calls to a database. Using LRU, we ensure that only the objects we have used recently remain in the cache, while objects that were not used recently are discarded.
  2. Caching bitmaps in Android — Rendering bitmap images in Android views can be slow if we load each bitmap every time we want to use it. On the other hand, we cannot keep all the images in memory when there are too many to render across different views, so bitmaps must be removed from memory to avoid out-of-memory crashes. An LRU cache works very well here, as it automatically removes the bitmaps that were least recently used.

Thank you for reading the tutorial. If you have any feedback, questions or concerns, please share them in the comments and we will get back to you as soon as possible.


LRU Cache in Java Using LinkedHashMap

This tutorial introduces the characteristics of an LRU cache and explains how to implement one using LinkedHashMap in Java.


We will extend the LinkedHashMap class provided by Java to implement our LRUCache. LinkedHashMap can order its elements in insertion order as well as access order. By default, LinkedHashMap maintains insertion order; in this case, however, we will configure it to maintain access order by setting the accessOrder flag to true in its three-argument constructor (initial capacity, load factor, access order).
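As a quick standalone illustration of the accessOrder flag (the AccessOrderDemo class name is ours, not part of the final cache), iteration order moves an entry to the end every time it is accessed:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class AccessOrderDemo {

    static String orderAfterAccess() {
        // The third constructor argument, accessOrder = true, makes iteration
        // order follow usage: least recently accessed first, most recently
        // accessed last
        Map<String, Integer> map = new LinkedHashMap<>(16, 0.75f, true);
        map.put("a", 1);
        map.put("b", 2);
        map.put("c", 3);
        map.get("a"); // accessing "a" moves it to the end of iteration order
        return String.join(",", map.keySet());
    }

    public static void main(String[] args) {
        System.out.println(orderAfterAccess()); // b,c,a
    }
}
```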

Additionally, we will override the removeEldestEntry method, which LinkedHashMap calls after every put to check whether it should remove the eldest entry. In our implementation, we return true when the size becomes greater than the capacity, letting LinkedHashMap remove the least recently accessed element.
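Here is a small sketch of that hook in isolation (the RemoveEldestDemo class name and helper are ours, for illustration only): once a put pushes the size past the capacity, LinkedHashMap evicts the eldest entry because removeEldestEntry returns true.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class RemoveEldestDemo {

    static Map<Integer, String> boundedMap(int capacity) {
        // removeEldestEntry is consulted by LinkedHashMap after each put;
        // returning true evicts the eldest (least recently accessed) entry
        return new LinkedHashMap<Integer, String>(capacity + 1, 1.0f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<Integer, String> eldest) {
                return size() > capacity;
            }
        };
    }

    public static void main(String[] args) {
        Map<Integer, String> m = boundedMap(2);
        m.put(1, "one");
        m.put(2, "two");
        m.put(3, "three"); // size would become 3 > 2, so key 1 is evicted
        System.out.println(m.keySet()); // [2, 3]
    }
}
```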

Here is the implementation of the LRU cache using LinkedHashMap in Java.

    package com.saintech.allprogtutorials.caches;

    import java.util.LinkedHashMap;

    /**
     * This code is distributed under Apache License 2.0.
     *
     * @author Sain Technology Solutions
     */
    public class LRUCacheusingLinkedHashMap<K, V> extends LinkedHashMap<K, V> {

        private static final long serialVersionUID = 1L;

        private final int capacity;

        public LRUCacheusingLinkedHashMap(int capacity) {
            // Call the LinkedHashMap constructor with accessOrder set to true
            // to achieve LRU cache behavior
            super(capacity + 1, 1.0f, true);
            this.capacity = capacity;
        }

        @Override
        protected boolean removeEldestEntry(java.util.Map.Entry<K, V> eldest) {
            // Remove the eldest element whenever the size of the cache
            // exceeds the capacity
            return size() > this.capacity;
        }

        /**
         * Returns the value corresponding to the input key from the cache map.
         *
         * @param key Key for the element whose value needs to be returned
         * @return Value of the element with the input key, or null if no such
         *         element exists
         */
        public V find(K key) {
            return super.get(key);
        }

        /**
         * Sets the element with the input key and value in the cache. If the
         * element already exists, it updates its value.
         *
         * @param key Key of the element
         * @param value Value of the element
         */
        public void set(K key, V value) {
            super.put(key, value);
        }

        /**
         * Entry point for testing the LRU cache.
         */
        public static void main(String[] args) {
            // Create the cache with capacity 2
            LRUCacheusingLinkedHashMap<Integer, Integer> cache =
                    new LRUCacheusingLinkedHashMap<>(2);

            cache.set(2, 1); // Adds an element with key 2 and value 1
            cache.set(3, 2); // Adds an element with key 3 and value 2

            // Adds an element with key 4 and value 3. Also removes the element
            // with key 2, since it was added least recently and the cache can
            // hold only two elements at a time
            cache.set(4, 3);

            // Since the element with key 2 was removed, this returns null
            System.out.println(cache.find(2));

            // Returns value 2 and moves the element with key 3 to the head.
            // After this point, the element with key 4 is least recently
            // accessed
            System.out.println(cache.find(3));

            // Adds an element with key 5 and value 4. Also removes the element
            // with key 4, since it was accessed least recently
            cache.set(5, 4);

            // Since the element with key 4 was removed, this returns null
            System.out.println(cache.find(4));
        }
    }

As we can see, this is the easiest and most reliable way to implement an LRU cache in Java.

