LRU Cache: Using LinkedHashMap
Java's LinkedHashMap and LinkedHashSet maintain the order of insertion or access. The default is insertion order; we can change it via the constructor LinkedHashMap(int initialCapacity, float loadFactor, boolean accessOrder).
Setting accessOrder to true makes the LinkedHashMap order its elements by access order: every get or put moves the touched entry to the end of the iteration order.
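The access-order behavior can be seen in a few lines (a minimal sketch; the class and method names here are illustrative, not from the original post):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class AccessOrderDemo {
  // Returns the key order after touching "a" in an access-ordered map.
  static String demo() {
    Map<String, Integer> m =
        new LinkedHashMap<>(16, 0.75f, true); // accessOrder = true
    m.put("a", 1);
    m.put("b", 2);
    m.put("c", 3);
    m.get("a"); // accessing "a" moves it to the end (most recently used)
    return String.join(",", m.keySet());
  }

  public static void main(String[] args) {
    System.out.println(demo()); // b,c,a
  }
}
```

With accessOrder set to false (or the default constructor), the same code would print a,b,c.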

LinkedHashMap maintains an extra doubly linked list to keep the elements in order, rooted at a field Entry<K,V> header. The header acts as a sentinel element: its after pointer points to the eldest entry.

LinkedHashMap.removeEldestEntry(Map.Entry<K,V> eldest) allows us to specify whether the eldest entry should be removed after each insertion.

class LRUCacheMap<K, V> extends LinkedHashMap<K, V> {
  private static final long serialVersionUID = 1L;
  private final int capacity;

  public LRUCacheMap(int capacity) {
    // accessOrder = true, so iteration order is least-recently-used first
    super(16, 0.75f, true);
    this.capacity = capacity;
  }

  @Override
  protected boolean removeEldestEntry(java.util.Map.Entry<K, V> eldest) {
    return size() > capacity;
  }
}


LRU Cache: Using LinkedHashSet
1. Override the add method in LinkedHashSet
When implementing a hash-based LRU cache, we can extend LinkedHashSet and override its add method: when the current size is equal to or larger than the capacity, use the iterator to remove the first (eldest) element, then add the new item.

class LRUCacheSet<E> extends LinkedHashSet<E> {
  private static final long serialVersionUID = 1L;
  private final int capacity;

  public LRUCacheSet(int capacity) {
    this.capacity = capacity;
  }

  @Override
  public boolean add(E e) {
    if (size() >= capacity) {
      // Here, we can do anything.
      // 1. LRU cache: delete the eldest element (the first one), then add
      // the new item.
      Iterator<E> it = this.iterator();
      it.next();
      it.remove();

      // 2. Or do nothing and just return false: this discards the new item.
      // return false;
    }
    return super.add(e);
  }
}
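To see the eviction in action, here is a compact, runnable version of the class above (keeping only the eviction branch) together with a small driver; the demo class and capacity value are illustrative:

```java
import java.util.Iterator;
import java.util.LinkedHashSet;

public class LRUCacheSetDemo {
  // Compact version of the LRUCacheSet sketched in the article.
  static class LRUCacheSet<E> extends LinkedHashSet<E> {
    private static final long serialVersionUID = 1L;
    private final int capacity;

    LRUCacheSet(int capacity) {
      this.capacity = capacity;
    }

    @Override
    public boolean add(E e) {
      if (size() >= capacity) {
        // Evict the eldest (first) element before adding the new one.
        Iterator<E> it = this.iterator();
        it.next();
        it.remove();
      }
      return super.add(e);
    }
  }

  static String demo() {
    LRUCacheSet<Integer> cache = new LRUCacheSet<>(3);
    for (int i = 0; i < 5; i++) {
      cache.add(i);
    }
    return cache.toString(); // 0 and 1 have been evicted
  }

  public static void main(String[] args) {
    System.out.println(demo()); // [2, 3, 4]
  }
}
```

Note that since LinkedHashSet has no access-order mode, this evicts in insertion (FIFO) order; re-adding an existing element does not move it to the end.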

2. Using LinkedHashMap to implement LRUCacheSet via Collections.newSetFromMap
HashSet actually uses a HashMap as its backend. Collections.newSetFromMap (introduced in JDK 6) allows us to construct a set backed by a specified map. The resulting set displays the same ordering, concurrency, and performance characteristics as the backing map.

So we can use the previous LRUCacheMap as the backing map:

Set<Integer> lruCache = Collections.newSetFromMap(
    new LRUCacheMap<Integer, Boolean>(5));
for (int i = 0; i < 10; i++) {
  lruCache.add(i);
}
Assert.assertArrayEquals(new Integer[] { 5, 6, 7, 8, 9 },
    lruCache.toArray());

More about Collections.newSetFromMap
There are many Map implementations that don't have a corresponding Set implementation, such as ConcurrentHashMap, WeakHashMap, and IdentityHashMap.
With Collections.newSetFromMap we can easily create set instances that behave the same as ConcurrentHashMap or WeakHashMap:
Set<Object> observers = Collections.newSetFromMap(new ConcurrentHashMap<Object, Boolean>());
Set<Object> weakHashSet = Collections.newSetFromMap(new WeakHashMap<Object, Boolean>());
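A quick sketch of the resulting set's behavior (the class name and sample values are illustrative): it is an ordinary Set, so duplicates are ignored, but it inherits the backing map's characteristics, here ConcurrentHashMap's thread safety.

```java
import java.util.Collections;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

public class SetFromMapDemo {
  static boolean demo() {
    // Effectively a thread-safe "ConcurrentHashSet" built from a
    // ConcurrentHashMap; elements are stored as keys mapped to Boolean.TRUE.
    Set<String> observers =
        Collections.newSetFromMap(new ConcurrentHashMap<String, Boolean>());
    observers.add("a");
    observers.add("a"); // duplicate is ignored, as in any Set
    return observers.size() == 1 && observers.contains("a");
  }

  public static void main(String[] args) {
    System.out.println(demo()); // true
  }
}
```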

Using Guava CacheBuilder
Google Guava is a very useful library, and it is quite likely already included in our project. Guava provides a CacheBuilder that lets us build a custom cache with different combinations of features.

LoadingCache<Key, Graph> graphs = CacheBuilder.newBuilder()
    .maximumSize(1000)
    .expireAfterWrite(10, TimeUnit.MINUTES)
    .build(
        new CacheLoader<Key, Graph>() {
          public Graph load(Key key) throws AnyException {
            return createExpensiveGraph(key);
          }
        });

// Entries are loaded on demand: graphs.get(key) either returns the cached
// value or calls load(key) to compute it.

LinkedHashMap’s hidden (?) features
Handy But Hidden: Collections.newSetFromMap()
Guava CacheBuilder

via Blogger http://ift.tt/1e8UF2B