Simple example of use as an LRU cache?

Ian Clarke

Mar 29, 2011, 6:05:11 PM
to concurrentl...@googlegroups.com
I'm struggling to find an example of how to use ConcurrentLinkedHashMap as a simple LRU cache - can anyone provide any pointers?

In fact, I can't find any usage examples at all, which seems like a bit of a documentation oversight. Am I missing something obvious?

Ian.

Ben Manes

Mar 29, 2011, 6:24:11 PM
to ConcurrentLinkedHashMap
I might have replied privately by accident.

The JavaDoc has an example and is linked on the project page.

ConcurrentMap<Vertices, Set<Edge>> graph = new Builder<Vertices, Set<Edge>>()
    .weigher(Weighers.<Edge>set())
    .maximumWeightedCapacity(5000)
    .build();
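
With that set weigher, each value is weighed by its Set.size(), so the 5000 bounds the combined size of all the edge sets rather than the number of map entries. A rough, self-contained sketch of the same idea (class name, types, and sample values are just illustrative):

import com.googlecode.concurrentlinkedhashmap.ConcurrentLinkedHashMap;
import com.googlecode.concurrentlinkedhashmap.Weighers;

import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import java.util.concurrent.ConcurrentMap;

public class WeightedCacheSketch {
  public static void main(String[] args) {
    // The bound applies to the total weight (the sum of the Set sizes),
    // not to the number of entries.
    ConcurrentMap<String, Set<String>> cache =
        new ConcurrentLinkedHashMap.Builder<String, Set<String>>()
            .weigher(Weighers.<String>set())   // weight of a value = its size()
            .maximumWeightedCapacity(5)        // tiny bound so eviction is visible
            .build();

    cache.put("a", new HashSet<String>(Arrays.asList("1", "2", "3"))); // weight 3
    cache.put("b", new HashSet<String>(Arrays.asList("4", "5", "6"))); // total 6 > 5,
                                                                       // so "a" should be evicted
    System.out.println(cache.keySet());  // expected to print [b]
  }
}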

I'd be happy to write a wiki page with more documentation if desired.

-Ben

Ian Clarke

Mar 29, 2011, 6:25:05 PM
to concurrentl...@googlegroups.com
And so did I - here was my reply:

Ben, as I understand it, this example seems intended for a more complicated situation, one that weighs entries by the size of the sets in the map values.

All I need is a simple LRU cache, last in last out, which should work with arbitrary key-value types. Is this possible?

My guess is that this is by far the most common use case for a ConcurrentLinkedHashMap.

Ian.
--
Ian Clarke
CEO, SenseArray
Email: i...@sensearray.com
Ph: +1 512 422 3588

Ben Manes

Mar 29, 2011, 6:30:35 PM
to ConcurrentLinkedHashMap
Yes! The weigher is an optional setting; it defaults to the
Weighers.singleton() implementation. You can simply delete that line.

Note that "last in last out" is FIFO. The LRU policy tries to mirror
LinkedHashMap in access-order.

Ben Manes

Mar 29, 2011, 6:31:39 PM
to ConcurrentLinkedHashMap
Wait, I'm tired. Ignore that last comment. :)

Ian Clarke

Mar 29, 2011, 6:35:35 PM
to concurrentl...@googlegroups.com, Ben Manes
Ah ok, so just to be sure - this will create an LRU cache with a maximum size of 5,000:

  ConcurrentLinkedHashMap<Integer, Tuple> graph = 
    new Builder<Integer, Tuple>().maximumWeightedCapacity(5000).build();

Ian.

Ben Manes

Mar 29, 2011, 6:49:02 PM
to ConcurrentLinkedHashMap
Yep. I'll write a short tutorial and link it on the project page to
help make this clearer.
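
Until then, here is roughly what the simple case looks like end to end; the key/value types, class name, and capacity are just illustrative:

import com.googlecode.concurrentlinkedhashmap.ConcurrentLinkedHashMap;

public class LruSketch {
  public static void main(String[] args) {
    // No weigher configured, so the default singleton weigher is used and
    // the capacity is simply the maximum number of entries.
    ConcurrentLinkedHashMap<Integer, String> cache =
        new ConcurrentLinkedHashMap.Builder<Integer, String>()
            .maximumWeightedCapacity(2)   // tiny bound so eviction is visible
            .build();

    cache.put(1, "one");
    cache.put(2, "two");
    cache.get(1);              // touch 1, making 2 the least recently used
    cache.put(3, "three");     // over capacity, so 2 should be evicted
    System.out.println(cache.keySet());  // expected to contain 1 and 3
  }
}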

P.S. You can also use Google's MapMaker (maximumSize setting), if
preferred. Over time that implementation will mature to be closer to
CLHM's.
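
For comparison, the MapMaker route looks roughly like this. It relies on the maximumSize method MapMaker had at the time; later Guava releases deprecated it in favor of CacheBuilder. Types and values are illustrative:

import com.google.common.collect.MapMaker;
import java.util.concurrent.ConcurrentMap;

public class MapMakerSketch {
  public static void main(String[] args) {
    // maximumSize bounds the number of entries; the eviction details
    // differ from CLHM's.
    ConcurrentMap<Integer, String> cache = new MapMaker()
        .maximumSize(5000)
        .makeMap();

    cache.put(1, "one");
    System.out.println(cache.get(1));  // prints "one"
  }
}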

Ian Clarke

Mar 29, 2011, 6:53:12 PM
to concurrentl...@googlegroups.com, Ben Manes
Thanks Ben, I didn't realize Google's collections library was this powerful. In this instance it may suit my requirements better.

Regards,

Ian.

Ben Manes

Mar 29, 2011, 7:02:21 PM
to ConcurrentLinkedHashMap
Yeah, my 20% project has been to incorporate the ideas in this library
into Google's data structures.

Of note, the current version of MapMaker uses per-segment LRUs, so it
will evict eagerly. This is especially noticeable for small caches.
CLHM has a single top-level LRU, which is the direction we'll be taking
MapMaker, but that was too complicated for the first-pass integration.

Ben Manes

Apr 9, 2011, 1:18:32 AM
to ConcurrentLinkedHashMap
FYI,

I added a short tutorial on the API usage.

http://code.google.com/p/concurrentlinkedhashmap/wiki/ExampleUsage