Java – Avoiding multiple repopulations of the same cache region (due to concurrency)

caching, concurrency, ehcache, hibernate, java

I run a high-traffic website built on Hibernate, and I use Ehcache to cache some of the entities and queries required to generate the pages.

The problem is "parallel cache misses": when the application boots and the cache regions are cold, each region is populated many times (instead of only once) by different threads, because the site is being hit by many users at the same time. Likewise, when a cache region is invalidated it is repopulated many times for the same reason.
How can I avoid this?
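
To make the pattern concrete, here is a stripped-down sketch of what I mean -- plain Java, no Hibernate; the map, the key and expensiveLoad() are invented for the example. With a cold "cache", every concurrent request sees the miss and runs the expensive load itself:

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Stripped-down illustration of the "parallel cache miss" problem:
// every thread that finds the region cold runs the expensive load itself.
public class ParallelMissDemo {

    private static final ConcurrentMap<String, String> REGION = new ConcurrentHashMap<String, String>();

    public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(10);
        for (int i = 0; i < 10; i++) {
            pool.execute(() -> {
                String value = REGION.get("homepage");
                if (value == null) {          // all 10 threads see the miss...
                    value = expensiveLoad();  // ...and all 10 run the expensive load
                    REGION.put("homepage", value);
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }

    private static String expensiveLoad() {
        System.out.println(Thread.currentThread().getName() + " querying the database");
        try {
            Thread.sleep(500);                // stand-in for the real queries
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return "page data";
    }
}

With 10 threads, "querying the database" is printed up to 10 times; that is the repopulation I want to avoid.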

I managed to convert one entity cache and one query cache to a BlockingCache by providing my own implementation via hibernate.cache.provider_class, but the BlockingCache semantics do not seem to work. Even worse, the BlockingCache sometimes deadlocks (blocks) and the application hangs completely. A thread dump shows the threads blocked on the BlockingCache's mutex in a get operation.

So, the question is, does Hibernate support this kind of use?

And if not, how do you solve this problem on production?

Edit: hibernate.cache.provider_class points to my custom cache provider, which is a copy-paste of SingletonEhCacheProvider; at the end of its start() method (after line 136) I do:

Ehcache cache = manager.getEhcache("foo");
if (!(cache instanceof BlockingCache)) {
    manager.replaceCacheWithDecoratedCache(cache, new BlockingCache(cache));
}

That way, at initialization time and before anything else touches the cache named "foo", I decorate it with a BlockingCache. "foo" is a query cache and "bar" (same code, omitted) is an entity cache for a POJO.
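
For reference, the same decoration step pulled out into a helper -- a sketch only; BlockingDecorator is an invented name, and in my provider the equivalent calls simply sit at the end of the copied start() method:

import net.sf.ehcache.CacheManager;
import net.sf.ehcache.Ehcache;
import net.sf.ehcache.constructs.blocking.BlockingCache;

// Illustrative helper: wraps the named regions in a BlockingCache
// before Hibernate (or anything else) touches them.
public final class BlockingDecorator {

    private BlockingDecorator() {
    }

    public static void decorate(CacheManager manager, String... regionNames) {
        for (String name : regionNames) {
            Ehcache cache = manager.getEhcache(name);
            if (cache != null && !(cache instanceof BlockingCache)) {
                manager.replaceCacheWithDecoratedCache(cache, new BlockingCache(cache));
            }
        }
    }
}

At the end of start() this would be called as BlockingDecorator.decorate(manager, "foo", "bar");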

Edit 2: "Doesn't seem to work" means that the initial problem still exists: cache "foo" is still being repopulated many times with the same data because of the concurrency. I validate this by stressing the site with JMeter using 10 threads. I'd expect the other 9 threads to block until the first one that requested data from "foo" finishes its job (executes the queries, stores the data in the cache), and then to get the data directly from the cache.
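
To make that expectation concrete, here is a stand-alone sketch, outside Hibernate -- the class name, region settings and the fake expensiveDatabaseQuery() are invented -- of what I assume a BlockingCache-decorated region should do under that kind of load:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

import net.sf.ehcache.Cache;
import net.sf.ehcache.CacheManager;
import net.sf.ehcache.Element;
import net.sf.ehcache.constructs.blocking.BlockingCache;

// Expected behaviour: only the first thread that misses runs the load,
// the other nine block in get() until the put() happens.
public class BlockingCacheExpectation {

    public static void main(String[] args) throws InterruptedException {
        CacheManager manager = CacheManager.create();
        manager.addCache(new Cache("foo", 1000, false, false, 600, 600));
        final BlockingCache foo = new BlockingCache(manager.getEhcache("foo"));

        ExecutorService pool = Executors.newFixedThreadPool(10);
        for (int i = 0; i < 10; i++) {
            pool.execute(() -> {
                Element e = foo.get("key");   // on a cold cache this blocks all but one thread
                if (e == null) {
                    System.out.println(Thread.currentThread().getName() + " loads from DB");
                    foo.put(new Element("key", expensiveDatabaseQuery()));
                } else {
                    System.out.println(Thread.currentThread().getName() + " got cached value");
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        manager.shutdown();
    }

    private static String expensiveDatabaseQuery() {
        try {
            Thread.sleep(2000);               // stand-in for the real queries
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return "result";
    }
}

If the decorator behaved as advertised, "loads from DB" would be printed once and "got cached value" nine times; on the live site I still see the equivalent of many "loads from DB".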

Edit 3: Another discussion of this problem can be found at https://forum.hibernate.org/viewtopic.php?f=1&t=964391&start=0, but with no definitive answer.

Best Answer

I'm not quite sure, but:

It allows concurrent read access to elements already in the cache. If the element is null, other reads will block until an element with the same key is put into the cache.

Doesn't that mean that Hibernate will wait until some other thread places the object into the cache? That's what you observe, right?

Hibernate and the cache work like this:

  1. Hibernate gets a request for an object
  2. Hibernate checks whether the object is in the cache -- cache.get()
  3. Not there? Hibernate loads the object from the DB and puts it into the cache -- cache.put()

So if the object is not in the cache (not placed there by some previous update operation), Hibernate would wait at step 2 -- the cache.get() -- forever.

I think you need a cache variant where a thread waits for an object only for a short time, e.g. 100 ms. If the object has not arrived by then, the thread should get null (and thus Hibernate will load the object from the DB and place it into the cache).

Actually, a better logic would be:

  1. Check whether another thread is already requesting the same object
  2. If so, wait longer (say 500 ms) for the object to arrive
  3. If not, return null immediately

(We cannot wait forever at step 2, because the other thread may fail to put the object into the cache -- due to an exception, for example.)

If BlockingCache doesn't support this behaviour, you need to implement such a cache yourself. I did it in the past and it's not hard -- the main methods are get() and put() (though the API has apparently grown since then).
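
A rough sketch of that timed-wait idea in plain Java -- the class and field names are invented, this is not the Ehcache API, and error handling is trimmed:

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.locks.ReentrantLock;

// Sketch only: a cache whose get() waits a bounded time for another
// thread's put() instead of blocking forever.
public class TimedCache<K, V> {

    private final ConcurrentMap<K, V> store = new ConcurrentHashMap<K, V>();
    private final ConcurrentMap<K, ReentrantLock> loading = new ConcurrentHashMap<K, ReentrantLock>();

    // Returns the cached value, or null if the caller should load it itself.
    // If another thread is already loading the same key, waits up to
    // timeoutMillis for that thread's put() before giving up.
    public V get(K key, long timeoutMillis) throws InterruptedException {
        V value = store.get(key);
        if (value != null) {
            return value;
        }
        ReentrantLock lock = new ReentrantLock();
        ReentrantLock existing = loading.putIfAbsent(key, lock);
        if (existing == null) {
            lock.lock();                  // we are the loader; the caller must call put()
            return null;
        }
        if (existing.tryLock(timeoutMillis, TimeUnit.MILLISECONDS)) {
            existing.unlock();            // loader finished (or gave up); just re-check the store
        }
        return store.get(key);            // may still be null if the loader failed
    }

    // Publishes the value and wakes up any threads waiting in get().
    public void put(K key, V value) {
        if (value != null) {
            store.put(key, value);
        }
        ReentrantLock lock = loading.remove(key);
        if (lock != null && lock.isHeldByCurrentThread()) {
            lock.unlock();
        }
    }
}

Hibernate treats the null return as an ordinary miss and loads from the DB itself, so the worst case is a short stampede on one key rather than an indefinite wait.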

UPDATE

Actually, I just read the source of BlockingCache. It does exactly what I described -- it locks and waits with a timeout. So you don't need to implement anything yourself; just use it...

public Element get(final Object key) throws RuntimeException, LockTimeoutException {
    Sync lock = getLockForKey(key);
    Element element;
    acquiredLockForKey(key, lock, LockType.WRITE);  // blocks while another thread holds the write lock for this key
    element = cache.get(key);
    if (element != null) {
        lock.unlock(LockType.WRITE);                // hit: release the lock right away
    }
    // miss: the write lock stays held until this thread later calls put()
    return element;
}

public void put(Element element) {
    if (element == null) {
        return;
    }
    Object key = element.getObjectKey();
    Object value = element.getObjectValue();

    getLockForKey(key).lock(LockType.WRITE);
    try {
        if (value != null) {
            cache.put(element);
        } else {
            cache.remove(key);                      // a null value clears the key instead of caching it
        }
    } finally {
        getLockForKey(key).unlock(LockType.WRITE);  // always release the write lock
    }
}
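
Reading the two methods together (my reading of this source, not something stated in the snippet): on a miss, get() keeps the write lock for that key, and only a later put() by the same thread releases it. So whichever thread gets null must always follow up with a put(), even when its load fails -- otherwise every later get() for that key blocks, which would look a lot like the hang you describe. A sketch of that calling protocol, where loadFromDatabase() is a placeholder:

import net.sf.ehcache.Element;
import net.sf.ehcache.constructs.blocking.BlockingCache;

// Illustrative calling protocol for a BlockingCache-backed lookup.
public class BlockingCacheProtocol {

    static Object getOrLoad(BlockingCache cache, Object key) {
        Element element = cache.get(key);
        if (element != null) {
            return element.getObjectValue();
        }
        Object value = null;
        try {
            value = loadFromDatabase(key);          // may throw
            return value;
        } finally {
            // put() with a null value just removes the key, but it still
            // releases the lock, so waiting threads don't hang forever.
            cache.put(new Element(key, value));
        }
    }

    private static Object loadFromDatabase(Object key) {
        return "value-for-" + key;                  // stand-in for the real query
    }
}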

Apart from that, it's kind of strange that it doesn't work for you. Tell me something: in your code, this spot:

Ehcache cache = manager.getEhcache("foo");

is it synchronized? If multiple requests come in at the same time, will there be only one instance of the cache?
