Thread safety is a huge topic with no single solution. For a cache, I would recommend using a dictionary and a Python lock. Something like this:
import threading

# Simple concurrent cache. Fast for existing entries. New entries serialize on the lock.
myCache = {}
myLock = threading.Lock()

def cacheGet(key, constructorFunction):
    cached = myCache.get(key)
    if cached is None:
        myLock.acquire()
        try:
            # Re-check under the lock: another thread may have built
            # this entry while we were waiting to acquire it.
            cached = myCache.get(key)
            if cached is None:
                cached = constructorFunction(key)
                myCache[key] = cached
        finally:
            myLock.release()
    return cached
Note that there’s no need for a custom class to manage this cache. Just define a function that takes a key and returns the value to cache. This cacheGet() function is safe to call from many background threads. Threads won’t block when reading an existing entry, even while another thread is building a different one. If two threads try to create an entry for the same key at the same time, the second thread waits on the lock, then skips construction once it acquires it, because the first thread will already have stored that key in the cache.
The try-finally construct makes sure your lock won’t stay locked if your constructor function raises an exception. One caveat: if constructorFunction returns None, the result is indistinguishable from a cache miss, so the constructor will run again on the next lookup for that key.
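To see the behavior described above in action, here is a small self-contained sketch. The buildValue constructor, the key name, and the thread count are made up for illustration; the cache code itself is the same as above. Eight threads race to fetch the same key, and the constructor still runs exactly once, because the second check inside the lock catches every thread after the first.

```python
import threading

# Same cache as above: shared dict plus a lock, double-checked lookup.
myCache = {}
myLock = threading.Lock()

def cacheGet(key, constructorFunction):
    cached = myCache.get(key)
    if cached is None:
        myLock.acquire()
        try:
            cached = myCache.get(key)
            if cached is None:
                cached = constructorFunction(key)
                myCache[key] = cached
        finally:
            myLock.release()
    return cached

# Hypothetical "expensive" constructor that counts how often it runs.
callCount = {}
countLock = threading.Lock()

def buildValue(key):
    with countLock:
        callCount[key] = callCount.get(key, 0) + 1
    return key.upper()

threads = [threading.Thread(target=lambda: cacheGet("alpha", buildValue))
           for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(callCount["alpha"])  # 1: only one thread ran the constructor
print(myCache["alpha"])    # ALPHA
```

Even though all eight threads may pass the first (unlocked) check before any of them stores the value, they serialize on the lock, and every thread after the first finds the entry during the second check and returns it without calling buildValue again.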