In the previous article, I explained caching in general. Now I want to talk about using a cache in real-life situations and why concurrency matters.
A classic cache implementation often starts like this:
private readonly Dictionary<string, object> cache = new();
This works only if a single thread ever reads and writes the cache. In modern .NET applications, multiple threads may read and write simultaneously, and a plain Dictionary is not safe for concurrent access.
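To make the problem concrete, here is a minimal sketch (not from the original code, assuming a .NET 6+ console app) that writes to a plain Dictionary from many threads at once. The exact failure is undefined: it may throw, corrupt internal state, or silently lose entries.

var unsafeCache = new Dictionary<string, object>();

Parallel.For(0, 1_000, i =>
{
    // Concurrent writes to a non-thread-safe structure: behavior is undefined.
    unsafeCache["key" + (i % 10)] = i;
});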
Why ConcurrentDictionary is the right default
ConcurrentDictionary<TKey, TValue> is designed for concurrent access without external locking.
private readonly ConcurrentDictionary<string, CacheEntry<object>> cache = new();
It is thread-safe for individual read and write operations, but it does not make the logic you build around it thread-safe.
That’s where expiration and critical sections come in.
Storing time metadata with cached values
To support expiration, we store both the value and the time it was cached:
public sealed record CacheEntry<T>(T Value, DateTimeOffset CreatedAt);
Define a fixed cache duration at startup:
private static readonly TimeSpan CacheDuration = TimeSpan.FromMinutes(5);
Expiration check
private static bool IsExpired(CacheEntry<object> entry)
{
    return DateTimeOffset.UtcNow - entry.CreatedAt > CacheDuration;
}
An expiration time is essential: it keeps the cache from serving stale values and from growing with data that is no longer needed. Details like the cache duration are also better placed in a configuration file such as appsettings.json than hard-coded in the class.
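As a sketch of what that could look like (the "Cache" section name and the CacheOptions class are illustrative, not part of the original code):

// appsettings.json
// {
//   "Cache": { "DurationMinutes": 5 }
// }

public sealed class CacheOptions
{
    public int DurationMinutes { get; set; } = 5;
}

// Program.cs (ASP.NET Core)
// builder.Services.Configure<CacheOptions>(builder.Configuration.GetSection("Cache"));

The cache class can then receive IOptions<CacheOptions> through its constructor and compute TimeSpan.FromMinutes(options.Value.DurationMinutes) instead of relying on a hard-coded constant.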
Now, a simple synchronous cache refresh might look like this:
cache.AddOrUpdate(
    key,
    _ => new CacheEntry<object>(value, DateTimeOffset.UtcNow),
    (_, __) => new CacheEntry<object>(value, DateTimeOffset.UtcNow)
);
This is atomic, but it does not prevent duplicated work when multiple threads compute the same value.
The real problem: duplicated work
Consider this async scenario:
if (!cache.TryGetValue(key, out var entry) || IsExpired(entry))
{
    var value = await LoadFromDatabaseAsync();
    cache[key] = new CacheEntry<object>(value, DateTimeOffset.UtcNow);
}
If 10 concurrent requests arrive, all 10 see a cache miss, and all 10 call LoadFromDatabaseAsync().
This is wasteful, and under heavy load it can overwhelm the database (a cache stampede).
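Here is a small self-contained sketch that makes the duplication visible (the naive cache and the fake loader are illustrative, not from this article); it counts how often the "database" is hit when ten requests race on the same key:

using System.Collections.Concurrent;

var naiveCache = new ConcurrentDictionary<string, object>();
var loadCount = 0;

async Task<object> LoadFromDatabaseAsync()
{
    Interlocked.Increment(ref loadCount); // count how often the "database" is hit
    await Task.Delay(100);                // simulate a slow query
    return new object();
}

async Task<object> GetAsync(string key)
{
    if (!naiveCache.TryGetValue(key, out var value))
    {
        value = await LoadFromDatabaseAsync();
        naiveCache[key] = value;
    }
    return value;
}

await Task.WhenAll(Enumerable.Range(0, 10).Select(_ => GetAsync("report")));
Console.WriteLine(loadCount); // typically prints 10, not 1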
Introducing critical sections with SemaphoreSlim
To solve this, we need a critical section:
Only one request computes and updates the cache; the others wait or retry.
SemaphoreSlim is a good fit for async code because WaitAsync can be awaited without blocking a thread, unlike the lock statement, which cannot contain an await.
private readonly ConcurrentDictionary<string, SemaphoreSlim> locks = new();
We keep one semaphore per cache key, so unrelated keys never block each other.
Acquiring the lock (critical section)
private SemaphoreSlim GetLock(string key)
{
    return locks.GetOrAdd(key, _ => new SemaphoreSlim(1, 1));
}
Async lock acquisition with timeout
var semaphore = GetLock(key);
bool acquired = await semaphore.WaitAsync(TimeSpan.FromSeconds(5));
if (!acquired)
{
    // Could not enter the critical section within the timeout
    throw new TimeoutException("Failed to acquire cache lock.");
}
Once acquired, we are inside the critical section.
Here is the full example we end up with:
private readonly ConcurrentDictionary<string, CacheEntry<object>> cache = new();
private static readonly TimeSpan CacheDuration = TimeSpan.FromMinutes(5);
private readonly ConcurrentDictionary<string, SemaphoreSlim> locks = new();

public async Task<T> GetOrAddAsync<T>(
    string key,
    Func<Task<T>> factory)
{
    // Fast path: return a fresh entry without touching the lock.
    if (cache.TryGetValue(key, out var entry) && !IsExpired(entry))
    {
        return (T)entry.Value;
    }

    var semaphore = GetLock(key);
    bool acquired = await semaphore.WaitAsync(TimeSpan.FromSeconds(5));
    if (!acquired)
    {
        throw new TimeoutException("Cache lock timeout.");
    }

    try
    {
        // Double-check: another request may have refreshed the entry while we waited.
        if (cache.TryGetValue(key, out entry) && !IsExpired(entry))
        {
            return (T)entry.Value;
        }

        var value = await factory();
        cache[key] = new CacheEntry<object>(value, DateTimeOffset.UtcNow);
        return value;
    }
    finally
    {
        semaphore.Release();
    }
}

private static bool IsExpired(CacheEntry<object> entry)
{
    return DateTimeOffset.UtcNow - entry.CreatedAt > CacheDuration;
}

private SemaphoreSlim GetLock(string key)
{
    return locks.GetOrAdd(key, _ => new SemaphoreSlim(1, 1));
}

public sealed record CacheEntry<T>(T Value, DateTimeOffset CreatedAt);
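Finally, a minimal usage sketch. It assumes the members above live in a class exposed to callers; CacheService is a name I am introducing here purely for illustration.

// CacheService is a hypothetical wrapper class containing the members shown above.
var cacheService = new CacheService();

// The factory only runs on a miss (or after expiration); concurrent misses for the
// same key trigger exactly one load.
var report = await cacheService.GetOrAddAsync(
    "report:daily",
    async () =>
    {
        await Task.Delay(100); // stand-in for a real database call
        return "report data";
    });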