A Thread-Safe C# LRUCache Implementation

Introduction

The LRUCache<TKey, TValue> class is a custom implementation of a Least Recently Used (LRU) cache. An LRU cache is a data structure that retains a fixed number of items and evicts the least recently used item when the cache exceeds its capacity. This is particularly useful for scenarios where memory usage needs to be constrained while still keeping frequently accessed data readily available.

The implementation provided offers thread safety, ensures efficient eviction, and includes robust resource management. This article analyzes the class, its design decisions, its thread-safety mechanisms, and other noteworthy aspects.

Why Use an LRU Cache?

The main use case for an LRU cache is to optimize resource usage when available memory is limited. Examples include:

  • Caching database query results: Only the most recent results are stored to avoid repeated expensive operations.
  • Image or file caching: Reducing I/O or computation for commonly accessed resources.
  • Web server session management: Limiting sessions stored in memory.

Key Features of This Implementation

  1. Fixed capacity: Enforces an upper limit on the cache size.
  2. Automatic eviction: Least recently used items are removed when capacity is exceeded.
  3. Thread safety: Ensures safe concurrent access in multi-threaded environments.
  4. Disposability: Proper cleanup of disposable items when evicted or when the cache is cleared.

Class Structure and Components

The LRUCache class is built using:

  1. ConcurrentDictionary<TKey, LinkedListNode<(TKey, TValue)>>
    • Provides O(1) lookup for keys in the cache.
    • Maps keys to their corresponding nodes in the LRU list.
  2. LinkedList<(TKey, TValue)>
    • Maintains the ordering of items based on access (most recently used items at the front).
    • Provides O(1) insertion and removal when moving nodes.
  3. Locking Mechanism: A private object _lock is used to synchronize access to the LinkedList and ensure thread safety during modifications.
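
To make the structure concrete, here is a minimal sketch of how these components might be declared. The field names (_cacheMap, _lruList, _lock, _capacity, _disposedValue) match those used in the snippets later in this article; the constraint on TKey and the constructor's argument check are assumptions rather than details taken from the original source.

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;

public class LRUCache<TKey, TValue> : IDisposable where TKey : notnull
{
    // Maps each key to its node in the LRU list, giving O(1) lookups.
    private readonly ConcurrentDictionary<TKey, LinkedListNode<(TKey Key, TValue Value)>> _cacheMap = new();

    // Most recently used items sit at the front; the least recently used item is at the back.
    private readonly LinkedList<(TKey Key, TValue Value)> _lruList = new();

    // Guards all structural changes to the linked list.
    private readonly object _lock = new();

    private readonly int _capacity;
    private bool _disposedValue;

    public LRUCache(int capacity)
    {
        if (capacity <= 0)
            throw new ArgumentOutOfRangeException(nameof(capacity), "Capacity must be positive.");
        _capacity = capacity;
    }

    // Set, TryGetValue, GetOrAdd, Clear, and Dispose are shown in the sections that follow.
}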

Key Methods

The public surface of the cache consists of the following methods, each examined in the sections that follow:

  • Set(key, value): Inserts or updates an item and evicts the least recently used entry when the capacity is exceeded.
  • TryGetValue(key, out value): Retrieves an item and marks it as most recently used.
  • GetOrAdd(key, valueFactory): Returns the cached value, or creates and caches it with the factory under a double-checked lock.
  • Clear(): Removes all items and disposes of any disposable values.
  • Dispose(): Releases all cached resources via Clear().
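
The signatures below are inferred from the snippets in the rest of this article; treat the exact parameter names and nullability annotations as assumptions.

public void Set(TKey key, TValue value);
public bool TryGetValue(TKey key, out TValue? value);
public TValue GetOrAdd(TKey key, Func<TValue> valueFactory);
public void Clear();
public void Dispose();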
 

Thread Safety and Locking

While the ConcurrentDictionary ensures thread-safe key lookups and modifications, the LinkedList requires explicit locking when accessed. This dual approach optimizes for the fast-path scenario (key lookup) while ensuring correctness for the slow-path scenario (adding, removing, or reordering nodes).

Example: Thread-Safe Access in Set

lock (_lock)
{
    if (_cacheMap.TryGetValue(key, out var existingNode))
    {
        // Key already cached: dispose the old value, store the new one,
        // and move the node to the front (most recently used position).
        _lruList.Remove(existingNode);
        DisposeIfRequired(existingNode.Value.Value);
        existingNode.Value = (key, value);
        _lruList.AddFirst(existingNode);
    }
    else
    {
        // New key: add a node at the front and register it in the map.
        var newNode = new LinkedListNode<(TKey Key, TValue Value)>((key, value));
        _lruList.AddFirst(newNode);
        _cacheMap[key] = newNode;

        // Evict the least recently used item (the tail of the list)
        // once the capacity has been exceeded.
        if (_cacheMap.Count > _capacity)
        {
            var lruNode = _lruList.Last;
            if (lruNode != null)
            {
                _lruList.RemoveLast();
                _cacheMap.TryRemove(lruNode.Value.Key, out _);
                DisposeIfRequired(lruNode.Value.Value);
            }
        }
    }
}
  • Locking ensures safe modifications to the LinkedList.
  • Eviction logic removes the least recently used item when the cache exceeds capacity.
  • DisposeIfRequired ensures that disposable resources are cleaned up properly.
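
To make the eviction behavior concrete, here is a small, hypothetical usage sketch. It assumes a constructor that takes the capacity and the public Set and TryGetValue methods discussed in this article.

using System;

var cache = new LRUCache<string, string>(capacity: 2);

cache.Set("a", "alpha");
cache.Set("b", "beta");

// Touch "a" so it becomes the most recently used entry.
cache.TryGetValue("a", out _);

// Adding a third item exceeds the capacity of 2, so the least recently
// used entry ("b") is evicted (and disposed if it implemented IDisposable).
cache.Set("c", "gamma");

Console.WriteLine(cache.TryGetValue("b", out _)); // False - "b" was evicted
Console.WriteLine(cache.TryGetValue("a", out _)); // True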

Efficient Lookups and Reordering

The TryGetValue method retrieves a value and moves the corresponding node to the front of the LRU list.

if (_cacheMap.TryGetValue(key, out var node))
{
    lock (_lock)
    {
        _lruList.Remove(node);
        _lruList.AddFirst(node);
    }
    value = node.Value.Value;
    return true;
}

Here, the O(1) lookup from the ConcurrentDictionary is combined with O(1) reordering in the LinkedList.
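
The snippet above covers only the hit path. A complete TryGetValue also needs a miss path; a sketch under that assumption (the exact signature and the use of default for misses are inferred, not quoted from the source):

public bool TryGetValue(TKey key, out TValue? value)
{
    if (_cacheMap.TryGetValue(key, out var node))
    {
        lock (_lock)
        {
            // Move the accessed node to the front: it is now the most recently used item.
            _lruList.Remove(node);
            _lruList.AddFirst(node);
        }
        value = node.Value.Value;
        return true;
    }

    // Miss: no entry for this key.
    value = default;
    return false;
}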

Resource Management

The implementation ensures that resources are released when items are evicted or when the cache is cleared. This is handled through DisposeIfRequired.

private static void DisposeIfRequired(TValue value)
{
    if (value is IDisposable disposable)
    {
        disposable.Dispose();
    }
}

This approach makes the cache compatible with disposable objects, such as file streams or database connections.
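
For example, a cache of disposable values could be exercised like this (a hypothetical sketch; MemoryStream stands in for any IDisposable resource such as a file stream or database connection):

using System.IO;

var cache = new LRUCache<string, MemoryStream>(capacity: 1);

cache.Set("first", new MemoryStream());

// With a capacity of 1, inserting a second item evicts "first";
// DisposeIfRequired then calls Dispose() on the evicted stream.
cache.Set("second", new MemoryStream());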

Thread-Safe GetOrAdd

The GetOrAdd method allows concurrent threads to retrieve or initialize a value. It uses a double-check locking pattern to avoid race conditions.

if (TryGetValue(key, out TValue? value))
{
    return value;
}
lock (_lock)
{
    if (TryGetValue(key, out value)) // Double-check
    {
        return value;
    }
    value = valueFactory();
    Set(key, value);
    return value;
}

This ensures that the value factory (valueFactory) is invoked only once when multiple threads request the same missing key simultaneously, rather than once per thread.
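
A hypothetical sketch of this behavior: several threads race on the same missing key, yet the expensive factory runs only once.

using System;
using System.Threading;
using System.Threading.Tasks;

var cache = new LRUCache<int, string>(capacity: 10);
int factoryCalls = 0;

Parallel.For(0, 8, _ =>
{
    // All eight iterations request key 42; the double-checked lock inside
    // GetOrAdd ensures the factory below executes only once.
    cache.GetOrAdd(42, () =>
    {
        Interlocked.Increment(ref factoryCalls);
        return "expensive result";
    });
});

Console.WriteLine(factoryCalls); // 1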

Disposal and Cleanup

The Dispose method ensures that the cache can release resources gracefully. It calls Clear() to remove all items and dispose of their resources.

protected virtual void Dispose(bool disposing)
{
    if (_disposedValue)
        return;
    if (disposing)
    {
        Clear();
    }
    _disposedValue = true;
}

In the public Dispose method, the call to GC.SuppressFinalize(this) prevents unnecessary finalization, since the managed resources have already been released.
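
The public Dispose method is not shown above, but under the standard dispose pattern it would look roughly like this:

public void Dispose()
{
    // Clear() runs via Dispose(true), releasing all cached disposable values.
    Dispose(disposing: true);

    // Nothing is left for a finalizer to do.
    GC.SuppressFinalize(this);
}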

Performance Considerations

  1. O(1) Key Lookup: The ConcurrentDictionary enables fast access.
  2. O(1) Reordering: The LinkedList supports efficient node insertion and removal.
  3. Thread Safety: The use of locks minimizes contention while maintaining correctness.
  4. Double-Check Locking: Avoids redundant work during concurrent initialization.

The combination of these factors ensures that the cache performs efficiently, even under high concurrency.

Conclusion

The LRUCache<TKey, TValue> implementation provides a robust, thread-safe caching solution that adheres to LRU semantics. By combining ConcurrentDictionary for fast lookups and LinkedList for efficient reordering, it strikes a balance between performance and simplicity. Additionally, its careful handling of disposable resources makes it suitable for caching objects that require cleanup.

Key Takeaways

  • Thread safety: Achieved through a combination of ConcurrentDictionary and explicit locking.
  • Performance: Optimized for O(1) lookups and reordering.
  • Resource management: Proper disposal of items when evicted or cleared.
  • Robustness: Ensures correct behavior under concurrent access.

This implementation serves as a solid foundation for use cases requiring a fixed-capacity cache with LRU eviction policies.

