When to use In Memory Caching and Distributed Caching in C#?

 
In Memory Caching
 
Consistency : With an in-process cache, your cached elements are local to a single instance of your application. Many medium-to-large applications, however, will not have a single application instance because they will most likely be load-balanced. In such a setting you end up with as many caches as application instances, each with its own state, resulting in inconsistency. State may, however, become eventually consistent as cached items time out or are evicted from all cache instances.
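A minimal sketch of a per-process cache using System.Runtime.Caching.MemoryCache (the key "product:42", the value, and the five-minute expiration are illustrative only); each load-balanced instance would hold its own independent copy of this entry:

```csharp
using System;
using System.Runtime.Caching;

class InProcessCacheDemo
{
    // MemoryCache.Default lives inside this process only; a second
    // load-balanced instance has its own, possibly different, copy.
    static readonly MemoryCache Cache = MemoryCache.Default;

    static void Main()
    {
        var policy = new CacheItemPolicy
        {
            // The entry expires in THIS process after 5 minutes; other
            // instances time out (or miss) independently, hence only
            // eventual consistency across the server farm.
            AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(5)
        };

        Cache.Set("product:42", "Blue widget", policy);

        var value = Cache.Get("product:42") as string;
        Console.WriteLine(value ?? "not cached on this instance");
    }
}
```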
 
Overheads : A very descriptive article describes how an in-process cache can negatively affect the performance of an application, primarily due to garbage collection overhead. Your results, however, depend heavily on factors such as the size of the cache and how quickly objects are evicted and timed out.
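One common way to limit that GC pressure is to bound the in-process cache. A sketch assuming the Microsoft.Extensions.Caching.Memory package (the size limit, key, and payload are illustrative):

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

class BoundedCacheDemo
{
    static void Main()
    {
        // SizeLimit caps the total "size units" the cache may hold, so the
        // managed heap (and therefore GC work) stays bounded.
        var cache = new MemoryCache(new MemoryCacheOptions { SizeLimit = 1024 });

        var options = new MemoryCacheEntryOptions()
            .SetSize(1)                                    // this entry counts as 1 unit
            .SetSlidingExpiration(TimeSpan.FromMinutes(2)) // evict if idle for 2 minutes
            .RegisterPostEvictionCallback((key, value, reason, state) =>
                Console.WriteLine($"Evicted {key}: {reason}"));

        cache.Set("report:today", new byte[64 * 1024], options);

        Console.WriteLine(cache.TryGetValue("report:today", out byte[] _));
    }
}
```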
 
 
Distributed Caching
Consistency : Distributed caches, although deployed on a cluster of multiple nodes, offer a single logical view (and state) of the cache. In most cases, an object stored in the cluster resides on exactly one node. By means of a hashing algorithm, the cache engine can always determine which node a particular key-value pair lives on. Since there is only ever a single state for the cache cluster, it is never inconsistent.
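A simplified illustration of that key-to-node routing; the node names are made up, and real products (Redis Cluster, NCache, etc.) use more elaborate schemes such as hash slots or consistent hashing:

```csharp
using System;

class NodeRoutingDemo
{
    // Hypothetical node names; a real cluster publishes its own topology.
    static readonly string[] Nodes = { "cache-node-1", "cache-node-2", "cache-node-3" };

    // A deterministic hash (FNV-1a here) maps every key to exactly one node,
    // so all clients agree on where a given key-value pair lives.
    static string NodeFor(string key)
    {
        uint hash = 2166136261;
        foreach (char c in key)
        {
            hash ^= c;
            hash *= 16777619;
        }
        return Nodes[(int)(hash % (uint)Nodes.Length)];
    }

    static void Main()
    {
        Console.WriteLine(NodeFor("product:42"));   // always the same node
        Console.WriteLine(NodeFor("user:1007"));    // possibly a different node
    }
}
```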
 
Overheads : A distributed cache has two major overheads that make it slower than an in-process cache (but still better than not caching at all): network latency and object serialization.
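A sketch of both costs using the IDistributedCache abstraction from Microsoft.Extensions.Caching.Distributed together with System.Text.Json; the backing store (Redis or otherwise) would be configured elsewhere and the cache instance would normally arrive via dependency injection, and the key and payload are illustrative:

```csharp
using System;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Distributed;

class DistributedCacheDemo
{
    // Every read and write pays twice: once to serialize the object to bytes,
    // and once for the network round trip to the cache server.
    static async Task CacheProductAsync(IDistributedCache cache)
    {
        var product = new { Id = 42, Name = "Blue widget" };

        byte[] payload = JsonSerializer.SerializeToUtf8Bytes(product);   // serialization cost

        await cache.SetAsync("product:42", payload, new DistributedCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5)
        });                                                               // network round trip

        byte[] cached = await cache.GetAsync("product:42");              // second round trip
        Console.WriteLine(cached == null ? "miss" : Encoding.UTF8.GetString(cached));
    }
}
```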
 
