Different Cache Approaches in .NET

Distributed Cache Approach

In this approach, a central 'Integration Server' holds the cached data so that the multiple application servers can share it.

(The store can be anything from Memcached or NCache to AppFabric, depending on the size of the data to be cached.)

This approach reduces the data-synchronization cost that separate in-memory caches on different servers would incur, and it scales better to handle:

  • Frequent changes made to the database.
  • Large sizes of master data.
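
As a minimal sketch, the central store can be accessed through .NET's IDistributedCache abstraction (providers exist for Redis, NCache, SQL Server, and others); the class name, key, and expiry below are illustrative:

    using System;
    using System.Text.Json;
    using System.Threading.Tasks;
    using Microsoft.Extensions.Caching.Distributed;

    public class MasterDataStore
    {
        private readonly IDistributedCache _cache; // injected; backed by NCache/Redis/etc.

        public MasterDataStore(IDistributedCache cache) => _cache = cache;

        // Writes master data to the central cache so every application server sees the same copy.
        public Task StoreAsync(string key, object data) =>
            _cache.SetStringAsync(key, JsonSerializer.Serialize(data),
                new DistributedCacheEntryOptions
                {
                    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(30) // expiry timer
                });

        // Reads the entry back; returns null if it has expired or was never loaded.
        public Task<string> ReadAsync(string key) => _cache.GetStringAsync(key);
    }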

There are, in turn, two approaches to loading such a cache:

  • Request/load: The cache is populated on a per-request basis. The first request takes some time because it hits the API and loads the cache; all subsequent requests fetch the data directly from the cache.

    An expiry timer can be set on the cache, after which the data is reloaded to pick up updates (a sketch of this cache-aside pattern follows the list).

  • Background loading: A Windows service or a timer-based job runs in the background to keep loading the data into the cache and to watch for updated data based on a timestamp (see the second sketch below).
    The cache can also be reset manually so that changes are reflected immediately.
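
The request/load option is essentially the cache-aside pattern. A minimal sketch, assuming the same IDistributedCache registration and a hypothetical LoadMasterDataFromApiAsync standing in for the real API call:

    using System;
    using System.Threading.Tasks;
    using Microsoft.Extensions.Caching.Distributed;

    public class MasterDataReader
    {
        private readonly IDistributedCache _cache;

        public MasterDataReader(IDistributedCache cache) => _cache = cache;

        public async Task<string> GetMasterDataAsync(string key)
        {
            // Fast path: every request after the first is served straight from the cache.
            var cached = await _cache.GetStringAsync(key);
            if (cached != null)
                return cached;

            // Slow path: the first request (or the first one after expiry) hits the API
            // and repopulates the cache for everyone else.
            var fresh = await LoadMasterDataFromApiAsync(key);
            await _cache.SetStringAsync(key, fresh, new DistributedCacheEntryOptions
            {
                AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(15) // the expiry timer
            });
            return fresh;
        }

        // Hypothetical placeholder for the real service/API call.
        private Task<string> LoadMasterDataFromApiAsync(string key) =>
            Task.FromResult($"data-for-{key}");
    }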
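
For background loading, one possible shape (assuming a .NET 6+ host; the interval, key, and ReloadMasterDataAsync are illustrative) is a hosted service that refreshes the cache on a timer:

    using System;
    using System.Threading;
    using System.Threading.Tasks;
    using Microsoft.Extensions.Caching.Distributed;
    using Microsoft.Extensions.Hosting;

    public class CacheRefreshService : BackgroundService
    {
        private readonly IDistributedCache _cache;

        public CacheRefreshService(IDistributedCache cache) => _cache = cache;

        protected override async Task ExecuteAsync(CancellationToken stoppingToken)
        {
            // Reload the cached master data every 10 minutes until the host shuts down.
            using var timer = new PeriodicTimer(TimeSpan.FromMinutes(10));
            while (await timer.WaitForNextTickAsync(stoppingToken))
            {
                var latest = await ReloadMasterDataAsync(); // e.g. re-query rows changed since the last timestamp
                await _cache.SetStringAsync("master-data", latest, stoppingToken);
            }
        }

        // Hypothetical placeholder for the timestamp-based reload from the database.
        private Task<string> ReloadMasterDataAsync() =>
            Task.FromResult("refreshed-master-data");
    }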

In-Memory Cache Approach

This approach assumes there are no frequent changes to the database and that the data size will not grow to enormous levels for any of the clients.

Here the data resides on each application server itself and is loaded either on a per-request basis or at application start.

Expiry timers can be set to keep the cache up to date.
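
A minimal in-memory sketch using Microsoft.Extensions.Caching.Memory (the key, expiry, and LoadFromDatabaseAsync loader are illustrative):

    using System;
    using System.Threading.Tasks;
    using Microsoft.Extensions.Caching.Memory;

    public class LocalMasterDataCache
    {
        private readonly IMemoryCache _cache;

        public LocalMasterDataCache(IMemoryCache cache) => _cache = cache;

        public async Task<string> GetAsync(string key)
        {
            // Loads once per application server, then serves from memory until the entry expires.
            var value = await _cache.GetOrCreateAsync(key, entry =>
            {
                entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(20); // expiry timer
                return LoadFromDatabaseAsync(key);
            });
            return value!;
        }

        // Hypothetical placeholder for the real database query.
        private Task<string> LoadFromDatabaseAsync(string key) =>
            Task.FromResult($"data-for-{key}");
    }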

The cache can also be reset manually by updating a specific file, via a file dependency set on the cache entry (which works off the file's timestamp).
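
As a rough sketch, assuming the classic ASP.NET CacheDependency mechanism the file-dependency reset refers to (reset.txt is an illustrative file name; modern Microsoft.Extensions.Caching can achieve the same effect with file change tokens):

    using System.Web;
    using System.Web.Caching;

    public static class FileDependentCache
    {
        // The entry is evicted the moment reset.txt is updated,
        // so touching that file acts as a manual cache reset.
        public static void Store(string key, object data) =>
            HttpRuntime.Cache.Insert(key, data,
                new CacheDependency(HttpRuntime.AppDomainAppPath + "reset.txt"));
    }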

The drawbacks of this approach are:

  • Cache data redundancy.
  • Difficulty in keeping the different caches in sync.
  • Poor handling of large volumes of data.