Difference Between Node Cache And Redis Cache

What is caching and why is it required?

When the same response needs to be served many times, the data can be stored in a temporary in-memory cache and retrieved from there much faster than fetching it from the storage layer on every single call. Caching is the ability of an application to keep a copy of a value for a specified period of time and serve the web request with it instantly.

These techniques are effective when

  • A third-party API has to be called and the number of calls is limited or billed.
  • Data transfer between the cloud and the server has a cost.
  • Server response time is critical under concurrent requests.

Types of server caching available in Node.js

  1. In-app memory Cache
  2. Distributed Cache

In-app memory Cache

With in-app memory caching, data is stored and retrieved within the application's own process memory. It is achieved through Node.js packages such as node-cache, memory-cache, api-cache, etc.

Distributed Cache

With a distributed cache, data is stored and retrieved from a centralized, shared system. Though many options are available in this category, the most popular ones are Redis and Memcached.

| Node Cache | Redis Cache |
| --- | --- |
| Data is stored in the application's process memory | Data is stored in a separate Redis server (in memory, with optional persistence to disk) |
| No network call is required | Every data retrieval depends on a network call |
| Multi-fold performance improvement for simple lookups | Can handle complex data structures such as lists, sets, sorted sets, etc. |
| Minimal implementation effort | Implementation focuses on optimal memory usage |
| Lower cache hit rate (each process keeps its own copy) | Higher cache hit rate (all processes share one cache) |

Detailed implementation of Node Cache
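
Below is a minimal sketch of in-app caching with the node-cache package. The cache options, key names and the fetchUserFromDb helper are illustrative assumptions, not part of the original article; replace them with your own storage-layer call.

```js
// Minimal in-app caching sketch using the node-cache package.
const NodeCache = require("node-cache");

// stdTTL: default time-to-live (seconds) for every key; checkperiod: how often expired keys are purged.
const cache = new NodeCache({ stdTTL: 60, checkperiod: 120 });

// Hypothetical storage-layer call; replace with a real database query.
async function fetchUserFromDb(userId) {
  return { id: userId, name: "example" };
}

async function getUser(userId) {
  const key = `user:${userId}`;

  // Serve from process memory while the key is still alive.
  const cached = cache.get(key);
  if (cached !== undefined) {
    return cached;
  }

  // Cache miss: hit the storage layer, then remember the result.
  const user = await fetchUserFromDb(userId);
  cache.set(key, user); // uses stdTTL; cache.set(key, user, 300) would override it per key
  return user;
}

getUser(42).then(console.log); // first call misses, later calls within the TTL are served from memory
```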

Detailed implementation of Redis Cache
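
A comparable sketch with a distributed cache, assuming the node-redis (redis) client v4 and a Redis instance on localhost; the key names and fetchUserFromDb helper are again illustrative assumptions. Note that Redis stores strings, so objects are serialized on write and parsed on read.

```js
// Minimal distributed caching sketch using the node-redis (redis) client, v4 API.
const { createClient } = require("redis");

const client = createClient({ url: "redis://localhost:6379" });
client.on("error", (err) => console.error("Redis error", err));

// Hypothetical storage-layer call; replace with a real database query.
async function fetchUserFromDb(userId) {
  return { id: userId, name: "example" };
}

async function getUser(userId) {
  const key = `user:${userId}`;

  // One network round trip to the shared cache.
  const cached = await client.get(key);
  if (cached !== null) {
    return JSON.parse(cached); // object was serialized on write
  }

  // Cache miss: hit the storage layer, then cache the serialized result for 60 seconds.
  const user = await fetchUserFromDb(userId);
  await client.set(key, JSON.stringify(user), { EX: 60 });
  return user;
}

async function main() {
  await client.connect();
  console.log(await getUser(42)); // miss: fetched from the "database"
  console.log(await getUser(42)); // hit: served from Redis, shared by every app instance
  await client.quit();
}

main();
```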

Other aspects of these cache techniques

With node-cache, the cached data is accessible only within the current process. If the application is deployed as a cluster behind a load balancer, each instance keeps its own copy of the cache, so the data has to be kept up to date in every instance or users may be served stale information.

Also, if the stored cache is not properly maintained it can cause out-of-memory failures, since it shares memory with the application itself. And if the application is restarted, the cached data is lost.

Redis handles the above limitations easily, but it adds another dependency to the system, one that has to be monitored constantly, not only for availability but also for performance degradation and outages.

Though Redis offers features such as replication, configurable durability, clustering and high availability (HA), a simple node-cache is easier when the application needs to be scaled up and down quickly.

Static information stored in an in-app cache does not depend on a network call, and node-cache is faster at storing and retrieving data with a TTL (Time To Live).

Things to consider before choosing the right caching technique:

  • Security risks of storing and retaining the data in a cache.
  • Whether data object serialization is really required for storing the data.
  • Preventing app crashes by handling cache failures caused by memory leaks or unavailability.
  • Defining a policy for maximum cache size, expiration of cached entries and the frequency of writing data to the cache (see the sketch after this list).
  • Weighing the actual benefit the application gets from using cached data.
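
As a rough illustration of such a policy, here is how size, expiration and failure handling might translate into node-cache options; the values below are illustrative assumptions, not recommendations.

```js
// Example cache policy expressed as node-cache options (values are illustrative assumptions).
const NodeCache = require("node-cache");

const cache = new NodeCache({
  stdTTL: 300,      // default expiration: 5 minutes per key
  checkperiod: 60,  // purge expired keys every 60 seconds
  maxKeys: 10000,   // cap the cache size; set() throws ECACHEFULL once the limit is hit
  useClones: false, // skip cloning to avoid serialization cost (callers must not mutate cached values)
});

// Handle cache failures instead of letting them crash the app.
try {
  cache.set("config:featureFlags", { darkMode: true });
} catch (err) {
  console.error("Cache write skipped:", err.message);
}
```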