Cache

From NovaOrdis Knowledge Base

Internal

Overview

A cache is a small, fast memory. As long as the elements fit in the cache, access is fast, but once the cache reaches its capacity, a decision must be made about which element or elements to evict to make room for new ones. The optimal caching greedy algorithm is an example of how this problem can be solved.
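The sketch below is a Java illustration added here, not part of the original material; the class name LruCache is hypothetical. It shows one common way to make the eviction decision: a capacity-bounded map that discards the least recently used entry when a new one would exceed the capacity.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// A capacity-bounded cache that evicts the least recently used entry
// when adding a new one would exceed the configured capacity.
public class LruCache<K, V> extends LinkedHashMap<K, V> {

    private final int capacity;

    public LruCache(int capacity) {
        // accessOrder = true makes the iteration order reflect access recency,
        // so the "eldest" entry is the least recently used one
        super(16, 0.75f, true);
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // returning true tells LinkedHashMap to drop the eldest entry
        return size() > capacity;
    }
}
```

LinkedHashMap does the recency bookkeeping; a production cache would additionally need thread safety, statistics and expiration.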

From a system design perspective, a cache is a component that provides temporary in-memory storage for frequently accessed database data or the results of expensive computations. The goal is to speed up subsequent requests that need the same data. Because the cache tier is much faster than the database, using one improves the system's performance and reduces the load on the database. The cache tier can also be scaled independently.
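A minimal sketch of the cache-aside read path described above, assuming the cache tier is an in-memory map and the database query is represented by a loader function (both the class name CacheAsideReader and the loader parameter are hypothetical, introduced only for this illustration):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Cache-aside read path: consult the cache tier first and, on a miss,
// load from the database (the expensive source) and populate the cache.
public class CacheAsideReader<K, V> {

    private final Map<K, V> cache = new ConcurrentHashMap<>();
    private final Function<K, V> loader; // stands in for the database query

    public CacheAsideReader(Function<K, V> loader) {
        this.loader = loader;
    }

    public V get(K key) {
        // hit: return the cached value; miss: load from the source and cache it
        return cache.computeIfAbsent(key, loader);
    }
}
```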

Considerations for using Caches

  • Caches are most useful when the data is read frequently but modified infrequently.
  • Data in caches is volatile: if the cache instance fails or restarts, the data needs to be reloaded.
  • Data stored in the cache should have an expiration policy, otherwise it grows stale. If the expiration is too short, the cache reloads the data from the source too frequently (a minimal expiration sketch follows this list).
  • Data stored in the cache may become inconsistent with the source of record.
  • There should be an eviction policy for when the cache gets full. Common eviction policies are Least Recently Used (LRU), Least Frequently Used (LFU) and FIFO.
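
A minimal sketch of an expiration policy, assuming a fixed time-to-live per entry; the class name TtlCache is hypothetical and the code is an illustration, not a production design:

```java
import java.time.Duration;
import java.time.Instant;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Each entry records when it was written; reads treat entries older
// than the configured TTL as misses and evict them.
public class TtlCache<K, V> {

    private record Entry<T>(T value, Instant writtenAt) {}

    private final Map<K, Entry<V>> entries = new ConcurrentHashMap<>();
    private final Duration ttl;

    public TtlCache(Duration ttl) {
        this.ttl = ttl;
    }

    public void put(K key, V value) {
        entries.put(key, new Entry<>(value, Instant.now()));
    }

    public V get(K key) {
        Entry<V> e = entries.get(key);
        if (e == null) {
            return null;
        }
        if (Instant.now().isAfter(e.writtenAt().plus(ttl))) {
            // stale entry: evict it and report a miss so the caller
            // reloads from the source of record
            entries.remove(key);
            return null;
        }
        return e.value();
    }
}
```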

Products

Organizatorium

To Process: