In distributed systems, what is a cache stampede and how is it typically mitigated?

  • A cache stampede is a networking issue that results in delayed cache updates.
  • A cache stampede is a planned event in which caches are forcibly cleared and refreshed simultaneously.
  • A cache stampede is an outdated concept and is no longer relevant in modern distributed systems.
  • A cache stampede is when multiple processes or nodes simultaneously attempt to load the same cache entry that is currently not in the cache.
A cache stampede occurs when many requests miss the same cache entry at the same time and all try to regenerate it, which can overwhelm the backing store. It is typically mitigated with locks or request coalescing so that only one process regenerates the entry while the others wait or serve the previous value.
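
As an illustration, here is a minimal in-process sketch of the lock-based mitigation in Python; `regenerate` stands in for whatever expensive lookup rebuilds the entry, and a distributed deployment would use a shared lock (for example in Redis) rather than a per-process one.

```python
import threading

_cache = {}
_locks = {}
_locks_guard = threading.Lock()

def get_or_regenerate(key, regenerate):
    """Return a cached value, letting only one thread regenerate it on a miss."""
    if key in _cache:
        return _cache[key]
    # One lock per key, so unrelated keys do not block each other.
    with _locks_guard:
        lock = _locks.setdefault(key, threading.Lock())
    with lock:
        # Re-check: another thread may have filled the entry while we waited.
        if key not in _cache:
            _cache[key] = regenerate()
        return _cache[key]
```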

How does 'cache eviction' differ from 'cache invalidation' in caching strategies?

  • Remove all items
  • Remove least frequently used items
  • Remove least recently used items
  • Remove outdated items
In caching strategies, 'cache eviction' removes entries to free space when the cache reaches capacity, typically choosing the least recently used or least frequently used items, while 'cache invalidation' removes or marks entries as stale because the underlying data has changed.
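
As a toy Python sketch (the class, size limit, and evict-oldest-inserted policy are invented here for illustration), the two operations look like this:

```python
class SimpleCache:
    """Contrast eviction (making room) with invalidation (removing stale data)."""

    def __init__(self, max_entries=3):
        self.max_entries = max_entries
        self.entries = {}

    def put(self, key, value):
        if key not in self.entries and len(self.entries) >= self.max_entries:
            # Eviction: the cache is full, so drop an entry to make room
            # (oldest-inserted here; real caches use LRU, LFU, and similar policies).
            oldest_key = next(iter(self.entries))
            del self.entries[oldest_key]
        self.entries[key] = value

    def invalidate(self, key):
        # Invalidation: remove an entry because the underlying data changed,
        # regardless of how full the cache is.
        self.entries.pop(key, None)
```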

In caching strategies, what is meant by 'cache warming'?

  • Caching based on user preferences
  • Clearing cache to improve performance
  • Dynamically adjusting cache size
  • Pre-loading cache with frequently used items
'Cache warming' in caching strategies refers to pre-loading the cache with frequently used items, optimizing performance by ensuring that the cache is populated with data that is likely to be requested.
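
A minimal sketch of warming, where `load_item` and the list of "hot" keys are placeholders for whatever data source and access statistics a real system would use:

```python
def warm_cache(cache, load_item, popular_keys):
    """Pre-load the cache with items that are likely to be requested soon."""
    for key in popular_keys:
        if key not in cache:
            cache[key] = load_item(key)

# For example, populate the cache at application startup from known-hot keys.
cache = {}
warm_cache(cache, load_item=lambda key: f"value-for-{key}", popular_keys=["home", "pricing", "faq"])
```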

What is a 'Content Delivery Network (CDN)' and how does it relate to caching?

  • A mechanism for cache synchronization
  • A protocol for cache communication
  • A type of caching algorithm
  • Network of distributed servers for content delivery
A 'Content Delivery Network (CDN)' is a network of distributed servers designed to deliver content efficiently. It relates to caching by strategically placing content on servers closer to end-users, reducing latency and improving performance through localized caching.
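
A real CDN is operated infrastructure rather than application code, but the caching behaviour of a single edge location can be sketched roughly as below; the origin dictionary and TTL are invented for the example.

```python
import time

ORIGIN = {"/logo.png": b"...image bytes..."}   # stands in for the origin server
EDGE_CACHE = {}                                # one edge location's cache
TTL_SECONDS = 60

def serve_from_edge(path):
    """Serve from the edge cache, falling back to the origin on a miss or expiry."""
    entry = EDGE_CACHE.get(path)
    if entry and time.time() - entry["stored_at"] < TTL_SECONDS:
        return entry["body"]                   # hit: served close to the user
    body = ORIGIN[path]                        # miss: fetch from the origin
    EDGE_CACHE[path] = {"body": body, "stored_at": time.time()}
    return body
```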

The _________ pattern in caching involves generating and storing the results of a request before it's actually made.

  • Cache-Aside
  • Read-Through
  • Write-Behind
  • Write-Through
The Write-Through pattern stores the result of a request before it is actually made: every write goes to the cache as well as the backing store, so the data is already cached when a later read request arrives. Cache-Aside, by contrast, populates the cache only after a miss.
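
A minimal sketch of the write-through behaviour described above, assuming a plain dictionary stands in for the backing store:

```python
class WriteThroughCache:
    """Every write goes to the cache and the store together, so reads find the
    value already cached before the read request is ever made."""

    def __init__(self, store):
        self.store = store          # stands in for a database or other backing store
        self.cache = {}

    def write(self, key, value):
        self.store[key] = value     # persist to the backing store
        self.cache[key] = value     # ...and cache it as part of the same operation

    def read(self, key):
        if key in self.cache:
            return self.cache[key]  # hit: the entry was populated at write time
        value = self.store[key]     # miss: fall back to the store
        self.cache[key] = value
        return value
```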

_________ caching is a strategy where each cache stores a subset of the total data set, typically based on geographical location.

  • Distributed
  • Global
  • Local
  • Replicated
Distributed caching is a strategy where each cache node stores a subset of the total data set, with the partitioning often based on geographical location.
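
One way to sketch the idea in Python; the region names are invented, and a hash is used for routing only to keep the example self-contained (a real deployment would more likely route on the user's or data's location):

```python
import hashlib

# Hypothetical regional cache nodes; each holds only its own partition of the data.
REGION_CACHES = {"us": {}, "eu": {}, "apac": {}}

def region_for(key):
    digest = int(hashlib.sha256(key.encode()).hexdigest(), 16)
    return list(REGION_CACHES)[digest % len(REGION_CACHES)]

def put(key, value):
    REGION_CACHES[region_for(key)][key] = value

def get(key):
    return REGION_CACHES[region_for(key)].get(key)
```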

The process of replacing older cache entries with new ones is known as __________.

  • Clear
  • Eviction
  • Purge
  • Swap
The process of replacing older cache entries with new ones is known as Eviction.

In caching, __________ refers to the technique of dynamically adjusting the cache size based on current system load.

  • Adaptive Caching
  • Dynamic Caching
  • Incremental Caching
  • Static Caching
Adaptive Caching involves dynamically adjusting the cache size based on the current system load, allowing for optimal performance under varying conditions.
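
A rough sketch of the idea; the load metric, thresholds, and sizes are all invented for illustration:

```python
class AdaptiveCache:
    """Cache whose entry limit grows or shrinks with system load."""

    def __init__(self):
        self.max_entries = 1000
        self.entries = {}

    def adjust_to_load(self, memory_pressure):
        """memory_pressure is assumed to be a 0.0-1.0 figure from a monitoring source."""
        if memory_pressure > 0.8:
            self.max_entries = 500       # under heavy load, keep the cache small
        elif memory_pressure < 0.3:
            self.max_entries = 2000      # plenty of headroom: cache more aggressively
        while len(self.entries) > self.max_entries:
            # Shed oldest-inserted entries until the cache fits the new limit.
            self.entries.pop(next(iter(self.entries)))
```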

_________ is a caching technique where frequently and recently accessed data is prioritized for caching.

  • FIFO (First In, First Out)
  • LRU (Least Recently Used)
  • Optimal Replacement
  • Random Replacement
LRU (Least Recently Used) prioritizes recently accessed data: when the cache is full, the entry that has gone unused the longest is evicted, so items that are accessed often and recently tend to remain cached.
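
A minimal LRU sketch built on `collections.OrderedDict` (the capacity is chosen arbitrarily for the example):

```python
from collections import OrderedDict

class LRUCache:
    """Every access moves an entry to the 'recent' end; the least recently used
    entry is evicted once capacity is exceeded."""

    def __init__(self, capacity=2):
        self.capacity = capacity
        self.entries = OrderedDict()

    def get(self, key):
        if key not in self.entries:
            return None
        self.entries.move_to_end(key)         # mark as most recently used
        return self.entries[key]

    def put(self, key, value):
        self.entries[key] = value
        self.entries.move_to_end(key)
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)  # evict the least recently used entry
```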

The strategy of storing only the differences from the main data set in cache is known as __________ caching.

  • Delta
  • Differential
  • Incremental
  • Patch
Differential caching involves storing only the differences (or changes) from the main data set in the cache, reducing storage requirements and improving cache efficiency.
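
One possible reading of the idea, sketched with dictionaries; the record and field names are invented for the example:

```python
BASE_PROFILE = {"name": "Ada", "plan": "free", "theme": "light"}   # full record in the main store

def cache_diff(current):
    """Store only the fields that differ from the base record."""
    return {k: v for k, v in current.items() if BASE_PROFILE.get(k) != v}

def rebuild(diff):
    """Reconstruct the full record from the base plus the cached differences."""
    return {**BASE_PROFILE, **diff}

# Only the changed field is kept in the cache; the rest comes from the base record.
cached = cache_diff({"name": "Ada", "plan": "pro", "theme": "light"})   # -> {"plan": "pro"}
assert rebuild(cached) == {"name": "Ada", "plan": "pro", "theme": "light"}
```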