How does 'cache eviction' differ from 'cache invalidation' in caching strategies?
- Remove all items
- Remove least frequently used items
- Remove least recently used items
- Remove outdated items
In caching strategies, 'cache eviction' removes items to free space when the cache is full, typically according to a policy such as least recently used (LRU) or least frequently used (LFU), while 'cache invalidation' removes or refreshes items that have become outdated so stale data is not served.
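As a minimal sketch of the distinction, the Python class below evicts the least recently used entry when it runs out of space and exposes a separate `invalidate` method for removing stale entries; the class name, the two-entry capacity, and the keys are illustrative assumptions rather than any particular library's API.

```python
from collections import OrderedDict

class LRUCache:
    """Tiny LRU cache: eviction removes the least recently used entry when
    capacity is exceeded; invalidation removes an entry because it is stale."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.data: OrderedDict[str, str] = OrderedDict()

    def get(self, key: str):
        if key not in self.data:
            return None
        self.data.move_to_end(key)        # mark as most recently used
        return self.data[key]

    def put(self, key: str, value: str) -> None:
        self.data[key] = value
        self.data.move_to_end(key)
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # eviction: driven by capacity pressure

    def invalidate(self, key: str) -> None:
        self.data.pop(key, None)           # invalidation: driven by staleness

cache = LRUCache(capacity=2)
cache.put("a", "1"); cache.put("b", "2"); cache.put("c", "3")  # "a" is evicted
cache.invalidate("b")                                          # "b" is invalidated
```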
In caching strategies, what is meant by 'cache warming'?
- Caching based on user preferences
- Clearing cache to improve performance
- Dynamically adjusting cache size
- Pre-loading cache with frequently used items
'Cache warming' in caching strategies refers to pre-loading the cache with frequently used items, optimizing performance by ensuring that the cache is populated with data that is likely to be requested.
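A minimal sketch of cache warming, assuming a hypothetical `fetch_from_database` helper and a hard-coded list of "hot" keys; in practice the key list usually comes from access logs or analytics.

```python
# Hypothetical warm-up step run at application startup or after a deploy.
POPULAR_PRODUCT_IDS = [101, 102, 103]      # assumed "hot" keys

cache: dict[int, dict] = {}

def fetch_from_database(product_id: int) -> dict:
    # Placeholder for a real database or API call.
    return {"id": product_id, "name": f"product-{product_id}"}

def warm_cache() -> None:
    """Pre-load the cache so the first real requests hit warm entries."""
    for product_id in POPULAR_PRODUCT_IDS:
        cache[product_id] = fetch_from_database(product_id)

warm_cache()
assert 101 in cache   # served from cache without touching the database
```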
What is a 'Content Delivery Network (CDN)' and how does it relate to caching?
- A mechanism for cache synchronization
- A protocol for cache communication
- A type of caching algorithm
- Network of distributed servers for content delivery
A 'Content Delivery Network (CDN)' is a network of distributed servers designed to deliver content efficiently. It relates to caching by strategically placing content on servers closer to end-users, reducing latency and improving performance through localized caching.
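CDN edge servers generally decide whether and how long to cache a response based on the HTTP caching headers the origin sends. The sketch below shows an origin handler choosing `Cache-Control` values; the function, paths, and lifetimes are illustrative assumptions, not tied to any specific CDN.

```python
def origin_response(path: str) -> dict:
    """Build a response whose headers tell a CDN edge how long to cache it."""
    if path.startswith("/static/"):
        # Immutable assets: edges may cache them for up to a year.
        cache_control = "public, max-age=31536000, immutable"
    else:
        # Dynamic pages: edges cache briefly and revalidate with the origin.
        cache_control = "public, max-age=60, stale-while-revalidate=30"
    return {
        "status": 200,
        "headers": {"Cache-Control": cache_control},
        "body": f"content for {path}",
    }

print(origin_response("/static/logo.png")["headers"]["Cache-Control"])
```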
The _________ pattern in caching has the application check the cache first and, on a miss, load the data from the underlying store and write it into the cache for future requests.
- Cache-Aside
- Read-Through
- Write-Behind
- Write-Through
In the Cache-Aside pattern, the application checks the cache first; on a miss, it loads the data from the underlying store and writes it into the cache so that subsequent requests for the same data are served from the cache.
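A minimal sketch of the Cache-Aside flow, assuming a hypothetical `load_from_database` helper and a plain dictionary standing in for the cache.

```python
cache: dict[str, str] = {}

def load_from_database(key: str) -> str:
    # Placeholder for the real data store lookup.
    return f"value-for-{key}"

def get(key: str) -> str:
    """Cache-Aside: the application, not the cache, owns the lookup logic."""
    value = cache.get(key)
    if value is None:                  # cache miss
        value = load_from_database(key)
        cache[key] = value             # populate the cache for next time
    return value

get("user:42")   # miss: loads from the database, then caches the result
get("user:42")   # hit: served straight from the cache
```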
_________ caching is a strategy where each cache stores a subset of the total data set, typically based on geographical location.
- Distributed
- Global
- Local
- Replicated
Distributed caching is a strategy where the cache is spread across multiple nodes, each storing a subset of the total data set, with the data often partitioned by key or by geographical region.
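A toy sketch of a distributed cache in which keys are partitioned across per-region nodes; the region names, hashing scheme, and helper functions are illustrative assumptions (real deployments usually route by user region or use consistent hashing).

```python
import hashlib

# Hypothetical cache nodes, one per region; each holds a subset of the data.
NODES = {
    "eu": {},   # node serving European keys
    "us": {},   # node serving North American keys
    "ap": {},   # node serving Asia-Pacific keys
}

def node_for(key: str) -> dict:
    """Pick the node that owns this key using a stable hash of the key."""
    names = sorted(NODES)
    digest = int(hashlib.sha256(key.encode()).hexdigest(), 16)
    return NODES[names[digest % len(names)]]

def put(key: str, value: str) -> None:
    node_for(key)[key] = value

def get(key: str):
    return node_for(key).get(key)

put("session:abc", "data")
print(get("session:abc"))   # read from whichever node owns "session:abc"
```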
The process of replacing older cache entries with new ones is known as __________.
- Clear
- Eviction
- Purge
- Swap
The process of replacing older cache entries with new ones is known as Eviction.
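For contrast with the LRU example earlier, the sketch below uses a simple FIFO policy that replaces the oldest entry once the cache is full; the two-entry capacity and keys are illustrative.

```python
from collections import OrderedDict

MAX_ENTRIES = 2
cache: OrderedDict[str, str] = OrderedDict()

def put(key: str, value: str) -> None:
    """Insert a new entry, evicting the oldest one if the cache is full."""
    if key not in cache and len(cache) >= MAX_ENTRIES:
        cache.popitem(last=False)   # evict the entry that was inserted first
    cache[key] = value

put("a", "1"); put("b", "2"); put("c", "3")   # "a" is evicted
print(list(cache))                            # ['b', 'c']
```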
In caching, __________ refers to the technique of dynamically adjusting the cache size based on current system load.
- Adaptive Caching
- Dynamic Caching
- Incremental Caching
- Static Caching
Adaptive Caching involves dynamically adjusting the cache size based on the current system load, allowing for optimal performance under varying conditions.
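A toy sketch of adaptive caching, where a load figure between 0.0 and 1.0 shrinks or grows the cache's capacity; the class name, thresholds, and the way load is measured are all illustrative assumptions.

```python
class AdaptiveCache:
    """Toy adaptive cache that shrinks its capacity under heavy system load
    and grows it again when load drops. Thresholds are arbitrary assumptions."""

    def __init__(self, min_capacity: int = 100, max_capacity: int = 1000):
        self.min_capacity = min_capacity
        self.max_capacity = max_capacity
        self.capacity = max_capacity
        self.data: dict[str, str] = {}

    def adjust_for_load(self, load: float) -> None:
        """load is a 0.0-1.0 utilisation figure (CPU, memory, or request rate)."""
        span = self.max_capacity - self.min_capacity
        self.capacity = round(self.max_capacity - load * span)
        while len(self.data) > self.capacity:
            self.data.pop(next(iter(self.data)))   # shed oldest entries if we shrank

    def put(self, key: str, value: str) -> None:
        if key not in self.data and len(self.data) >= self.capacity:
            self.data.pop(next(iter(self.data)))   # make room within the current limit
        self.data[key] = value

cache = AdaptiveCache()
cache.adjust_for_load(0.9)   # heavy load: capacity shrinks toward min_capacity
print(cache.capacity)        # 190 with the defaults above
```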
In the context of web applications, what does 'cache invalidation' refer to?
- Clearing the browser cache
- Encrypting cached data
- Refreshing cached data
- Storing data in cache
'Cache invalidation' in the context of web applications refers to the process of refreshing or clearing cached data to ensure that users receive the most up-to-date information from the server.
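A minimal sketch of invalidation in an application cache, assuming a hypothetical `load_profile` database helper: when the underlying data changes, the cached copy is dropped so the next read re-fetches fresh data.

```python
cache: dict[str, dict] = {}

def load_profile(user_id: str) -> dict:
    # Placeholder for the real database read.
    return {"id": user_id, "name": "Ada"}

def get_profile(user_id: str) -> dict:
    key = f"profile:{user_id}"
    if key not in cache:
        cache[key] = load_profile(user_id)
    return cache[key]

def update_profile(user_id: str, name: str) -> None:
    # ... write the new name to the database here ...
    cache.pop(f"profile:{user_id}", None)   # invalidate the now-stale entry
    # The next get_profile() call re-reads fresh data and re-caches it.

get_profile("42")
update_profile("42", "Grace")   # cached copy is invalidated, not served stale
```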
What is the main difference between 'write-through' and 'write-back' caching strategies?
- Write-through and write-back are terms used interchangeably.
- Write-through and write-back caching strategies are essentially the same.
- Write-through involves writing data to both the cache and the underlying storage immediately, while write-back involves writing to the cache first and updating the underlying storage at a later time.
- Write-through involves writing data to the cache only, while write-back involves writing data to both the cache and the underlying storage simultaneously.
The main difference is that 'write-through' immediately updates both the cache and the underlying storage, while 'write-back' involves updating the cache first and delaying the update to the underlying storage.
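A minimal sketch contrasting the two strategies, with plain dictionaries standing in for the cache and the underlying storage; the `dirty_keys` set and `flush_dirty_entries` helper are illustrative assumptions about how deferred write-back might be tracked.

```python
database: dict[str, str] = {}
cache: dict[str, str] = {}
dirty_keys: set[str] = set()     # entries written to cache but not yet to storage

def write_through(key: str, value: str) -> None:
    """Update the cache and the underlying storage in the same operation."""
    cache[key] = value
    database[key] = value        # storage is always in sync with the cache

def write_back(key: str, value: str) -> None:
    """Update the cache now; defer the write to the underlying storage."""
    cache[key] = value
    dirty_keys.add(key)          # remember to flush later

def flush_dirty_entries() -> None:
    """Run periodically (or on eviction) to push deferred writes to storage."""
    for key in list(dirty_keys):
        database[key] = cache[key]
        dirty_keys.discard(key)

write_back("counter", "41")
flush_dirty_entries()            # storage catches up with the cache
```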
In distributed systems, what is a cache stampede and how is it typically mitigated?
- A cache stampede is a networking issue that results in delayed cache updates.
- A cache stampede is a planned event in which caches are forcibly cleared and refreshed simultaneously.
- A cache stampede is an outdated concept and is no longer relevant in modern distributed systems.
- A cache stampede is when multiple processes or nodes simultaneously attempt to load the same cache entry that is currently not in the cache.
A cache stampede occurs when many processes or nodes try to regenerate the same missing cache entry at the same time, overloading the backing store. It is typically mitigated by letting only one process rebuild the entry, using a lock or request coalescing, while the others wait or serve slightly stale data, or by refreshing entries probabilistically before they expire.
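A minimal sketch of the lock-based mitigation, assuming a single process with multiple threads and a hypothetical `expensive_recompute` function; distributed setups typically use a shared lock held in the cache store itself, or request coalescing, instead.

```python
import threading
import time

cache: dict[str, str] = {}
rebuild_lock = threading.Lock()

def expensive_recompute(key: str) -> str:
    time.sleep(0.1)                     # stand-in for a slow database query
    return f"fresh-{key}"

def get(key: str) -> str:
    value = cache.get(key)
    if value is not None:
        return value
    # Only one thread regenerates the entry; the others wait, then reuse it.
    with rebuild_lock:
        value = cache.get(key)          # re-check: another thread may have filled it
        if value is None:
            value = expensive_recompute(key)
            cache[key] = value
    return value

threads = [threading.Thread(target=get, args=("report",)) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# expensive_recompute ran once, not once per thread
```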