In caching, __________ refers to the technique of dynamically adjusting the cache size based on current system load.
- Adaptive Caching
- Dynamic Caching
- Incremental Caching
- Static Caching
Adaptive Caching involves dynamically adjusting the cache size based on the current system load, allowing for optimal performance under varying conditions.
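The idea above can be sketched in a few lines. This is a minimal illustration, not a standard library API: the `load_fn` callback and the linear shrink-with-load rule are assumptions made for this example.

```python
class AdaptiveCache:
    """Toy cache whose capacity shrinks as system load rises.

    `load_fn` is an assumed callback returning current load in [0.0, 1.0];
    the linear scaling rule is illustrative only.
    """

    def __init__(self, base_capacity, load_fn):
        self.base_capacity = base_capacity
        self.load_fn = load_fn
        self.store = {}

    def capacity(self):
        # Heavy load -> keep the cache small to free memory;
        # light load -> allow it to grow toward the full base size.
        return max(1, int(self.base_capacity * (1.0 - self.load_fn())))

    def put(self, key, value):
        self.store[key] = value
        # Evict oldest-inserted entries until we fit the current capacity.
        while len(self.store) > self.capacity():
            self.store.pop(next(iter(self.store)))

    def get(self, key):
        return self.store.get(key)
```

With a constant load of 0.5 and a base capacity of 10, the effective capacity is 5, so inserting 10 items leaves only the 5 most recent.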
In the context of web applications, what does 'cache invalidation' refer to?
- Clearing the browser cache
- Encrypting cached data
- Refreshing cached data
- Storing data in cache
'Cache invalidation' in the context of web applications refers to the process of refreshing or clearing cached data to ensure that users receive the most up-to-date information from the server.
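A minimal sketch of explicit invalidation, assuming a `loader` callback standing in for a fetch from the server:

```python
class InvalidatingCache:
    def __init__(self, loader):
        self.loader = loader  # assumed callback: fetches fresh data from the source
        self.store = {}

    def get(self, key):
        if key not in self.store:
            self.store[key] = self.loader(key)  # cache miss: load fresh data
        return self.store[key]

    def invalidate(self, key):
        # Drop the stale entry so the next get() refreshes from the source.
        self.store.pop(key, None)
```

After `invalidate(key)`, the next `get(key)` goes back to the loader instead of serving the cached copy.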
What is the main difference between 'write-through' and 'write-back' caching strategies?
- Write-through and write-back are terms used interchangeably.
- Write-through and write-back caching strategies are essentially the same.
- Write-through involves writing data to both the cache and the underlying storage immediately, while write-back involves writing to the cache first and updating the underlying storage at a later time.
- Write-through involves writing data to the cache only, while write-back involves writing data to both the cache and the underlying storage simultaneously.
The main difference is that 'write-through' immediately updates both the cache and the underlying storage, while 'write-back' involves updating the cache first and delaying the update to the underlying storage.
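The contrast can be made concrete with two small sketches (the `storage` dict stands in for the underlying store; these are illustrations, not production implementations):

```python
class WriteThroughCache:
    def __init__(self, storage):
        self.storage = storage  # underlying store (a dict here)
        self.cache = {}

    def write(self, key, value):
        # Cache and backing store are updated in the same operation.
        self.cache[key] = value
        self.storage[key] = value


class WriteBackCache:
    def __init__(self, storage):
        self.storage = storage
        self.cache = {}
        self.dirty = set()  # keys written to cache but not yet persisted

    def write(self, key, value):
        # Only the cache is updated now; persistence is deferred.
        self.cache[key] = value
        self.dirty.add(key)

    def flush(self):
        # Later, persist all dirty entries to the backing store.
        for key in self.dirty:
            self.storage[key] = self.cache[key]
        self.dirty.clear()
```

After a write-through write, the backing store is already current; after a write-back write, it stays stale until `flush()` runs.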
In distributed systems, what is a cache stampede and how is it typically mitigated?
- A cache stampede is a networking issue that results in delayed cache updates.
- A cache stampede is a planned event in which caches are forcibly cleared and refreshed simultaneously.
- A cache stampede is an outdated concept and is no longer relevant in modern distributed systems.
- A cache stampede is when multiple processes or nodes simultaneously attempt to load the same cache entry that is currently not in the cache.
A cache stampede occurs when many processes miss the same cache entry at once and all try to regenerate it, overwhelming the backing store. It is typically mitigated with locking or semaphore mechanisms that allow only one process to regenerate the entry while the others wait for the result.
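A lock-based mitigation can be sketched as follows; the double-check inside the lock ensures that, even if several threads miss at once, only the first one calls the (assumed) `loader` and the rest reuse its result:

```python
import threading

class StampedeSafeCache:
    def __init__(self, loader):
        self.loader = loader  # assumed callback: expensive regeneration
        self.store = {}
        self.lock = threading.Lock()

    def get(self, key):
        value = self.store.get(key)
        if value is not None:
            return value
        with self.lock:
            # Re-check: another thread may have filled the entry
            # while we were waiting for the lock.
            if key not in self.store:
                self.store[key] = self.loader(key)
            return self.store[key]
```

A single global lock keeps the sketch short; a real system would typically lock per key so unrelated misses do not serialize.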
How does 'cache eviction' differ from 'cache invalidation' in caching strategies?
- Remove all items
- Remove least frequently used items
- Remove least recently used items
- Remove outdated items
In caching strategies, 'cache eviction' removes entries to stay within capacity limits, typically the least recently used or least frequently used items, while 'cache invalidation' removes entries because they have become outdated or incorrect.
In caching strategies, what is meant by 'cache warming'?
- Caching based on user preferences
- Clearing cache to improve performance
- Dynamically adjusting cache size
- Pre-loading cache with frequently used items
'Cache warming' in caching strategies refers to pre-loading the cache with frequently used items, optimizing performance by ensuring that the cache is populated with data that is likely to be requested.
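A warming step can be as simple as the sketch below; `loader` and `hot_keys` are assumptions for this example, standing in for a fetch from the slow source and a list of items expected to be requested most often:

```python
def warm_cache(cache, loader, hot_keys):
    """Pre-populate the cache with known-hot keys before traffic arrives."""
    for key in hot_keys:
        cache[key] = loader(key)  # pay the load cost up front
    return cache
```

Run at startup or after a deployment, this ensures the first real requests hit a warm cache instead of all missing at once.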
Which Java interface is typically used for creating custom log messages in a servlet?
- ServletConfig
- ServletContext
- ServletLogger
- ServletRequest
The ServletContext interface is typically used for creating custom log messages in a servlet, providing methods for logging information that can be accessed across the servlet's entire application context.
_________ is a caching technique in which recently accessed data is prioritized for caching.
- FIFO (First In, First Out)
- LRU (Least Recently Used)
- Optimal Replacement
- Random Replacement
LRU (Least Recently Used) is a caching technique that prioritizes recently accessed data: when the cache is full, the entry that has gone unused the longest is evicted, keeping the most relevant data in the cache.
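A compact LRU cache can be built on `collections.OrderedDict`, which remembers insertion order and supports moving a key to the end:

```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, key):
        if key not in self.store:
            return None
        self.store.move_to_end(key)  # mark as most recently used
        return self.store[key]

    def put(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used
```

With capacity 2, inserting a third key evicts whichever of the first two was touched least recently.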
The strategy of storing only the differences from the main data set in cache is known as __________ caching.
- Delta
- Differential
- Incremental
- Patch
Differential caching involves storing only the differences (or changes) from the main data set in the cache, reducing storage requirements and improving cache efficiency.
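The idea can be illustrated with plain dicts; this sketch only handles added or changed entries, not deletions, which a real differential scheme would also need to encode:

```python
def diff_against_base(base, current):
    """Return only the entries of `current` that differ from `base`."""
    return {k: v for k, v in current.items() if base.get(k) != v}

def apply_diff(base, delta):
    """Reconstruct the full data set from the base plus the cached delta."""
    merged = dict(base)
    merged.update(delta)
    return merged
```

Only the small delta needs to live in the cache; the full data set is rebuilt on demand from the shared base.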
A web application implements a caching layer to reduce database load. Over time, the cache starts serving stale data. What caching strategy should be implemented to resolve this?
- Eviction Policies
- Lazy Loading
- Time-to-Live (TTL) caching
- Write-Through Caching
Implementing Time-to-Live (TTL) caching allows data to be cached for a specific duration, after which it is considered stale and refreshed, resolving the issue of serving stale data over time.
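A TTL cache can be sketched by storing an expiry timestamp next to each value; on read, an expired entry is dropped and reported as a miss so the caller refetches fresh data:

```python
import time

class TTLCache:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (value, expiry timestamp)

    def put(self, key, value):
        self.store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key):
        entry = self.store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self.store[key]  # entry is stale: drop it and report a miss
            return None
        return value
```

This lazy-expiry design checks freshness only on access; a background sweep could additionally reclaim memory held by entries that are never read again.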