Which feature of Hadoop ensures data redundancy and fault tolerance?
- Compression
- Partitioning
- Replication
- Shuffling
Replication is the Hadoop feature that provides data redundancy and fault tolerance. HDFS replicates each data block across multiple nodes in the cluster (three copies by default), so the failure of a single node does not cause data loss, which improves the system's overall reliability.
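For illustration, here is a minimal sketch using Hadoop's Java FileSystem API to inspect and change the replication factor of a single HDFS file. The path /data/example.txt and the target factor of 5 are hypothetical values chosen for the example.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReplicationExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // dfs.replication sets the default replication factor for new files (3 unless overridden)
        conf.set("dfs.replication", "3");

        FileSystem fs = FileSystem.get(conf);
        Path file = new Path("/data/example.txt"); // hypothetical HDFS path

        // Read the replication factor currently recorded for this file
        FileStatus status = fs.getFileStatus(file);
        System.out.println("Current replication factor: " + status.getReplication());

        // Request a higher replication factor; the NameNode schedules the
        // additional block copies asynchronously across other DataNodes
        fs.setReplication(file, (short) 5);

        fs.close();
    }
}
```

The same change can also be made from the command line with `hdfs dfs -setrep`, which is often more convenient for one-off adjustments than the Java API.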
Related Quiz Questions
- In a custom MapReduce job, what determines the number of Mappers that will be executed?
- In Apache Flume, the ____ is used to extract data from various data sources.
- In a scenario where data skew is impacting a MapReduce job's performance, what strategy can be employed for more efficient processing?
- What is the significance of the 'key-value pair' in Hadoop Streaming API's data processing?
- In the Hadoop ecosystem, ____ is used to enhance batch processing efficiency through resource optimization.