Which Hadoop feature ensures data processing continuity in the event of a DataNode failure?

  • Checkpointing
  • Data Replication
  • Redundancy
  • Secondary NameNode
Data Replication is the Hadoop feature that ensures data processing continuity in the event of a DataNode failure. HDFS splits each file into blocks and replicates every block across multiple DataNodes (three copies by default). If a DataNode fails, the NameNode directs reads and processing tasks to another node holding a replica, so the job continues without interruption, and the missing replicas are re-created on healthy nodes to restore the configured replication factor.
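The replication factor can be inspected and adjusted per file through the HDFS client API. The sketch below is a minimal, hedged example: it assumes a reachable HDFS cluster, the standard Hadoop client libraries on the classpath, and a hypothetical file path used purely for illustration.

  // Minimal sketch: query and adjust the replication factor of an HDFS file.
  // Assumes core-site.xml / hdfs-site.xml are on the classpath; the path is hypothetical.
  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.fs.FileSystem;
  import org.apache.hadoop.fs.Path;

  public class ReplicationCheck {
      public static void main(String[] args) throws Exception {
          Configuration conf = new Configuration();   // loads cluster settings from the classpath
          FileSystem fs = FileSystem.get(conf);

          Path file = new Path("/data/example.txt");  // hypothetical file, for illustration only
          short current = fs.getFileStatus(file).getReplication();
          System.out.println("Current replication factor: " + current);

          // Ensure three copies exist so the blocks survive a single DataNode failure.
          fs.setReplication(file, (short) 3);

          fs.close();
      }
  }

Raising the replication factor increases fault tolerance at the cost of extra storage, which is why the HDFS default of three copies is the usual trade-off.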