During data loading in Hadoop, what mechanism ensures data integrity across the cluster?

  • Checksums
  • Compression
  • Encryption
  • Replication
Checksums are the mechanism Hadoop uses during data loading to ensure data integrity across the cluster. When data is written to HDFS, a checksum is computed for each chunk of data (every 512 bytes by default, controlled by the bytes-per-checksum setting), and these checksums are stored alongside the data. When the data is later read, the checksums are recomputed and compared against the stored values; a mismatch flags the block as corrupt, and HDFS can then serve the data from a healthy replica. Replication provides redundancy and encryption provides confidentiality, but it is the checksum verification that detects corruption and keeps stored data reliable.
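To make the mechanism concrete, here is a minimal, self-contained Java sketch of per-chunk checksumming in the spirit of HDFS. It is an illustration, not Hadoop's actual implementation: it uses the standard-library CRC32 (recent HDFS versions default to CRC-32C), and the class name and the 512-byte chunk size mirroring the HDFS default are assumptions for the example.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.zip.CRC32;

// Illustrative only: computes one checksum per fixed-size chunk of a file,
// similar in spirit to HDFS, which checksums every N bytes of a block
// (512 by default) at write time and re-verifies them at read time.
public class ChunkChecksum {
    private static final int BYTES_PER_CHECKSUM = 512; // mirrors the HDFS default

    public static void main(String[] args) throws IOException {
        byte[] data = Files.readAllBytes(Paths.get(args[0])); // file to checksum

        // One checksum per chunk. On read, HDFS recomputes each chunk's
        // checksum and compares it to the stored value; a mismatch signals
        // corruption, and the client falls back to another replica.
        for (int offset = 0; offset < data.length; offset += BYTES_PER_CHECKSUM) {
            int len = Math.min(BYTES_PER_CHECKSUM, data.length - offset);
            CRC32 crc = new CRC32();
            crc.update(data, offset, len);
            System.out.printf("chunk @%d (%d bytes): checksum=%08x%n",
                    offset, len, crc.getValue());
        }
    }
}
```

Checksumming per chunk rather than per whole block is a deliberate design choice: it localizes corruption to a small region, so a single flipped bit does not invalidate an entire 128 MB block.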