When planning for disaster recovery, how should a Hadoop administrator prioritize data in different HDFS directories?

  • Prioritize based on access frequency
  • Prioritize based on creation date
  • Prioritize based on file size
  • Prioritize based on replication factor
A Hadoop administrator should prioritize data in different HDFS directories based on the replication factor. Directories holding critical data should be assigned a higher replication factor so that copies survive node failures, and those directories should be restored first during disaster recovery since they represent the data the organization has already marked as most important.
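As a practical sketch, the standard `hdfs dfs` CLI can inspect and raise the replication factor on a directory tree. The paths below (`/data/critical`, `/data/scratch`) are hypothetical examples, and the commands assume a running HDFS cluster:

```shell
# Check the current replication factor of files in a directory
# (the second column of the listing shows the replication factor)
hdfs dfs -ls /data/critical

# Raise replication to 5 for all files under the critical directory
# -w waits until replication completes; -R applies recursively
hdfs dfs -setrep -w -R 5 /data/critical

# Lower replication for low-priority scratch data to save space
hdfs dfs -setrep -R 2 /data/scratch
```

Note that `-setrep` changes replication only for existing files; new files inherit the cluster-wide default (`dfs.replication` in `hdfs-site.xml`) unless the writer specifies otherwise.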