Which component of the Hadoop ecosystem is primarily used for distributed data storage?

  • HDFS (Hadoop Distributed File System)
  • Apache Spark
  • MapReduce
  • Hive
HDFS (Hadoop Distributed File System) is the primary component in the Hadoop ecosystem for distributed data storage. It stores large files by splitting them into fixed-size blocks distributed across multiple machines, and it replicates each block on several nodes to provide durability and fault tolerance.
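To illustrate the idea (this is a toy sketch, not Hadoop's actual API): HDFS splits a file into fixed-size blocks (128 MB by default) and places each block on several datanodes (replication factor 3 by default). The node names, block size, and placement policy below are simplified assumptions, scaled down for demonstration.

```python
# Toy sketch of HDFS-style block storage -- illustration only, not Hadoop code.
# HDFS splits files into fixed-size blocks and stores each block on multiple
# datanodes so that losing one machine does not lose data.

BLOCK_SIZE = 8                                   # bytes; HDFS default is 128 MB
REPLICATION = 3                                  # HDFS default replication factor
DATANODES = ["node1", "node2", "node3", "node4"] # hypothetical cluster nodes

def put_file(data: bytes) -> dict:
    """Split data into blocks and assign each block to REPLICATION datanodes."""
    placement = {}
    for offset in range(0, len(data), BLOCK_SIZE):
        block_id = offset // BLOCK_SIZE
        # Simple round-robin placement: each block lands on 3 distinct nodes.
        nodes = [DATANODES[(block_id + r) % len(DATANODES)]
                 for r in range(REPLICATION)]
        placement[block_id] = {"data": data[offset:offset + BLOCK_SIZE],
                               "nodes": nodes}
    return placement

layout = put_file(b"hello distributed world!")
for block_id, info in layout.items():
    print(block_id, info["nodes"])
```

Because every block lives on three nodes, any single datanode can fail and the full file can still be reassembled from the surviving replicas.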