In Hadoop, which framework is traditionally used for batch processing?

  • Apache Flink
  • Apache Hadoop MapReduce
  • Apache Spark
  • Apache Storm

In Hadoop, the traditional framework used for batch processing is Apache Hadoop MapReduce. It is a programming model and execution engine for processing large datasets in parallel across a distributed cluster: a job reads its input (typically from HDFS), transforms records into key-value pairs in a Map phase, aggregates values by key in a Reduce phase, and writes the results back. This read-process-write cycle suits high-throughput batch workloads rather than the low-latency stream processing that frameworks like Flink or Storm target.
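
As an illustration of the model, here is a minimal sketch of the canonical WordCount job using the standard org.apache.hadoop.mapreduce API. The input and output paths are assumed to be HDFS directories passed on the command line.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: emit (word, 1) for every word in an input line.
  public static class TokenizerMapper
      extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer tokens = new StringTokenizer(value.toString());
      while (tokens.hasMoreTokens()) {
        word.set(tokens.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce phase: sum the counts emitted for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable v : values) {
        sum += v.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);  // optional local pre-aggregation
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));    // input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1]));  // output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Packaged into a jar (the name here is hypothetical), it would run as a batch job with `hadoop jar wordcount.jar WordCount /input /output`, and the framework handles splitting the input, scheduling map and reduce tasks across the cluster, and shuffling intermediate key-value pairs between the two phases.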