In Hadoop, which framework is traditionally used for batch processing?
- Apache Flink
- Apache Hadoop MapReduce
- Apache Spark
- Apache Storm
In Hadoop, the traditional framework for batch processing is Apache Hadoop MapReduce. It is a programming model and execution engine that processes large datasets in parallel across a distributed cluster, splitting the work into a map phase and a reduce phase.
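To make the model concrete, below is a minimal sketch of the classic word-count batch job written against Hadoop's `org.apache.hadoop.mapreduce` API. The class names and input/output paths are illustrative; a real job would be packaged into a JAR and submitted to the cluster.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: emit (word, 1) for every token in each input line.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce phase: sum the counts emitted for each word across all mappers.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // local pre-aggregation on each mapper
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Once compiled into a JAR, a job like this would typically be submitted with something along the lines of `hadoop jar wordcount.jar WordCount /input /output`, where the framework handles splitting the input, scheduling map and reduce tasks across the cluster, and shuffling intermediate (word, count) pairs between the two phases.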