What is a common optimization approach for transforming large datasets in ETL pipelines?

  • Batch processing
  • Data denormalization
  • Data normalization
  • Stream processing
Batch processing is a common optimization approach for transforming large datasets in ETL pipelines. Instead of handling records one at a time, data is processed in discrete batches, which keeps memory usage bounded, amortizes per-operation overhead, and improves resource utilization and throughput.
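
As a minimal sketch of the idea in Python (the file names, the `amount` column, and the transform logic are hypothetical), pandas' `chunksize` option can stream a large CSV through the transform step in fixed-size batches instead of loading the whole file into memory:

```python
import pandas as pd

BATCH_SIZE = 50_000  # rows per batch; tune to available memory


def transform(batch: pd.DataFrame) -> pd.DataFrame:
    """Hypothetical transform step: drop bad rows, derive a column."""
    batch = batch.dropna(subset=["amount"])        # "amount" is an assumed column
    batch["amount_usd"] = batch["amount"] * 1.08   # assumed conversion rate
    return batch


# Extract -> Transform -> Load, one batch at a time.
# Each chunk is a DataFrame of at most BATCH_SIZE rows, so peak memory
# stays roughly constant regardless of the input file's total size.
with open("output.csv", "w") as out:
    for i, chunk in enumerate(pd.read_csv("input.csv", chunksize=BATCH_SIZE)):
        transformed = transform(chunk)
        # Write the CSV header only for the first batch.
        transformed.to_csv(out, index=False, header=(i == 0))
```

The same pattern applies whatever the source and sink are (database cursor, object store, message queue): pull a bounded batch, transform it, load it, and repeat until the input is exhausted.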