A data-cleaning method in which data points falling outside a set number of standard deviations, or outside a defined range, are removed is called _______.

  • Data Normalization
  • Data Refinement
  • Data Standardization
  • Outlier Handling
Removing data points that fall outside a set number of standard deviations or a defined range is known as outlier handling. Outliers can distort statistical summaries and skew model results, so detecting and removing (or otherwise treating) them is a standard data-cleaning step.
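As a rough illustration, here is a minimal Python sketch of the standard-deviation rule; the data, the `remove_outliers` helper, and the cutoff `k` are all invented for this example.

```python
import statistics

def remove_outliers(values, k=2.0):
    """Drop points more than k standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) <= k * stdev]

# A single extreme point (95) inflates both the mean and the stdev,
# so a fairly tight cutoff is used here.
data = [10, 12, 11, 13, 12, 11, 95]
print(remove_outliers(data))  # [10, 12, 11, 13, 12, 11]
```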

In the context of data warehousing, what does the ETL process stand for?

  • Efficient Transfer Logic
  • Enhanced Table Lookup
  • Extract, Transfer, Load
  • Extract, Transform, Load
In data warehousing, ETL stands for "Extract, Transform, Load." This process involves extracting data from source systems, transforming it into a suitable format, and loading it into the data warehouse. Transformation includes data cleansing, validation, and structuring for analytical purposes.
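To make the three steps concrete, here is a toy ETL sketch in Python; the source rows, field names, and the in-memory SQLite "warehouse" are all invented for illustration.

```python
import sqlite3

# Extract: pull raw rows from a source system (here, a hard-coded sample).
raw_orders = [
    {"id": 1, "amount": "19.99", "region": " north "},
    {"id": 2, "amount": "5.00",  "region": "SOUTH"},
]

# Transform: cleanse and standardize (cast types, trim and normalize text).
clean_orders = [
    (row["id"], float(row["amount"]), row["region"].strip().lower())
    for row in raw_orders
]

# Load: write the transformed rows into the warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, region TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean_orders)
print(conn.execute("SELECT * FROM orders").fetchall())
```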

In predictive analytics, what method involves creating a model to forecast future values based on historical data?

  • Descriptive Analytics
  • Diagnostic Analytics
  • Prescriptive Analytics
  • Time Series Forecasting
Time series forecasting is a predictive analytics method that focuses on modeling and forecasting future values based on historical time-ordered data. It is commonly used in various fields, including finance, economics, and demand forecasting.
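As a minimal sketch, the example below fits a simple linear trend to historical values and extrapolates it forward; real-world forecasts typically use richer models (ARIMA, exponential smoothing, and so on), and the sales figures here are invented.

```python
import numpy as np

# Monthly sales history (illustrative numbers).
sales = np.array([100, 104, 109, 113, 118, 122])
t = np.arange(len(sales))

# Fit a simple linear trend to the historical data.
slope, intercept = np.polyfit(t, sales, deg=1)

# Forecast the next three periods by extrapolating the trend.
future_t = np.arange(len(sales), len(sales) + 3)
forecast = slope * future_t + intercept
print(forecast.round(1))
```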

The methodology that emphasizes a phased approach to deploying ERP solutions, where each phase is a stepping stone for the next, is called _______.

  • Agile Approach
  • Incremental Approach
  • Iterative Approach
  • Waterfall Approach
This phased methodology is called the "Incremental Approach." The ERP solution is deployed in stages, with each phase building on the one before it, which keeps the implementation structured and manageable and reduces the risk of a single large-scale rollout.

An e-commerce company is designing a data model for their sales. They have measures like "Total Sales" and "Number of Items Sold." They want to analyze these measures based on categories like "Product Type," "Brand," and "Region." Which elements in their model would "Product Type," "Brand," and "Region" be considered as?

  • Aggregations
  • Dimensions
  • Fact Tables
  • Measures
"Product Type," "Brand," and "Region" are considered dimensions in the data model. Dimensions are attributes used for analyzing and categorizing data, while measures (like "Total Sales" and "Number of Items Sold") represent the numeric values to be analyzed.

A data scientist notices that a dataset has a few values that are significantly higher than the others, skewing the results. What transformation technique might they consider to stabilize the variances?

  • Log Transformation
  • Min-Max Scaling
  • Outlier Removal
  • Standardization (Z-score normalization)
When a few extreme values skew the results, a log transformation is a valuable technique: it compresses the range of values, reining in the extremes and stabilizing the variance. It is particularly useful for heavily right-skewed data, such as many financial measurements.
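A quick sketch of the effect, using invented right-skewed values; `log1p` (the log of 1 + x) is used here so that zeros, if present, are handled safely.

```python
import numpy as np

# Right-skewed data: a few values are orders of magnitude larger.
values = np.array([3.0, 5.0, 8.0, 12.0, 4500.0, 9800.0])

# The log transform compresses the range, pulling extreme values
# much closer to the rest of the data.
transformed = np.log1p(values)
print(transformed.round(2))  # [1.39 1.79 2.2  2.56 8.41 9.19]
```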

An organization wants to update its data warehouse with daily sales data. The sales data is vast, but only a small portion changes daily. Which data load approach would be most efficient?

  • Full Load
  • Incremental Load
  • Real-time Load
  • Snapshot Load
For daily sales updates where only a small portion of the data changes, an incremental load is the most efficient approach: only the new or changed rows are loaded, which cuts processing time and system resource use compared with a full load of the entire dataset.
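One common way to implement this is a "high-watermark" pattern: keep the timestamp of the last successful load and pick up only rows modified since then. The sketch below uses invented rows and a hard-coded watermark purely for illustration.

```python
from datetime import date

# Hypothetical source rows with a last-modified date (illustrative only).
source_rows = [
    {"id": 1, "amount": 10.0, "modified": date(2024, 1, 1)},
    {"id": 2, "amount": 25.0, "modified": date(2024, 1, 5)},
    {"id": 3, "amount": 40.0, "modified": date(2024, 1, 5)},
]

last_load = date(2024, 1, 4)  # watermark from the previous run

# Incremental load: pick up only rows changed since the last run.
delta = [r for r in source_rows if r["modified"] > last_load]
print(f"Loading {len(delta)} of {len(source_rows)} rows")  # 2 of 3
```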

Which OLAP operation involves viewing the data cube by selecting two dimensions and excluding the others?

  • Dicing
  • Drilling
  • Pivoting
  • Slicing
In OLAP (Online Analytical Processing), selecting values along two (or more) dimensions while excluding the others is known as "Dicing." The dice operation produces a sub-cube, letting you focus on the intersection of the chosen dimensions.
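Illustrated with a tiny pandas "cube" (all values invented): dicing restricts the cube to chosen values along two of its dimensions, producing a smaller sub-cube.

```python
import pandas as pd

# A small "cube" with three dimensions, flattened into rows.
cube = pd.DataFrame({
    "region":  ["East", "East", "East", "East", "West", "West", "West", "West"],
    "quarter": ["Q1",   "Q1",   "Q2",   "Q2",   "Q1",   "Q1",   "Q2",   "Q2"],
    "product": ["A",    "B",    "A",    "B",    "A",    "B",    "A",    "B"],
    "sales":   [100,    150,    120,    160,    200,    250,    210,    260],
})

# Dice: select values along two dimensions (region and quarter),
# producing a sub-cube restricted to that intersection.
sub_cube = cube[(cube["region"] == "East") & (cube["quarter"] == "Q1")]
print(sub_cube)
```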

Which of the following is NOT typically a function of ETL tools?

  • Data Analysis
  • Data Extraction
  • Data Loading
  • Data Transformation
ETL tools are responsible for data extraction, transformation, and loading; data analysis is typically not one of their functions. Analysis is performed with BI (Business Intelligence) tools or other analytics platforms after the data has been loaded into the data warehouse.

Which schema design is characterized by a central fact table surrounded by dimension tables?

  • Hierarchical Schema
  • Relational Schema
  • Snowflake Schema
  • Star Schema
A Star Schema is characterized by a central fact table that contains numerical performance measures (facts) and is surrounded by dimension tables that describe the dimensions associated with the facts. This design is commonly used in data warehousing to simplify query performance and reporting.
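A small sketch of the shape, using pandas DataFrames with invented data to stand in for warehouse tables: one central fact table holding measures and foreign keys, joined out to its dimension tables.

```python
import pandas as pd

# Dimension tables describe the context of each fact.
dim_product = pd.DataFrame({"product_id": [1, 2], "product_type": ["Laptop", "Phone"]})
dim_region  = pd.DataFrame({"region_id": [10, 20], "region": ["East", "West"]})

# The central fact table holds the numeric measures plus foreign keys
# pointing at each dimension -- the "points" of the star.
fact_sales = pd.DataFrame({
    "product_id":  [1, 1, 2],
    "region_id":   [10, 20, 10],
    "total_sales": [1200.0, 950.0, 600.0],
})

# A typical star-schema query: join facts to dimensions, then aggregate.
report = (fact_sales
          .merge(dim_product, on="product_id")
          .merge(dim_region, on="region_id")
          .groupby(["product_type", "region"])["total_sales"].sum())
print(report)
```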