In a Convolutional Neural Network (CNN), what operation involves reducing the spatial dimensions of the input?
- Pooling (subsampling)
- Convolution
- Batch Normalization
- Activation Function
Pooling (subsampling) is used in CNNs to reduce the spatial dimensions (height and width) of the feature maps, allowing the network to keep the most salient activations while discarding fine positional detail. It also lowers computational cost and helps reduce overfitting.
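As a minimal sketch of the idea (plain NumPy, not tied to any framework), 2x2 max pooling halves the height and width of a feature map by keeping only the largest value in each window:

```python
import numpy as np

def max_pool_2x2(feature_map: np.ndarray) -> np.ndarray:
    """Downsample a 2D feature map by taking the max of each 2x2 window."""
    h, w = feature_map.shape
    trimmed = feature_map[: h - h % 2, : w - w % 2]  # drop odd edges
    # Group into (h/2, 2, w/2, 2) blocks, then reduce each block to its max.
    return trimmed.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

fm = np.arange(16).reshape(4, 4)  # toy 4x4 "feature map"
print(max_pool_2x2(fm))           # 2x2 output: spatial dims halved
```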
How does Spark achieve faster data processing compared to traditional MapReduce?
- By using in-memory processing
- By executing tasks sequentially
- By running on a single machine
- By using persistent storage for intermediate data
Apache Spark achieves faster processing through in-memory computation. Where traditional MapReduce writes intermediate results to disk between stages, Spark caches intermediate data in memory, cutting I/O and dramatically speeding up iterative and multi-stage jobs. This in-memory model is one of Spark's key performance features.
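As an illustrative sketch (PySpark; the file path and the `status` column are hypothetical), caching a DataFrame keeps it in executor memory so repeated actions avoid re-reading from disk:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("cache-demo").getOrCreate()

df = spark.read.parquet("events.parquet")  # hypothetical input

# cache() materializes the DataFrame in memory on the first action,
# so subsequent actions reuse it instead of re-reading from disk.
df.cache()

print(df.count())                            # first action: reads, then caches
print(df.filter(df.status == "ok").count())  # reuses the in-memory copy
```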
EDA often starts with a _______ to get a summary of the main characteristics of a dataset.
- Scatter plot
- Hypothesis test
- Descriptive statistics
- Clustering algorithm
Exploratory Data Analysis (EDA) begins with descriptive statistics to understand the basic characteristics of a dataset, such as mean, median, and standard deviation. These statistics provide an initial overview of the data before diving into more complex analyses.
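For example, with pandas (the CSV file here is a hypothetical stand-in for your dataset), a single call surfaces most of these summary statistics:

```python
import pandas as pd

df = pd.read_csv("measurements.csv")  # hypothetical dataset

# describe() reports count, mean, std, min, quartiles (including the
# median), and max for each numeric column -- a quick first look.
print(df.describe())
```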
Which activation function is commonly used in the output layer of a binary classification neural network?
- ReLU (Rectified Linear Activation)
- Sigmoid Activation
- Tanh (Hyperbolic Tangent) Activation
- Softmax Activation
The Sigmoid activation function is commonly used in the output layer of a binary classification neural network. It maps the network's output to a probability between 0 and 1, making it suitable for binary classification tasks. The other activation functions are more commonly used in hidden layers or for other types of problems.
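A minimal NumPy sketch of the function itself: the sigmoid squashes any real-valued logit into (0, 1), which can then be thresholded for a class label (the example logits are made up):

```python
import numpy as np

def sigmoid(z: np.ndarray) -> np.ndarray:
    """sigma(z) = 1 / (1 + exp(-z)): maps any real logit into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

logits = np.array([-4.0, 0.0, 2.5])   # hypothetical raw output scores
probs = sigmoid(logits)               # approx. [0.018, 0.500, 0.924]
labels = (probs >= 0.5).astype(int)   # threshold at 0.5 -> [0, 1, 1]
print(probs, labels)
```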
What is one major drawback of using the sigmoid activation function in deep networks?
- Prone to vanishing gradient
- Limited to binary classification
- Efficiently handles negative values
- Non-smooth gradient behavior
One major drawback of the sigmoid activation function in deep networks is its susceptibility to the vanishing gradient problem. The function saturates for inputs far from zero, so its derivative (which peaks at only 0.25) becomes vanishingly small there; during backpropagation these small factors multiply across layers, shrinking early-layer gradients toward zero and slowing learning.
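The effect is easy to see numerically. This small sketch evaluates the sigmoid's derivative, s(z)(1 - s(z)), at increasingly extreme inputs:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_grad(z):
    s = sigmoid(z)
    return s * (1.0 - s)  # peaks at 0.25 when z = 0, decays toward 0

for z in [0.0, 2.0, 5.0, 10.0]:
    print(f"z = {z:>4}: gradient = {sigmoid_grad(z):.6f}")
# z =  0.0: gradient = 0.250000
# z = 10.0: gradient = 0.000045 -- multiplied across many layers,
# factors this small drive the early-layer updates toward zero.
```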
When normalizing a database in SQL, separating data into two tables and creating a new primary and foreign key relationship is part of the _______ normal form.
- First
- Second
- Third
- Fourth
When normalizing a database, creating a new primary and foreign key relationship by separating data into two tables is part of the Second Normal Form (2NF). 2NF eliminates partial dependencies and ensures that every non-key attribute is functionally dependent on the entire primary key. This is an essential step in achieving a fully normalized database.
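As an illustration only (pandas DataFrames standing in for SQL tables, with made-up column names), here is the kind of split 2NF calls for when `customer` depends on only part of the composite key `(order_id, product_id)`:

```python
import pandas as pd

# Denormalized table: the key is (order_id, product_id), but customer
# depends only on order_id -- a partial dependency that violates 2NF.
order_items = pd.DataFrame({
    "order_id":   [1, 1, 2],
    "product_id": ["A", "B", "A"],
    "customer":   ["Ada", "Ada", "Grace"],
    "quantity":   [2, 1, 5],
})

# 2NF split: order-level attributes get their own table keyed by
# order_id; the items table keeps order_id as a foreign key.
orders = order_items[["order_id", "customer"]].drop_duplicates()
items  = order_items[["order_id", "product_id", "quantity"]]
print(orders, items, sep="\n\n")
```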
In complex ETL processes, _________ can be used to ensure data quality and accuracy throughout the pipeline.
- Data modeling
- Data lineage
- Data profiling
- Data visualization
In complex ETL (Extract, Transform, Load) processes, data lineage is crucial for ensuring data quality and accuracy. Data lineage tracks where each piece of data originated and how it was transformed, keeping the data reliable and traceable throughout the pipeline.
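Real pipelines typically rely on dedicated lineage tools or metadata stores; purely as a toy sketch (all names here are invented), the core idea is that every transformation appends a record of what was done:

```python
from dataclasses import dataclass, field

@dataclass
class Dataset:
    """Toy wrapper carrying lineage metadata alongside the rows."""
    rows: list
    lineage: list = field(default_factory=list)

def transform(ds: Dataset, step: str, fn) -> Dataset:
    """Apply fn to the rows and record the step in the lineage."""
    return Dataset(rows=fn(ds.rows), lineage=ds.lineage + [step])

raw = Dataset(rows=[" 1", "2 ", "x"], lineage=["source: raw_export"])
clean = transform(raw, "strip whitespace", lambda rs: [r.strip() for r in rs])
nums = transform(clean, "drop non-numeric", lambda rs: [r for r in rs if r.isdigit()])

print(nums.rows)     # ['1', '2']
print(nums.lineage)  # every step that produced this data, in order
```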
What does the ROC in AUC-ROC stand for?
- Receiver Operating Characteristic
- Receiver Output Curve
- Rate of Change
- Receiver Optimization Criterion
AUC-ROC stands for Area Under the Receiver Operating Characteristic curve. The ROC curve is a graphical representation of a model's performance, particularly its ability to distinguish between the positive and negative classes. AUC (Area Under the Curve) quantifies the overall performance of the model, with higher AUC values indicating better discrimination.
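With scikit-learn (the labels and predicted scores below are made up for illustration), both the curve and its area are one call each:

```python
from sklearn.metrics import roc_auc_score, roc_curve

y_true  = [0, 0, 1, 1, 0, 1]               # hypothetical ground truth
y_score = [0.1, 0.4, 0.35, 0.8, 0.2, 0.9]  # predicted P(positive)

# The ROC curve traces true-positive rate vs. false-positive rate
# as the decision threshold varies; AUC summarizes it as one number.
fpr, tpr, thresholds = roc_curve(y_true, y_score)
print(f"AUC = {roc_auc_score(y_true, y_score):.3f}")
```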
The process of taking a pre-trained model and continuing to train its parameters on new, task-specific data is known as _______ in transfer learning.
- Fine-tuning
- Warm-starting
- Model augmentation
- Zero initialization
Fine-tuning in transfer learning starts from a pre-trained model's learned weights and continues training on new data, adjusting some or all of the parameters to suit the specific task. It is a common technique for adapting pre-trained models to custom tasks.
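A sketch of the usual recipe in PyTorch/torchvision (assuming torchvision >= 0.13 for the `weights` argument; the 5-class task is hypothetical): load pre-trained weights, freeze the backbone, and swap in a new head to train on the new data.

```python
import torch.nn as nn
from torchvision import models

# Load a backbone with pre-trained ImageNet weights.
model = models.resnet18(weights="IMAGENET1K_V1")

# Freeze the pre-trained layers so only the new head trains at first.
for param in model.parameters():
    param.requires_grad = False

# Replace the classification head for a hypothetical 5-class task.
model.fc = nn.Linear(model.fc.in_features, 5)

# Training the new head (and later unfreezing deeper layers at a
# lower learning rate) on the new dataset is the fine-tuning step.
```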
In data warehousing, _________ is a technique used to maintain the history of data changes.
- Data Extraction
- Data Transformation
- Data Loading
- Slowly Changing Dimensions (SCD)
Slowly Changing Dimensions (SCD) is a technique used in data warehousing to maintain the history of data changes. It allows the storage of historical data, which is essential for tracking changes and trends over time in a data warehouse.
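As a toy illustration of one common variant, SCD Type 2 (pandas standing in for the warehouse table; all values invented): a changed attribute closes the old row and inserts a new current one instead of overwriting history.

```python
import pandas as pd

# Customer dimension with SCD Type 2 bookkeeping columns.
dim_customer = pd.DataFrame({
    "customer_id": [42],
    "city":        ["Berlin"],
    "valid_from":  ["2023-01-01"],
    "valid_to":    ["9999-12-31"],  # open-ended = current row
    "is_current":  [True],
})

# Customer 42 moves to Munich on 2024-06-01: close the current row...
mask = (dim_customer["customer_id"] == 42) & dim_customer["is_current"]
dim_customer.loc[mask, ["valid_to", "is_current"]] = ["2024-05-31", False]

# ...and insert a new current row, preserving the full history.
new_row = {"customer_id": 42, "city": "Munich", "valid_from": "2024-06-01",
           "valid_to": "9999-12-31", "is_current": True}
dim_customer = pd.concat([dim_customer, pd.DataFrame([new_row])],
                         ignore_index=True)
print(dim_customer)  # both rows kept: full history of the city attribute
```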