For the k-NN algorithm, what could be a potential drawback of using a very large value of k?
- Increased Model Bias
- Increased Model Variance
- Overfitting to Noise
- Slower Training Time
A potential drawback of using a large value of 'k' in k-NN is increased model bias: averaging over many neighbors smooths the decision boundary toward the global majority class, so the model can underfit and miss local patterns. (Overfitting to noise is the opposite problem, caused by a very small 'k'.)
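A tiny pure-Python sketch can make the bias effect concrete. The data, query point, and `knn_predict` helper below are all hypothetical: class 'A' dominates globally, but a small local pocket of 'B' sits near the query. A small k follows the local pocket; a large k is swamped by the global majority.

```python
from collections import Counter

def knn_predict(train, query, k):
    """Classify `query` by majority vote among its k nearest training points (1-D feature)."""
    neighbors = sorted(train, key=lambda p: abs(p[0] - query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy data: 'A' is the broad majority; a local pocket of 'B' sits near x = 3
train = [(0, 'A'), (1, 'A'), (2, 'A'), (4, 'A'), (5, 'A'), (6, 'A'),
         (3, 'B'), (3.3, 'B'), (20, 'B')]

print(knn_predict(train, 3.1, k=1))  # small k follows the local pocket -> 'B'
print(knn_predict(train, 3.1, k=7))  # large k votes with the global majority -> 'A'
```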
Deep Q Networks (DQNs) are a combination of Q-learning and what other machine learning approach?
- Convolutional Neural Networks
- Recurrent Neural Networks
- Supervised Learning
- Unsupervised Learning
Deep Q Networks (DQNs) combine Q-learning with Convolutional Neural Networks (CNNs) to handle complex and high-dimensional state spaces.
What distinguishes autoencoders from other traditional neural networks in terms of their architecture?
- Autoencoders have an encoder and decoder
- Autoencoders use convolutional layers
- Autoencoders have more hidden layers
- Autoencoders don't use activation functions
Autoencoders have a distinct encoder-decoder architecture, enabling them to learn efficient representations of data and perform tasks like image denoising and compression.
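The encoder-decoder shape can be sketched as two linear maps with a narrow bottleneck in between. This is a minimal forward-pass sketch with random, untrained weights and hypothetical dimensions; real autoencoders train these weights to minimize reconstruction error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: an 8-D input squeezed through a 2-D bottleneck
input_dim, latent_dim = 8, 2

# Encoder and decoder as simple linear maps (real autoencoders stack nonlinear layers)
W_enc = rng.normal(size=(latent_dim, input_dim))
W_dec = rng.normal(size=(input_dim, latent_dim))

x = rng.normal(size=input_dim)   # one input sample
z = np.tanh(W_enc @ x)           # encoder: compress to the latent code
x_hat = W_dec @ z                # decoder: reconstruct from the code

print(z.shape, x_hat.shape)      # (2,) (8,)
```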
Consider a scenario where a drone is learning to navigate through a maze. Which reinforcement learning algorithm can be utilized to train the drone?
- Q-Learning
- A* Search
- Breadth-First Search
- Genetic Algorithm
Q-Learning is a reinforcement learning algorithm suitable for training the drone. It allows the drone to learn through exploration and exploitation, optimizing its path in the maze while considering rewards and penalties.
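A minimal sketch of Q-Learning on a toy "maze" (a 1-D corridor rather than a drone simulator; all states, rewards, and hyperparameters below are illustrative). The agent learns, through epsilon-greedy exploration, that moving right from every state reaches the goal.

```python
import random

random.seed(0)

# A 1-D "maze": states 0..4, goal at state 4; actions: 0 = left, 1 = right
N_STATES, GOAL = 5, 4
alpha, gamma, epsilon = 0.5, 0.9, 0.1
Q = [[0.0, 0.0] for _ in range(N_STATES)]

def step(s, a):
    s2 = max(0, s - 1) if a == 0 else min(N_STATES - 1, s + 1)
    reward = 1.0 if s2 == GOAL else 0.0   # reward only on reaching the goal
    return s2, reward

for _ in range(200):                      # training episodes
    s = 0
    while s != GOAL:
        # epsilon-greedy: mostly exploit the best-known action, sometimes explore
        a = random.randrange(2) if random.random() < epsilon else max((0, 1), key=lambda x: Q[s][x])
        s2, r = step(s, a)
        # Q-learning update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2

greedy_path = [max((0, 1), key=lambda a: Q[s][a]) for s in range(GOAL)]
print(greedy_path)   # learned policy: move right from every state -> [1, 1, 1, 1]
```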
Why is feature selection important in building machine learning models?
- All of the Above
- Enhances Model Interpretability
- Reduces Overfitting
- Speeds up Training
Feature selection is important for various reasons. It reduces overfitting by focusing on relevant features, speeds up training by working with fewer features, and enhances model interpretability by highlighting the most important factors affecting predictions.
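One simple form of feature selection is a variance threshold: features that barely vary carry little signal, so dropping them shrinks the model without losing much information. The matrix and threshold below are toy values for illustration.

```python
import numpy as np

# Toy design matrix: 4 samples x 3 features; the middle feature is nearly constant
X = np.array([[1.0, 5.0, 0.2],
              [2.0, 5.0, 0.9],
              [3.0, 5.1, 0.4],
              [4.0, 5.0, 0.7]])

variances = X.var(axis=0)
keep = variances > 0.01        # drop features that barely vary across samples
X_selected = X[:, keep]

print(keep)                    # [ True False  True] -- near-constant column removed
print(X_selected.shape)        # (4, 2)
```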
Which type of autoencoder is designed specifically for generating data that is similar but not identical to the training data?
- Variational Autoencoder
- Denoising Autoencoder
- Contractive Autoencoder
- Sparse Autoencoder
Variational Autoencoders (VAEs) learn a probability distribution over the latent space; sampling from that distribution and decoding yields new data points that resemble, but do not duplicate, the training data.
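The sampling step at the heart of a VAE is the reparameterization trick: draw noise from a standard normal and shift/scale it by the encoder's outputs. The `mu` and `log_var` values below are hypothetical stand-ins for what a trained encoder would produce.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend the encoder mapped an input to these latent-distribution parameters
mu = np.array([0.5, -1.0])        # mean of the latent Gaussian (hypothetical values)
log_var = np.array([0.1, 0.2])    # log-variance, as VAE encoders typically output

# Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, I)
eps = rng.normal(size=mu.shape)
z = mu + np.exp(0.5 * log_var) * eps

# Each sampled z decodes to a new point similar to, but not identical to, the data
print(z.shape)   # (2,)
```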
In Q-learning, the update rule involves a term known as the learning rate, represented by the symbol ________.
- Alpha
- Delta
- Sigma
- Theta
In Q-learning, the learning rate is represented by 'alpha.' It controls the step size for updates and impacts the convergence and stability of the learning algorithm.
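A single update step makes alpha's role visible; the numbers below are arbitrary illustrative values. A small alpha nudges the estimate toward the TD target, while alpha = 1 replaces it with the target outright.

```python
# One Q-learning update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
def q_update(q_sa, reward, max_q_next, alpha, gamma=0.9):
    td_target = reward + gamma * max_q_next
    return q_sa + alpha * (td_target - q_sa)

q_sa, reward, max_q_next = 0.0, 1.0, 0.5   # td_target = 1.0 + 0.9 * 0.5 = 1.45

print(q_update(q_sa, reward, max_q_next, alpha=0.1))  # small alpha: cautious step -> 0.145
print(q_update(q_sa, reward, max_q_next, alpha=1.0))  # alpha = 1: jump to the target -> 1.45
```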
A financial institution wants to predict whether a loan applicant is likely to default on their loan. They have a mix of numerical data (like income, age) and categorical data (like occupation, marital status). Which algorithm might be well-suited for this task due to its ability to handle both types of data?
- Random Forest
- Decision Tree
- Support Vector Machine
- k-Nearest Neighbors
The Random Forest algorithm is well-suited for this task because it can handle both numerical and categorical data effectively. It combines multiple decision trees and takes a vote to make predictions, making it robust and accurate for such mixed data.
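The voting step can be sketched with stub trees. The applicant record and the three "trees" below are hypothetical stand-ins (a real Random Forest grows its trees from bootstrapped data), but they show how each tree can split on numerical or categorical fields and how the forest aggregates by majority vote.

```python
from collections import Counter

# One loan applicant with mixed numerical and categorical fields
applicant = {"income": 42_000, "age": 29, "occupation": "teacher", "married": True}

# Three hypothetical decision trees, stubbed as plain functions
def tree_1(x): return "default" if x["income"] < 30_000 else "repay"
def tree_2(x): return "repay" if x["occupation"] in {"teacher", "engineer"} else "default"
def tree_3(x): return "repay" if x["age"] > 25 and x["married"] else "default"

votes = Counter(tree(applicant) for tree in (tree_1, tree_2, tree_3))
print(votes.most_common(1)[0][0])   # majority vote -> "repay"
```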
Which of the following RNN variants uses both a forget gate and an input gate to regulate the flow of information?
- LSTM (Long Short-Term Memory)
- GRU (Gated Recurrent Unit)
- Elman Network
- Jordan Network
The LSTM (Long Short-Term Memory) variant uses both a forget gate and an input gate to manage information flow. These gates allow it to control which information to forget or remember, making it highly effective in learning and retaining information over long sequences.
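The two gates combine in the cell-state update, which a few lines of NumPy can sketch. The weights here are random per-unit scalars for brevity; real LSTMs use full weight matrices over the concatenated [h_{t-1}, x_t] plus bias terms.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
hidden = 4

c_prev = rng.normal(size=hidden)         # previous cell state (random stand-in)
hx = rng.normal(size=hidden)             # stand-in for the combined [h_{t-1}, x_t] input

# Toy per-unit weights for the three transforms (real LSTMs use full matrices + biases)
w_f, w_i, w_c = rng.normal(size=(3, hidden))

f = sigmoid(w_f * hx)          # forget gate: how much of c_prev to keep
i = sigmoid(w_i * hx)          # input gate: how much new candidate info to admit
c_tilde = np.tanh(w_c * hx)    # candidate cell contents

c = f * c_prev + i * c_tilde   # new cell state blends old memory with gated new input
print(c.shape)                 # (4,)
```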
t-SNE is a technique primarily used for what kind of task in machine learning?
- Dimensionality Reduction
- Image Classification
- Anomaly Detection
- Reinforcement Learning
t-SNE (t-distributed Stochastic Neighbor Embedding) is primarily used for dimensionality reduction, reducing high-dimensional data to a lower-dimensional representation for visualization and analysis.
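A minimal usage sketch, assuming scikit-learn is installed: two well-separated 10-D clusters are embedded into 2-D, the typical setup before plotting. The cluster parameters and perplexity value are illustrative.

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
# Two well-separated 10-D clusters of 15 points each
X = np.vstack([rng.normal(0, 1, size=(15, 10)),
               rng.normal(8, 1, size=(15, 10))])

# perplexity must be smaller than the number of samples
X_2d = TSNE(n_components=2, perplexity=10, random_state=0).fit_transform(X)
print(X_2d.shape)   # (30, 2) -- one 2-D point per input sample
```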