Big Data technologies are primarily designed to handle data that exceeds the processing capability of _______ systems.
- Mainframe
- Personal computer
- Supercomputer
- Mobile device
Big Data technologies are specifically designed for data that exceeds the processing capabilities of traditional systems such as mainframes, personal computers, and mobile devices. These traditional systems are not equipped to efficiently process and analyze massive datasets, which is the focus of Big Data technologies.
Data that has some organizational properties, but not as strict as tables in relational databases, is termed _______ data.
- Unstructured Data
- Semi-Structured Data
- Raw Data
- Big Data
Data that has some organization but doesn't adhere to a strict tabular structure is known as "Semi-Structured Data." It includes data formats like JSON, XML, and others that have a certain level of structure.
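As a quick illustration, the snippet below parses a small, hypothetical JSON record with Python's standard `json` module; note how a nested object and a list coexist with scalar fields, unlike the fixed columns of a relational table:

```python
import json

# A hypothetical semi-structured record: fields can nest or repeat,
# and different records need not share the same schema.
raw = '{"name": "Ada", "tags": ["ml", "stats"], "address": {"city": "London"}}'

record = json.loads(raw)
name = record["name"]               # scalar field
city = record["address"]["city"]    # nested object field
first_tag = record["tags"][0]       # list field
```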
While preparing data for a machine learning model, you realize that the 'Height' column has some missing values. Upon closer inspection, you find that these missing values often correspond to records where the 'Age' column has values less than 1 year. What might be a reasonable way to handle these missing values?
- Impute missing values with the mean height
- Impute missing values with 0
- Leave missing values as they are
- Impute missing values based on 'Age'
In this case, it may be reasonable to leave the missing values as they are. Imputing with the mean height or with 0 would introduce bias, since infants' heights differ markedly from the rest of the population; imputing based on 'Age' should likewise be done carefully for the same reason. Depending on the context and dataset size, leaving the missing values untouched might be the best choice.
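To make the trade-off concrete, here is a minimal sketch (using made-up records) of quantifying the missingness pattern by age group before choosing a strategy:

```python
# Hypothetical records: 'height' tends to be missing when 'age' < 1.
records = [
    {"age": 0.5, "height": None},
    {"age": 0.8, "height": None},
    {"age": 30,  "height": 175.0},
    {"age": 25,  "height": 168.0},
]

# Count missing values per age group to confirm the pattern is not random.
missing_infant = sum(1 for r in records if r["height"] is None and r["age"] < 1)
missing_adult  = sum(1 for r in records if r["height"] is None and r["age"] >= 1)
```

If the missingness is concentrated in one group, as here, group-blind imputation (overall mean, or 0) would systematically distort that group.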
In Gradient Boosting, what is adjusted at each step to minimize the residual errors?
- Learning rate
- Number of trees
- Feature importance
- Maximum depth of trees
In Gradient Boosting, the learning rate is the answer: it scales the contribution of each new weak learner fitted to the residual errors at each step. A smaller learning rate makes the model learn more slowly and often leads to better generalization, reducing the risk of overfitting.
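The role of the learning rate can be seen in a toy boosting loop. This sketch substitutes a constant predictor (the mean of the current residuals) for a real regression tree, purely to show how each step fits the residuals and how the learning rate shrinks each update:

```python
y = [3.0, 5.0, 7.0, 9.0]      # toy targets
pred = [0.0] * len(y)         # ensemble prediction, built up additively
learning_rate = 0.5

for step in range(20):
    residuals = [yi - pi for yi, pi in zip(y, pred)]
    # "Weak learner": predict the mean residual (a depth-0 stand-in for a tree).
    update = sum(residuals) / len(residuals)
    # The learning rate scales how much of the fitted update is applied.
    pred = [p + learning_rate * update for p in pred]

# With a constant learner, predictions converge toward the target mean (6.0).
```

With a smaller `learning_rate`, the same convergence takes more steps, which is why small learning rates are usually paired with more boosting iterations.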
The gradient explosion problem in deep learning can be mitigated using the _______ technique, which clips the gradients if they exceed a certain value.
- Data Augmentation
- Learning Rate Decay
- Gradient Clipping
- Early Stopping
Gradient clipping is a technique used to mitigate the gradient explosion problem in deep learning. It limits the magnitude of gradients during training, preventing them from becoming too large and causing instability.
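A minimal sketch of clipping by global L2 norm (the scheme used by utilities such as PyTorch's `torch.nn.utils.clip_grad_norm_`), written here in plain Python:

```python
import math

def clip_gradients(grads, max_norm):
    """Rescale the gradient vector if its L2 norm exceeds max_norm,
    so the update direction is preserved but its magnitude is bounded."""
    norm = math.sqrt(sum(g * g for g in grads))
    if norm > max_norm:
        scale = max_norm / norm
        return [g * scale for g in grads]
    return grads

# A gradient of norm 5.0 is scaled down to norm 1.0; direction is unchanged.
clipped = clip_gradients([3.0, 4.0], max_norm=1.0)
```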
The process of adjusting the contrast or brightness of an image is termed _______ in image processing.
- Segmentation
- Normalization
- Histogram Equalization
- Enhancement
In image processing, adjusting the contrast or brightness of an image is termed "Enhancement." Image enhancement techniques improve the visual quality of an image by adjusting specific properties such as brightness and contrast.
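A common enhancement primitive is the linear point operation `new = contrast * old + brightness`. The sketch below applies it to a hypothetical list of 8-bit pixel values:

```python
def enhance(pixels, contrast=1.0, brightness=0):
    """Linear point operation: scale by 'contrast', shift by 'brightness',
    then clamp each value to the valid 8-bit range [0, 255]."""
    return [max(0, min(255, round(contrast * p + brightness))) for p in pixels]

# Increase contrast by 20% and brightness by 10 levels.
adjusted = enhance([0, 100, 200], contrast=1.2, brightness=10)
```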
The process of ________ involves extracting vast amounts of data from different sources and converting it into a format suitable for analysis.
- Data Visualization
- Data Aggregation
- Data Preprocessing
- Data Ingestion
Data Ingestion is the process of extracting vast amounts of data from various sources and converting it into a format suitable for analysis. It is a crucial step in preparing data for analysis and reporting.
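As a toy example, the snippet below ingests a small, hypothetical CSV source with Python's standard `csv` module and converts the raw strings into typed records ready for analysis:

```python
import csv
import io

# Hypothetical raw CSV from a single source; real ingestion pipelines
# typically pull from many sources (files, APIs, databases, streams).
raw_csv = "id,amount\n1,10.5\n2,20.0\n"

# Parse the raw text, then convert each field to an analysis-ready type.
rows = list(csv.DictReader(io.StringIO(raw_csv)))
records = [{"id": int(r["id"]), "amount": float(r["amount"])} for r in rows]
```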
Which type of filtering is often used to reduce the amount of noise in an image?
- Median Filtering
- Edge Detection
- Histogram Equalization
- Convolutional Filtering
Median filtering is commonly used to reduce noise in an image. It replaces each pixel value with the median value in a local neighborhood, making it effective for removing salt-and-pepper noise and preserving the edges and features in the image.
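The idea can be sketched in a few lines: a 1-D median filter (window size 3, with edge samples left unchanged for simplicity) removes a salt-and-pepper spike while leaving the surrounding values intact:

```python
import statistics

def median_filter(signal, k=3):
    """Replace each interior sample with the median of its k-sample
    neighborhood; edge samples keep their original values."""
    half = k // 2
    out = list(signal)
    for i in range(half, len(signal) - half):
        out[i] = statistics.median(signal[i - half:i + half + 1])
    return out

noisy = [10, 10, 255, 10, 10]   # a salt-and-pepper spike at index 2
clean = median_filter(noisy)    # the spike is replaced by its neighbors' median
```

Because the median ignores extreme outliers in each window, the spike disappears without smearing nearby values, which is why median filtering preserves edges better than mean-based smoothing.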
Which trend involves using AI to generate high-quality, realistic digital content?
- Data Engineering
- Federated Learning
- Computer Vision and Image Generation
- Generative Adversarial Networks (GANs)
Generative Adversarial Networks (GANs) are used to generate realistic digital content, such as images, videos, and even text. This trend leverages AI to create content that can be nearly indistinguishable from human-generated content, which has applications in various domains.
In the context of Data Science, which tool is most commonly used for data manipulation and analysis due to its extensive libraries and ease of use?
- Excel
- R
- Python
- SQL
Python is commonly used in Data Science for data manipulation and analysis due to its extensive libraries like Pandas and ease of use. It provides a wide range of tools for working with data and is highly versatile for various data analysis tasks.