A common activation function used in CNNs that helps introduce non-linearity is ________.
- Sigmoid
- ReLU
- Linear
- Tanh
The ReLU (Rectified Linear Unit) activation function, defined as f(x) = max(0, x), is widely used in CNNs because it introduces non-linearity into the model, which is crucial for learning complex patterns.
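As a quick illustration, ReLU can be sketched in a few lines of NumPy (a minimal example, not part of the quiz itself):

```python
import numpy as np

def relu(x):
    # ReLU passes positive values through unchanged and maps
    # negative values to zero, introducing non-linearity.
    return np.maximum(0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))
```

Negative inputs become 0 while positive inputs are unchanged, which is what makes the function non-linear despite being piecewise linear.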