The _______ activation function outputs values between 0 and 1 and can cause a vanishing gradient problem.
- ReLU
- Sigmoid
- Tanh
- Leaky ReLU
The blank should be filled with "Sigmoid." The Sigmoid activation function maps any real input into the range 0 to 1. Because its derivative approaches zero for large-magnitude inputs, it can cause the vanishing gradient problem, which makes training deep networks difficult.
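A minimal sketch of this behavior: the derivative of the sigmoid is s(x)(1 - s(x)), which peaks at 0.25 when x = 0 and shrinks toward zero as |x| grows, so gradients multiplied through many sigmoid layers can vanish. The function names below are illustrative, not from any particular library.

```python
import math

def sigmoid(x):
    # Squashes any real input into the open interval (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    # d/dx sigmoid(x) = s * (1 - s); maximum value is 0.25 at x = 0
    s = sigmoid(x)
    return s * (1.0 - s)

for x in (0.0, 5.0, 10.0):
    print(f"x={x:5.1f}  sigmoid={sigmoid(x):.6f}  derivative={sigmoid_derivative(x):.6f}")
```

For x = 10 the derivative is already below 0.0001, which shows how quickly the gradient signal decays for saturated units.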
Related Quizzes
- In Big Data processing, _______ operations filter and sort data, while _______ operations perform aggregations and transformations.
- RNNs are particularly effective for tasks like _______ because they can retain memory from previous inputs in the sequence.
- An e-commerce platform wants to store the activities and interactions of users in real-time. The data is not structured, and the schema might evolve. Which database is apt for this scenario?
- Which activation function maps any input to a value between 0 and 1?
- You're tasked with performing real-time analysis on streaming data. Which programming language or tool would be most suited for this task due to its performance capabilities and extensive libraries?