The _______ activation function outputs values between 0 and 1 and can cause a vanishing gradient problem.

  • ReLU
  • Sigmoid
  • Tanh
  • Leaky ReLU
The blank should be filled with "Sigmoid." The Sigmoid function, σ(x) = 1 / (1 + e^(-x)), maps any real input to the range (0, 1). Its derivative, σ'(x) = σ(x)(1 - σ(x)), peaks at only 0.25 and approaches zero for large-magnitude inputs. Because gradients are multiplied through these small derivatives at every layer during backpropagation, they shrink toward zero in deep networks, which is the vanishing gradient problem that makes such networks difficult to train.
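
To make this concrete, here is a minimal NumPy sketch (an illustration added to this answer, not part of the original question) showing how the Sigmoid's derivative collapses as the input moves away from zero:

    import numpy as np

    def sigmoid(x):
        # Maps any real input into the open interval (0, 1)
        return 1.0 / (1.0 + np.exp(-x))

    def sigmoid_derivative(x):
        # sigma'(x) = sigma(x) * (1 - sigma(x)); maximum value is 0.25 at x = 0
        s = sigmoid(x)
        return s * (1.0 - s)

    for x in [0.0, 2.0, 5.0, 10.0]:
        print(f"x={x:5.1f}  sigmoid={sigmoid(x):.5f}  derivative={sigmoid_derivative(x):.5f}")

    # Output:
    # x=  0.0  sigmoid=0.50000  derivative=0.25000
    # x=  2.0  sigmoid=0.88080  derivative=0.10499
    # x=  5.0  sigmoid=0.99331  derivative=0.00665
    # x= 10.0  sigmoid=0.99995  derivative=0.00005

By x = 10 the derivative is effectively zero, so almost no gradient signal would flow back through a saturated Sigmoid unit. This is why alternatives like ReLU and Leaky ReLU, whose gradients do not saturate for positive inputs, are usually preferred in deep networks.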