Which activation function in neural networks maps its input to values between 0 and 1?

  • Leaky ReLU
  • ReLU (Rectified Linear Unit)
  • Sigmoid
  • Tanh (Hyperbolic Tangent)
The activation function that maps its input to values between 0 and 1 is the Sigmoid function, defined as σ(x) = 1 / (1 + e^(-x)). By contrast, Tanh outputs values in (-1, 1), while ReLU and Leaky ReLU are unbounded above, so none of them fit. Sigmoid is commonly used in the output layer of a neural network for binary classification tasks, where its output can be interpreted as a probability.
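As a minimal sketch in plain Python (no ML framework assumed), the snippet below implements the sigmoid formula above and prints its output for a few sample inputs, showing that every result lands strictly between 0 and 1:

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid: squashes any real input into the open interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Large negative inputs approach 0, large positive inputs approach 1,
# and sigmoid(0) is exactly 0.5.
for x in (-10.0, -1.0, 0.0, 1.0, 10.0):
    print(f"sigmoid({x:6.1f}) = {sigmoid(x):.6f}")
```

Note that the outputs never actually reach 0 or 1; they only approach these bounds asymptotically, which is why sigmoid outputs are safe to treat as probabilities.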