Which activation function is commonly used in the output layer of a binary classification neural network?

  • ReLU (Rectified Linear Activation)
  • Sigmoid Activation
  • Tanh (Hyperbolic Tangent) Activation
  • Softmax Activation

The Sigmoid activation function is commonly used in the output layer of a binary classification neural network. It squashes the network's raw output z into a probability σ(z) = 1 / (1 + e^(-z)) between 0 and 1, which can then be thresholded (typically at 0.5) to choose between the two classes. The other options are better suited elsewhere: ReLU and Tanh are standard choices for hidden layers, while Softmax generalizes Sigmoid to multi-class classification.
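Below is a minimal sketch of this layout, assuming PyTorch; the layer sizes, input dimension, and sample data are arbitrary placeholders chosen for illustration.

```python
import torch
import torch.nn as nn

# Tiny binary classifier: ReLU in the hidden layer, Sigmoid at the output.
model = nn.Sequential(
    nn.Linear(4, 8),   # 4 input features -> 8 hidden units (arbitrary sizes)
    nn.ReLU(),         # typical hidden-layer activation
    nn.Linear(8, 1),   # single output unit for the binary decision
    nn.Sigmoid(),      # maps the raw logit to a probability in (0, 1)
)

x = torch.randn(3, 4)          # a batch of 3 made-up examples
probs = model(x)               # probability of the positive class, shape (3, 1)
preds = (probs > 0.5).long()   # threshold at 0.5 to get hard class labels
print(probs, preds, sep="\n")
```

In practice this pairing is usually trained with binary cross-entropy loss (e.g. nn.BCELoss, or nn.BCEWithLogitsLoss applied to the raw logits for better numerical stability).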