How do activation functions, like the ReLU (Rectified Linear Unit), contribute to the operation of a neural network?

  • They introduce non-linearity into the model
  • They reduce the model's accuracy
  • They speed up model convergence
  • They control the learning rate
Activation functions introduce non-linearity into the model, allowing neural networks to approximate complex, non-linear relationships in the data. ReLU is popular because of its simplicity and its ability to mitigate the vanishing gradient problem.
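
As a minimal sketch (using NumPy, with hypothetical layer shapes chosen only for illustration), the snippet below shows why the non-linearity matters: without an activation, two stacked linear layers collapse into a single linear map, while inserting ReLU between them breaks that collapse.

```python
import numpy as np

def relu(x):
    # ReLU keeps positive values and zeroes out negatives,
    # which is the non-linearity that lets the network model
    # non-linear relationships.
    return np.maximum(0.0, x)

# Two small random weight matrices standing in for two dense layers
# (shapes are arbitrary, purely for illustration).
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(3, 2))
x = rng.normal(size=(1, 4))

# Without an activation, two linear layers are just one linear map:
# (x @ W1) @ W2 equals x @ (W1 @ W2).
linear_only = x @ W1 @ W2
collapsed = x @ (W1 @ W2)
print(np.allclose(linear_only, collapsed))  # True

# With ReLU between the layers, the composition is no longer linear,
# so additional depth adds genuine expressive power.
nonlinear = relu(x @ W1) @ W2
print(nonlinear)
```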