A common activation function used in CNNs that helps introduce non-linearity is ________.

  • Sigmoid
  • ReLU
  • Linear
  • Tanh
The ReLU (Rectified Linear Unit) activation is widely used in CNNs because it introduces non-linearity into the model, which is crucial for learning complex patterns. Although sigmoid and tanh are also non-linear, ReLU is the common default because it is cheap to compute and does not saturate for positive inputs, which helps mitigate vanishing gradients in deep networks.
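For a concrete picture, here is a minimal NumPy sketch (an illustrative snippet, not part of the original quiz; the helper names are made up) comparing ReLU with sigmoid and tanh on a few sample inputs:

```python
import numpy as np

def relu(x):
    # ReLU: passes positive values through unchanged and zeroes out
    # negatives, giving a simple non-linearity.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Sigmoid squashes inputs into (0, 1); it saturates for large |x|.
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print("ReLU:   ", relu(x))      # [0.  0.  0.  0.5 2. ]
print("Sigmoid:", sigmoid(x))   # values between 0 and 1
print("Tanh:   ", np.tanh(x))   # values between -1 and 1
```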