How do activation functions, such as ReLU (Rectified Linear Unit), contribute to the operation of a neural network?
- They introduce non-linearity into the model
- They reduce the model's accuracy
- They speed up model convergence
- They control the learning rate
Activation functions introduce non-linearity into the model, allowing neural networks to approximate complex, non-linear relationships in data. ReLU is popular because it is cheap to compute and helps mitigate the vanishing gradient problem, since its gradient is 1 for all positive inputs.
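As a minimal sketch (assuming a NumPy-style two-layer network, not code from the quiz itself), the snippet below shows why the non-linearity matters: without ReLU, the two weight matrices collapse into a single linear map, so stacking layers would add no expressive power.

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: max(0, x), applied element-wise."""
    return np.maximum(0.0, x)

# Hypothetical two-layer network on a single input vector.
# Without the non-linear activation, the two matrix multiplies
# would reduce to one linear map (W2 @ W1), so the network could
# only model linear relationships in the data.
rng = np.random.default_rng(0)
x = rng.normal(size=3)           # input features
W1 = rng.normal(size=(4, 3))     # first-layer weights
W2 = rng.normal(size=(2, 4))     # second-layer weights

hidden_pre = W1 @ x              # purely linear transform
hidden = relu(hidden_pre)        # non-linearity applied here;
                                 # gradient is 1 where hidden_pre > 0
output = W2 @ hidden

print("hidden (pre-activation):", hidden_pre)
print("hidden (post-ReLU):     ", hidden)
print("output:", output)
```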
Related Quiz
- When considering a confusion matrix, which metric calculates the harmonic mean of precision and recall?
- Decision Trees often suffer from ______, where they perform well on training data but poorly on new, unseen data.
- In the context of healthcare, what is the significance of machine learning models being interpretable?
- In the context of machine learning, what is the main difference between supervised and unsupervised learning in terms of data?
- When training a robot to play a game where it gets points for good moves and loses points for bad ones, which learning approach would be most appropriate?