Which activation function maps any input to a value between 0 and 1?
- ReLU
- Sigmoid
- Tanh
- Softmax
The sigmoid activation function, σ(x) = 1 / (1 + e⁻ˣ), maps any real input to a value strictly between 0 and 1, which is why its output is often interpreted as a probability in binary classification. It also introduces non-linearity into the network's computations. The other options fail the criterion: ReLU is unbounded above, tanh maps inputs to (−1, 1), and softmax normalizes a whole vector rather than squashing a single input.
Related Quizzes
- The process of adjusting the weights in a neural network based on the error rate is known as _______.
- For machine learning model deployment in a production environment, which tool or language is often integrated due to its performance and scalability?
- In Gradient Boosting, what is adjusted at each step to minimize the residual errors?
- When you want to create a complex layered visualization by combining multiple plots, which Python library provides a FacetGrid class?
- In time series forecasting, which method involves using past observations as inputs for predicting future values?