Which activation function in neural networks maps its input to values between 0 and 1?
- Leaky ReLU
- ReLU (Rectified Linear Unit)
- Sigmoid
- Tanh (Hyperbolic Tangent)
The correct answer is Sigmoid. Defined as σ(x) = 1 / (1 + e^(-x)), it maps any real input to the open interval (0, 1). By contrast, Tanh maps inputs to (-1, 1), while ReLU and Leaky ReLU are unbounded above. Because its output can be interpreted as a probability, the sigmoid is commonly used in the output layer of a neural network for binary classification tasks.
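As a minimal sketch in plain Python (no ML framework assumed), the following shows the sigmoid squashing inputs of any magnitude into (0, 1): large negative inputs approach 0 and large positive inputs approach 1.

```python
import math

def sigmoid(x: float) -> float:
    """Map any real input to the open interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Extreme inputs saturate near the bounds; sigmoid(0) is exactly 0.5.
for x in (-5.0, -1.0, 0.0, 1.0, 5.0):
    print(f"sigmoid({x:+.1f}) = {sigmoid(x):.4f}")
```

Running this prints values strictly between 0 and 1, which is why the sigmoid output can double as a probability estimate in binary classifiers.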