Which activation function is commonly used in the output layer of a binary classification neural network?
- ReLU (Rectified Linear Activation)
- Sigmoid Activation
- Tanh (Hyperbolic Tangent) Activation
- Softmax Activation
The Sigmoid activation function is commonly used in the output layer of a binary classification neural network. It maps any real-valued logit z to σ(z) = 1 / (1 + e^(-z)), a value between 0 and 1 that can be interpreted as the probability of the positive class. ReLU and Tanh are typically used in hidden layers, while Softmax is the usual choice for multi-class output layers.
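As a concrete sketch (the NumPy implementation and the example logit below are illustrative assumptions, not part of the quiz), the sigmoid turns a raw network output into a class probability:

```python
import numpy as np

def sigmoid(z):
    # Squash a real-valued logit into (0, 1) so it reads as P(class = 1).
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical logit from the network's final linear layer (illustrative value).
logit = 2.3
prob = sigmoid(logit)      # ~0.909
label = int(prob >= 0.5)   # threshold at 0.5 to get the binary label
print(f"P(class=1) = {prob:.3f}, predicted label = {label}")
```

In practice the 0.5 threshold is a common default; it can be tuned when the two classes have different misclassification costs.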