Which variant of RNN is specifically designed to combat the problem of vanishing and exploding gradients?
- LSTM (Long Short-Term Memory)
- GRU (Gated Recurrent Unit)
- Bidirectional RNN
- Simple RNN (Recurrent Neural Network)
Long Short-Term Memory (LSTM) is the RNN variant specifically designed to address the vanishing and exploding gradient problem. An LSTM cell uses three gates (forget, input, and output) to control a separate cell state that is updated additively rather than through repeated matrix multiplication, which lets gradients flow across many time steps and makes LSTMs well suited to sequences with long-range dependencies. (GRUs also mitigate vanishing gradients, but they are a later simplification of the LSTM design.)
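To make the gating mechanism concrete, here is a minimal sketch of a single LSTM time step in NumPy. All names (`lstm_step`, the packed weight matrix `W`, sizes `D` and `H`) are illustrative assumptions, not from any particular library; the gate equations follow the standard LSTM formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step (illustrative; names are assumptions).

    x: input vector of size D; h_prev, c_prev: previous hidden and
    cell states of size H; W: packed weights of shape (D+H, 4H).
    """
    H = h_prev.shape[0]
    z = np.concatenate([x, h_prev]) @ W + b   # all four gate pre-activations at once
    i = sigmoid(z[0:H])         # input gate: how much new information to write
    f = sigmoid(z[H:2*H])       # forget gate: how much old cell state to keep
    o = sigmoid(z[2*H:3*H])     # output gate: how much cell state to expose
    g = np.tanh(z[3*H:4*H])     # candidate cell update
    c = f * c_prev + i * g      # additive update: gradients can flow through f
    h = o * np.tanh(c)          # hidden state passed to the next step
    return h, c

# Tiny usage example with random weights
rng = np.random.default_rng(0)
D, H = 3, 4
W = rng.normal(scale=0.1, size=(D + H, 4 * H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for _ in range(5):
    h, c = lstm_step(rng.normal(size=D), h, c, W, b)
print(h.shape, c.shape)
```

The key line for gradient flow is the cell-state update `c = f * c_prev + i * g`: because the old state is carried forward by elementwise multiplication with a gate (rather than by a repeated weight-matrix multiply and squashing nonlinearity, as in a simple RNN), gradients are far less prone to vanishing or exploding over long sequences.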