Which method in transfer learning involves freezing the earlier layers of a pre-trained model and only training the later layers for the new task?
- Fine-tuning
- Knowledge Transfer
- Feature Extraction
- Weight Sharing
The method in transfer learning that involves freezing the earlier layers of a pre-trained model and only training the later layers for the new task is known as fine-tuning. The early layers of a pre-trained network capture general features (such as edges and textures), so freezing them lets the model retain knowledge from the source task while its later layers adapt to the specific requirements of the target task. This also reduces the number of trainable parameters, which is especially useful when the target dataset is small. By contrast, feature extraction freezes all pre-trained layers and trains only a newly added output head.
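Below is a minimal sketch of this idea in PyTorch (an assumption; the quiz does not name a framework). ResNet-18, the choice of `layer4` as the "later layers", and the 10-class head are all illustrative:

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a pre-trained backbone (assumes a recent torchvision with the
# weights API; any pre-trained network works the same way).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze every parameter first ...
for param in model.parameters():
    param.requires_grad = False

# ... then unfreeze only the later layers (here, the last residual
# block) so they can adapt to the new task.
for param in model.layer4.parameters():
    param.requires_grad = True

# Replace the classifier head for the new task (hypothetical 10-class
# target problem); new layers are trainable by default.
model.fc = nn.Linear(model.fc.in_features, 10)

# Optimize only the unfrozen parameters.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
```

Freezing is done by setting `requires_grad = False`, so the frozen weights receive no gradient updates; only `layer4` and the new head change during training.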