You have a highly imbalanced dataset with rare positive cases. Which performance metric would be the most informative, and why?

  • AUC, as it provides a comprehensive evaluation of the model
  • Accuracy, as it gives overall performance
  • F1-Score, as it balances Precision and Recall
  • Precision, as it focuses on false positives
In a highly imbalanced dataset, the F1-Score is usually the most informative metric because it balances Precision and Recall on the rare positive class. Accuracy is misleading here: a model that always predicts the majority class can score near 100% while detecting no positives at all. AUC and Precision are useful, but F1-Score gives a better overall sense of how well the model handles both false positives and false negatives.
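The point above can be made concrete with a toy calculation (the counts below are illustrative, not from any real dataset): a trivial majority-class model gets higher-looking accuracy than you might expect, while its F1-Score exposes that it finds no positives.

```python
# Toy illustration: accuracy vs. F1-Score on an imbalanced dataset
# (1000 samples, only 10 positives — a 1% positive rate).

def f1_score(tp, fp, fn):
    """F1 = 2*TP / (2*TP + FP + FN), the harmonic mean of precision and recall."""
    denom = 2 * tp + fp + fn
    return 2 * tp / denom if denom else 0.0

total, positives = 1000, 10

# Model A: always predicts "negative" (the majority class).
tp_a, fp_a, fn_a = 0, 0, positives
accuracy_a = (total - positives) / total  # 990/1000 = 0.99
f1_a = f1_score(tp_a, fp_a, fn_a)         # 0.0 — it never finds a positive

# Model B: recovers 7 of the 10 positives, at the cost of 5 false alarms.
tp_b, fp_b, fn_b = 7, 5, 3
accuracy_b = (total - fp_b - fn_b) / total  # 992/1000 = 0.992
f1_b = f1_score(tp_b, fp_b, fn_b)           # 14/22 ≈ 0.636

print(f"Model A: accuracy={accuracy_a:.3f}, F1={f1_a:.3f}")
print(f"Model B: accuracy={accuracy_b:.3f}, F1={f1_b:.3f}")
```

Both models look nearly identical by accuracy (0.99 vs. 0.992), but F1-Score (0.0 vs. ≈0.64) makes clear that only Model B is actually handling the rare positive class.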