Popular Metrics
- Accuracy: The ratio of correctly predicted instances to the total instances.
- Precision: The ratio of true positive predictions to the sum of true positive and false positive predictions.
- Recall: The ratio of true positive predictions to the sum of true positive and false negative predictions.
- F1 Score: The harmonic mean of precision and recall, useful for imbalanced datasets.
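The four metrics above can be sketched directly from the confusion-matrix counts. This is a minimal illustration (the function name `classification_metrics` is my own, not from the text), computing each metric for binary labels without any external library:

```python
def classification_metrics(y_true, y_pred):
    """Compute accuracy, precision, recall, and F1 for binary labels (0/1)."""
    # Tally the four confusion-matrix cells.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)

    accuracy = (tp + tn) / len(y_true)
    # Guard against division by zero when a class is never predicted.
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    # F1 is the harmonic mean of precision and recall.
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}
```

For example, with `y_true = [1, 1, 1, 0, 0, 0]` and `y_pred = [1, 1, 0, 1, 0, 0]`, precision and recall are both 2/3, so F1 is 2/3 as well, while accuracy is 4/6.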
AUC-ROC and Beyond
The Receiver Operating Characteristic (ROC) curve plots the true positive rate against the false positive rate across classification thresholds; the Area Under that curve (AUC-ROC) summarizes how well the model separates the two classes.
Other important tools include Log Loss, which penalizes confident wrong probability estimates, and confusion-matrix analysis, both of which are pivotal when choosing a decision threshold.
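Both quantities can be sketched in a few lines. AUC-ROC has a useful probabilistic reading: it equals the probability that a randomly chosen positive example is scored above a randomly chosen negative one, which the pairwise comparison below exploits directly. The function names `auc_roc` and `log_loss` are illustrative, not from the text:

```python
import math

def auc_roc(y_true, scores):
    """AUC via the rank interpretation: P(random positive scores above
    random negative), counting ties as half a win."""
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def log_loss(y_true, probs, eps=1e-15):
    """Mean negative log-likelihood of the true labels under predicted
    probabilities, clipped away from 0 and 1 for numerical safety."""
    total = 0.0
    for t, p in zip(y_true, probs):
        p = min(max(p, eps), 1 - eps)
        total += -(t * math.log(p) + (1 - t) * math.log(1 - p))
    return total / len(y_true)
```

A model that ranks every positive above every negative gets AUC 1.0, and an uninformative scorer that gives everything the same probability gets AUC 0.5 with a log loss of ln 2 ≈ 0.693.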
Want to Learn More?
Try these resources for further exploration: