ROC Curve
The ROC Curve (Receiver Operating Characteristic Curve) is a graphical tool used to evaluate the performance of binary classification models. It plots the True Positive Rate (TPR) against the False Positive Rate (FPR) at various threshold settings.
Purpose
The ROC Curve shows the trade-off between sensitivity (recall) and specificity. It helps assess how well a classifier can distinguish between two classes.
Definitions
- TPR (True Positive Rate, also sensitivity or recall) = TP / (TP + FN)
- FPR (False Positive Rate, i.e. 1 − specificity) = FP / (FP + TN)
Where:
- TP = True Positives
- FP = False Positives
- FN = False Negatives
- TN = True Negatives
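The two rates above can be computed directly from confusion-matrix counts. A minimal sketch, using hypothetical counts chosen for illustration:

```python
# Hypothetical confusion-matrix counts (illustrative values, not from a real model).
tp, fp, fn, tn = 80, 10, 20, 90

# True Positive Rate (sensitivity / recall): TP / (TP + FN)
tpr = tp / (tp + fn)
# False Positive Rate (1 - specificity): FP / (FP + TN)
fpr = fp / (fp + tn)

print(f"TPR = {tpr:.2f}, FPR = {fpr:.2f}")  # TPR = 0.80, FPR = 0.10
```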
The ROC curve is generated by plotting TPR vs. FPR for different decision threshold values, typically ranging from 0 to 1.
How It Works
1. A classification model outputs probabilities.
2. These probabilities are converted to class labels using different thresholds.
3. For each threshold, TPR and FPR are computed.
4. Points are plotted to form the ROC curve.
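The steps above can be sketched with scikit-learn's `roc_curve`, which sweeps the threshold and returns one (FPR, TPR) pair per threshold. The labels and scores below are toy values for illustration:

```python
from sklearn.metrics import roc_curve

# Toy true labels and model-predicted probabilities (illustrative values).
y_true = [0, 0, 0, 0, 1, 1, 1, 1]
y_score = [0.1, 0.3, 0.35, 0.8, 0.4, 0.6, 0.7, 0.9]

# roc_curve sweeps the decision threshold and returns FPR/TPR arrays,
# one entry per threshold it considers.
fpr, tpr, thresholds = roc_curve(y_true, y_score)

for f, t, th in zip(fpr, tpr, thresholds):
    print(f"threshold={th:.2f}  FPR={f:.2f}  TPR={t:.2f}")
```

Plotting `fpr` against `tpr` (e.g. with matplotlib) traces the ROC curve itself.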
Ideal ROC Curve
- A perfect classifier reaches the top-left corner (TPR = 1, FPR = 0).
- The diagonal line (from (0,0) to (1,1)) represents a random classifier.
- The closer the curve is to the top-left, the better the model.
Area Under the Curve (AUC)
- The ROC AUC score (Area Under the Curve) quantifies overall performance.
- AUC = 1 → Perfect classifier
- AUC = 0.5 → No discriminative power (like random guessing)
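Both endpoints of the AUC scale can be demonstrated with `roc_auc_score` on small toy inputs (illustrative values): perfectly separated scores give 1.0, while constant scores that cannot distinguish the classes give 0.5.

```python
from sklearn.metrics import roc_auc_score

y_true = [0, 0, 1, 1]
perfect_scores = [0.1, 0.2, 0.8, 0.9]  # ranks every positive above every negative
random_scores = [0.5, 0.5, 0.5, 0.5]   # no discrimination at all

auc_perfect = roc_auc_score(y_true, perfect_scores)
auc_random = roc_auc_score(y_true, random_scores)
print(auc_perfect, auc_random)  # 1.0 0.5
```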
Example Use Case
In a medical test to detect cancer:
- A high threshold may miss cancer (low TPR, high specificity).
- A low threshold may raise too many false alarms (high TPR, high FPR).
- The ROC Curve helps decide the optimal threshold balancing both risks.
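One common way to pick such a balancing threshold is Youden's J statistic (TPR − FPR), which selects the ROC point farthest above the random-guess diagonal. A sketch, again on toy data (illustrative values); note that in a real medical setting the costs of the two error types would usually be weighted, not treated equally:

```python
import numpy as np
from sklearn.metrics import roc_curve

# Toy labels and scores (illustrative values).
y_true = [0, 0, 0, 0, 1, 1, 1, 1]
y_score = [0.1, 0.2, 0.3, 0.6, 0.4, 0.7, 0.8, 0.9]

fpr, tpr, thresholds = roc_curve(y_true, y_score)

# Youden's J statistic: the threshold maximizing TPR - FPR.
j = tpr - fpr
best = int(np.argmax(j))
print(f"best threshold = {thresholds[best]:.2f} "
      f"(TPR = {tpr[best]:.2f}, FPR = {fpr[best]:.2f})")
```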
ROC vs Precision-Recall Curve
| Curve Type | Best For |
|---|---|
| ROC Curve | When classes are balanced or misclassification costs are similar |
| Precision-Recall Curve | When the positive class is rare (imbalanced datasets) |
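The difference shows up numerically on imbalanced data. In the toy example below (illustrative values, 1 positive among 10 samples), the ROC AUC looks strong while the average precision, which summarizes the precision-recall curve, reveals a much weaker ranking of the rare positive:

```python
from sklearn.metrics import average_precision_score, roc_auc_score

# Imbalanced toy data: a single positive among 10 samples (illustrative values).
y_true = [0, 0, 0, 0, 0, 0, 0, 0, 0, 1]
y_score = [0.1, 0.2, 0.15, 0.3, 0.4, 0.35, 0.5, 0.45, 0.6, 0.55]

# ROC AUC looks high because most negatives rank below the positive...
roc_auc = roc_auc_score(y_true, y_score)
# ...but average precision exposes that the top-ranked sample is a false positive.
avg_prec = average_precision_score(y_true, y_score)

print(f"ROC AUC = {roc_auc:.3f}, average precision = {avg_prec:.3f}")
```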
Limitations
- Can be overly optimistic on highly imbalanced data.
- In such cases, use the Precision-Recall Curve.
Related Pages
- Sensitivity (TPR)
- Specificity
- F1 Score
- Confusion Matrix
- Precision-Recall Curve
- AUC Score
- Threshold Tuning