= ROC Curve =

The '''ROC Curve''' ('''Receiver Operating Characteristic Curve''') is a graphical tool used to evaluate the performance of binary classification models. It plots the '''True Positive Rate (TPR)''' against the '''False Positive Rate (FPR)''' at various threshold settings.

== Purpose ==

The ROC Curve shows the trade-off between sensitivity (recall) and specificity. It helps assess how well a classifier can distinguish between the two classes.
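The threshold sweep described above can be sketched directly. This is a minimal illustration with made-up toy labels and scores (not from the article): for every distinct score used as a threshold, it counts true and false positives and emits one (FPR, TPR) point of the curve.

```python
# Minimal sketch: building ROC points by sweeping the decision threshold.
# y_true/scores below are toy values chosen only for illustration.

def roc_points(y_true, scores):
    """Return (FPR, TPR) pairs, one per distinct score threshold."""
    thresholds = sorted(set(scores), reverse=True)
    pos = sum(y_true)              # total actual positives
    neg = len(y_true) - pos        # total actual negatives
    points = [(0.0, 0.0)]          # curve always starts at the origin
    for t in thresholds:
        tp = sum(1 for y, s in zip(y_true, scores) if s >= t and y == 1)
        fp = sum(1 for y, s in zip(y_true, scores) if s >= t and y == 0)
        points.append((fp / neg, tp / pos))
    return points

y_true = [1, 1, 0, 1, 0, 0]
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.2]
points = roc_points(y_true, scores)
print(points)
```

Lowering the threshold can only admit more predictions as positive, so both coordinates grow monotonically from (0, 0) to (1, 1).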
== Ideal ROC Curve ==

* A ''perfect classifier'' reaches the top-left corner (TPR = 1, FPR = 0).
* The ''diagonal line'' (from (0,0) to (1,1)) represents a '''random classifier'''.
* The ''closer the curve is to the top-left'', the better the model.
== Area Under the Curve (AUC) ==
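AUC can be estimated from the (FPR, TPR) points with the trapezoidal rule. The sketch below uses hand-written point lists (an assumption, not data from the article) to confirm the two reference values: the random-classifier diagonal gives 0.5 and a perfect classifier gives 1.0.

```python
# Hedged sketch: AUC via the trapezoidal rule over (FPR, TPR) points
# that are already sorted by FPR. Point lists are illustrative only.

def auc_trapezoid(points):
    """Integrate TPR over FPR using trapezoids."""
    area = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        area += (x1 - x0) * (y0 + y1) / 2.0
    return area

# A random classifier's curve is the diagonal: AUC = 0.5.
diagonal = [(0.0, 0.0), (1.0, 1.0)]
print(auc_trapezoid(diagonal))   # 0.5

# A perfect classifier jumps straight to (0, 1): AUC = 1.0.
perfect = [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
print(auc_trapezoid(perfect))    # 1.0
```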
== Limitations ==

* Can be '''overly optimistic''' on highly imbalanced data.
* In such cases, use the [[Precision-Recall Curve]].
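A small arithmetic sketch shows why. The confusion-matrix counts below are invented for illustration (100 positives against 99,900 negatives): the false positive rate stays tiny because the negative class is huge, so the ROC point looks near-perfect, while precision exposes that most flagged cases are wrong.

```python
# Illustrative arithmetic (made-up counts) for ROC optimism on
# imbalanced data: 100 actual positives vs 99,900 actual negatives.
tp, fn = 90, 10        # the classifier finds most positives...
fp, tn = 900, 99_000   # ...but also flags 900 negatives

tpr = tp / (tp + fn)          # recall: 0.9
fpr = fp / (fp + tn)          # ~0.009 -> ROC point looks near-perfect
precision = tp / (tp + fp)    # ~0.091 -> the PR view reveals the problem

print(tpr, round(fpr, 4), round(precision, 4))
```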
[[Category:Artificial Intelligence]] | |||