= Recall =

'''Recall''' is a metric used in classification to measure how many of the actual positive instances were correctly identified by the model. It is also known as '''sensitivity''' or the '''true positive rate'''.

== Definition ==

:<math> \text{Recall} = \frac{TP}{TP + FN} </math>

Where:
* '''TP''' = True Positives – correctly predicted positive instances
* '''FN''' = False Negatives – actual positives incorrectly predicted as negative

Recall answers the question: of all the instances that are actually positive, how many did the model correctly identify?
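The formula can be computed directly from a model's binary predictions. The sketch below is an illustrative example only (the function name and sample labels are not from the original article): it counts true positives and false negatives for labels where 1 marks the positive class, then returns their ratio.

<syntaxhighlight lang="python">
def recall(y_true, y_pred):
    """Return TP / (TP + FN) for binary labels (1 = positive, 0 = negative)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives
    return tp / (tp + fn) if (tp + fn) > 0 else 0.0  # guard against no actual positives

# Example: 4 actual positives, 3 of them correctly identified -> recall = 0.75
y_true = [1, 1, 1, 1, 0, 0]
y_pred = [1, 1, 1, 0, 0, 1]
print(recall(y_true, y_pred))  # 0.75
</syntaxhighlight>

In practice, libraries such as scikit-learn expose the same computation, e.g. <code>sklearn.metrics.recall_score(y_true, y_pred)</code>.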
== See also ==
* [[F1 Score]]
* [[Confusion Matrix]]
* [[Sensitivity]] and [[Specificity]]
== SEO Keywords ==
recall in machine learning, true positive rate, sensitivity, recall formula, classification evaluation, medical test recall, fraud detection model
[[Category:Artificial Intelligence]] |