Sensitivity

Sensitivity, also known as Recall or the True Positive Rate (TPR), is a performance metric used in classification problems. It measures how well a model can identify actual positive instances.

Definition

Sensitivity = TP / (TP + FN)

Where:

  • TP = True Positives – actual positives correctly predicted
  • FN = False Negatives – actual positives incorrectly predicted as negative

Sensitivity answers the question: "Out of all real positive cases, how many did the model correctly identify?"
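
As a minimal sketch, the formula reduces to a one-line computation in Python (the function below is illustrative, not part of any particular library):

def sensitivity(tp: int, fn: int) -> float:
    """True positive rate: the share of actual positives the model caught."""
    return tp / (tp + fn)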

Alternate Names

  • Recall
  • True Positive Rate (TPR)
  • Hit Rate (in signal detection theory)

Simple Example

A disease test is applied to 100 patients who have the disease. The model predicts:

  • 90 correctly diagnosed as sick → TP = 90
  • 10 wrongly predicted as healthy → FN = 10

Sensitivity = 90 / (90 + 10) = 90 / 100 = 0.9 = 90%

This means the model successfully detected 90% of the sick patients.
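
Assuming the 100 patients are encoded as binary label vectors (1 = sick, 0 = healthy), the same figure can be reproduced with scikit-learn's recall_score; the vectors below are constructed purely to match this example:

from sklearn.metrics import recall_score

y_true = [1] * 100            # all 100 patients actually have the disease
y_pred = [1] * 90 + [0] * 10  # the model catches 90 and misses 10

print(recall_score(y_true, y_pred))  # 0.9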

Importance of Sensitivity

Sensitivity is critical in applications where missing a positive case can have serious consequences.

Real-World Scenarios

  • Medical diagnosis: Missing a disease can be fatal.
  • Security systems: Failing to detect a threat may be dangerous.
  • Fraud detection: Missed fraud cases are costly.

Sensitivity vs Specificity

  • Sensitivity measures how well the model finds actual positives.
  • Specificity measures how well the model avoids false alarms, i.e. how many actual negatives it correctly identifies.

Formula for Specificity (for comparison)

Specificity = TN / (TN + FP)

Where:

  • TN = True Negatives
  • FP = False Positives
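
Both rates can be read off a single confusion matrix. A short sketch using scikit-learn, with made-up label vectors for illustration:

from sklearn.metrics import confusion_matrix

y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 1, 1]

# For binary labels 0/1, ravel() yields the counts in the order TN, FP, FN, TP.
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

sensitivity = tp / (tp + fn)  # 3 / (3 + 1) = 0.75
specificity = tn / (tn + fp)  # 4 / (4 + 2) ≈ 0.67
print(sensitivity, specificity)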

Related Metrics

  • Recall – Sensitivity is another name for Recall
  • Specificity – Measures true negative rate
  • Precision – Measures correctness of positive predictions
  • F1 Score – Balances Sensitivity and Precision (see the sketch after this list)
  • Confusion Matrix – Base table for all classification metrics
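
A short sketch computing Precision, Recall (Sensitivity), and F1 Score with scikit-learn, again on made-up label vectors:

from sklearn.metrics import precision_score, recall_score, f1_score

y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 1, 1]

p  = precision_score(y_true, y_pred)  # TP / (TP + FP) = 3/5 = 0.6
r  = recall_score(y_true, y_pred)     # TP / (TP + FN) = 3/4 = 0.75 (sensitivity)
f1 = f1_score(y_true, y_pred)         # harmonic mean of the two ≈ 0.67
print(p, r, f1)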
