= Micro F1 Score =

The '''Micro F1 Score''' is an evaluation metric used primarily in '''multi-class''' and '''multi-label classification''' tasks. Unlike the Macro F1 Score, it aggregates global counts of true positives, false positives, and false negatives across all classes, then uses these totals to compute a single Precision, Recall, and F1 Score. It is most useful when the dataset is '''imbalanced''' and overall performance matters more than per-class fairness.
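The global-counting procedure described above can be sketched as follows. This is a minimal illustration, not a reference implementation; the function name <code>micro_f1</code> and the toy labels are invented for the example.

```python
def micro_f1(y_true, y_pred, labels):
    """Micro-averaged F1: pool TP/FP/FN over all classes, then score once."""
    tp = fp = fn = 0
    for label in labels:
        for t, p in zip(y_true, y_pred):
            if p == label and t == label:
                tp += 1  # predicted this class, and it was correct
            elif p == label:
                fp += 1  # predicted this class, but it was another
            elif t == label:
                fn += 1  # missed an instance of this class
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Toy example: 3 of 4 predictions are correct.
print(micro_f1(["a", "a", "b", "c"], ["a", "b", "b", "c"], ["a", "b", "c"]))  # 0.75
```

Note that in single-label multi-class classification every misclassification contributes exactly one false positive and one false negative, so micro F1 reduces to accuracy (here 3/4 = 0.75); it diverges from accuracy only in multi-label settings.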


== See also ==

* [[F1 Score]]
* [[Macro F1 Score]]
* [[Weighted F1 Score]]
* [[Precision]]
* [[Recall]]