© 2024 borui. All rights reserved.
This content may be freely reproduced, displayed, modified, or distributed with proper attribution to borui and a link to the article:
borui. (2024-03-24 09:39:26 +0000). How to handle zero division when calculating f1score? https://borui/blog/2024-03-25-en-handle-f1score-zero-division
@misc{borui2024,
  author = {borui},
  title = {How to handle zero division when calculating f1score?},
  year = {2024},
  publisher = {borui's blog},
  journal = {borui's blog},
  url = {https://borui/blog/2024-03-25-en-handle-f1score-zero-division}
}
This has also been asked on Cross Validated (Stack Exchange):
Precision is defined as:
p = true positives / (true positives + false positives)
What is the value of precision if (true positives + false positives) = 0? Is it just undefined?
Same question for recall:
r = true positives / (true positives + false negatives)
In this case, what is the value of recall if (true positives + false negatives) = 0?
P.S. This question is very similar to the question "What are correct values for precision and recall in edge cases?".
- khatchad. (Mar 8, 2011). Precision is defined as: p.... [Question]. Cross Validated (Stack Exchange). https://stats.stackexchange.com/questions/8025/what-are-correct-values-for-precision-and-recall-when-the-denominators-equal-0
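To make the edge case concrete, here is a minimal sketch in plain Python (no library assumed) of the zero denominators the question is about:

```python
# A minimal sketch of the edge case above: the textbook formulas simply
# divide by zero when a classifier never predicts (or the data never
# contains) the positive class.

def precision(tp, fp):
    return tp / (tp + fp)  # undefined when tp + fp == 0

def recall(tp, fn):
    return tp / (tp + fn)  # undefined when tp + fn == 0

# A classifier that never predicts the positive class: tp == 0 and fp == 0.
try:
    precision(0, 0)
except ZeroDivisionError:
    print("precision is undefined: tp + fp == 0")

# A dataset with no positive samples at all: tp == 0 and fn == 0.
try:
    recall(0, 0)
except ZeroDivisionError:
    print("recall is undefined: tp + fn == 0")
```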
According to scikit-learn:
F1 is by default calculated as 0.0 when there are no true positives, false negatives, or false positives.
When true positive + false positive + false negative == 0 (i.e. a class is completely absent from both y_true or y_pred), f-score is undefined. In such cases, by default f-score will be set to 0.0, and UndefinedMetricWarning will be raised. This behavior can be modified by setting the zero_division parameter.
- questionto42. (Jul 11, 2021). Quote from sklearn.metrics.f1_score in the Notes at the bottom: When true positive + false positive == 0, precision is undefined. When true positive + false negative == 0, recall is undefined. In such cases, by default the metric will be set to 0, as will f-score, and UndefinedMetricWarning will be raised. This behavior can be modified with zero_division. Thus, you cannot avoid this error if your data does not output a difference between true positives and false positives. That being said, you can only suppress the warning at least, adding zero_division=0 to the functions mentioned in the quote. In either case, set to 0 or 1, you will get a 0 value as the return anyway. [Answer]. Stack Overflow. https://stackoverflow.com/a/68338538 https://stackoverflow.com/questions/62326735/metrics-f1-warning-zero-division
- sklearn.metrics.f1_score. (n.d.). scikit-learn. Retrieved April 2, 2024, from https://scikit-learn.org/stable/modules/generated/sklearn.metrics.f1_score.html#sklearn.metrics.f1_score
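As a rough illustration of that behavior (a sketch, not official documentation; the exact result for zero_division=1 depends on your scikit-learn version), consider a binary problem where the positive class appears in neither y_true nor y_pred:

```python
from sklearn.metrics import f1_score

# The positive class (1) is absent from both y_true and y_pred,
# so tp + fp + fn == 0 and the F1 score is undefined.
y_true = [0, 0, 0, 0]
y_pred = [0, 0, 0, 0]

# Default: returns 0.0 and emits an UndefinedMetricWarning.
print(f1_score(y_true, y_pred))

# Passing zero_division explicitly silences the warning and picks the
# fallback value. The quoted answer reports that older scikit-learn
# versions still return 0.0 for the f-score even with zero_division=1;
# recent versions honor the requested value.
print(f1_score(y_true, y_pred, zero_division=0))
print(f1_score(y_true, y_pred, zero_division=1))
```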