* fixed a bug: precision and recall were swapped

* added a bit more documentation
delzac 2018-08-10 23:50:42 +08:00 committed by KeDengMS
Parent 2a748e1ce9
Commit b7811d3549
1 changed file with 5 additions and 4 deletions


@@ -426,9 +426,10 @@ def fmeasure(output, target, beta=1):
     This operation computes the f-measure between the output and target. If beta is set as one,
     its called the f1-scorce or dice similarity coefficient. f1-scorce is monotonic in jaccard distance.
-    f-measure = (1 + bta ** 2) * precision * recall / (beta ** 2 * precision + recall)
+    f-measure = (1 + beta ** 2) * precision * recall / (beta ** 2 * precision + recall)
-    This loss function is frequently used in semantic segmentation of images. Works with imbalanced classes too.
+    This loss function is frequently used in semantic segmentation of images. Works with imbalanced classes, for
+    balanced classes you should prefer cross_entropy instead.
     This operation works with both binary and multiclass classification.
     Args:
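
For reference, the corrected formula mirrors directly into plain NumPy. The sketch below is not part of the commit: the helper name fmeasure_np and the example arrays are made up for illustration, with soft counts matching the reduce_sum-based implementation shown in the second hunk. Note the library function itself returns 1 minus this value, so it can be minimized as a training loss.

import numpy as np

def fmeasure_np(output, target, beta=1.0):
    # Soft counts: the element-wise product measures the overlap
    # between predictions and ground truth.
    correct = np.sum(output * target)
    precision = correct / np.sum(output)  # share of predicted mass that is correct
    recall = correct / np.sum(target)     # share of target mass that is recovered
    return (1 + beta ** 2) * precision * recall / (beta ** 2 * precision + recall)

# Made-up soft predictions for a 4-pixel binary mask.
output = np.array([0.9, 0.8, 0.2, 0.1])
target = np.array([1.0, 1.0, 1.0, 0.0])
print(fmeasure_np(output, target))  # ~0.76: the f1-score / dice similarity coefficient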
@@ -450,6 +451,6 @@ def fmeasure(output, target, beta=1):
         axis = None
     correct_predictions = C.reduce_sum(output * target, axis=axis)
-    precision = correct_predictions / C.reduce_sum(target, axis=axis)
-    recall = correct_predictions / C.reduce_sum(output, axis=axis)
+    precision = correct_predictions / C.reduce_sum(output, axis=axis)
+    recall = correct_predictions / C.reduce_sum(target, axis=axis)
     return 1 - (1 + beta ** 2) * precision * recall / (beta ** 2 * precision + recall)
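
The fix is easy to sanity-check by hand: precision divides the true positives by the number of predicted positives (the sum over output), while recall divides by the number of actual positives (the sum over target), exactly as the + lines now read. A small made-up example:

import numpy as np

output = np.array([1.0, 1.0, 0.0, 0.0])  # model predicts two positives
target = np.array([1.0, 0.0, 0.0, 0.0])  # ground truth has one positive
correct = np.sum(output * target)        # one true positive

precision = correct / np.sum(output)  # 1/2 = 0.5: half the predictions are right
recall = correct / np.sum(target)     # 1/1 = 1.0: the one positive was found

# The pre-fix code used the opposite denominators and would have
# reported precision = 1.0 and recall = 0.5 for this example.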