The `BestThresholdMetric` iterates over all possible thresholds (or a subset thereof) and uses the threshold that yields the best evaluation. Currently, only the selected threshold is saved, as the `threshold_` property. This drops a lot of information: the evaluations for all the other thresholds. To fix this, the class should have two additional properties: `thresholds_`, which stores all the thresholds used for evaluation, and `evaluations_`, the corresponding score for each threshold. This information makes it possible to plot the score as a function of the threshold, and thus to analyse whether the predictions are robust to the chosen threshold or whether the threshold must be set carefully.
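A minimal sketch of what this could look like; the class name matches the issue, but the constructor signature, the `metric` callable, and `compute` are assumptions for illustration, not the real API:

```python
import numpy as np

class BestThresholdMetric:
    """Sketch: evaluate a base metric at every candidate threshold and
    keep the best one, while also retaining all intermediate scores."""

    def __init__(self, metric, thresholds):
        # metric: hypothetical callable (y_true, y_pred_binary) -> score
        self.metric = metric
        self.thresholds = np.asarray(thresholds)

    def compute(self, y_true, y_score):
        # Proposed change: store *all* thresholds and their evaluations,
        # not only the single best threshold.
        self.thresholds_ = self.thresholds
        self.evaluations_ = np.array([
            self.metric(y_true, y_score >= t) for t in self.thresholds_
        ])
        best = int(np.argmax(self.evaluations_))
        self.threshold_ = self.thresholds_[best]
        return self.evaluations_[best]
```

With `thresholds_` and `evaluations_` available, the robustness plot described above is then a one-liner, e.g. `plt.plot(m.thresholds_, m.evaluations_)`.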
In addition, a new method could be added specifically to `BestThresholdMetric` that fits the best threshold on some given samples. The existing `compute_` can then either use this threshold if it has been fitted, or simply iterate over the thresholds if it has not.
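The fit-then-reuse behaviour could be sketched as follows; the method names `fit` and `compute` and the overall signatures are hypothetical, chosen only to illustrate the proposal:

```python
import numpy as np

class BestThresholdMetric:
    # Minimal sketch; names and signatures are assumptions, not the real API.
    def __init__(self, metric, thresholds):
        self.metric = metric                    # (y_true, y_pred_binary) -> score
        self.thresholds = np.asarray(thresholds)

    def fit(self, y_true, y_score):
        """Fit the best threshold on the given samples and remember it."""
        scores = [self.metric(y_true, y_score >= t) for t in self.thresholds]
        self.threshold_ = self.thresholds[int(np.argmax(scores))]
        return self

    def compute(self, y_true, y_score):
        # If a threshold was fitted, reuse it; otherwise search all thresholds.
        if hasattr(self, "threshold_"):
            return self.metric(y_true, y_score >= self.threshold_)
        return max(self.metric(y_true, y_score >= t) for t in self.thresholds)
```

Separating fitting from evaluation this way avoids re-running the threshold search on every call, and lets the threshold be chosen on one set of samples and applied to another.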