At a minimum, it should be possible to see all performance metrics for the validation process. For instance, in 10-fold cross-validation, the program should show the metric for each fold.
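As a rough illustration of the per-fold view being requested (this is a scikit-learn sketch, not Orange's `Orange.evaluation` API; the dataset and learner are arbitrary choices):

```python
# Sketch: report the accuracy of each fold in 10-fold cross-validation,
# rather than only the aggregate mean.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
scores = cross_val_score(DecisionTreeClassifier(random_state=0), X, y, cv=10)

for i, s in enumerate(scores, start=1):
    print(f"fold {i:2d}: accuracy = {s:.3f}")
print(f"mean accuracy over 10 folds = {scores.mean():.3f}")
```

A widget would present the same per-fold numbers in a table instead of printing them.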
I'd rather not promote (frequentist) null-hypothesis testing in Orange (or elsewhere). It is not only conceptually questionable per se, but also a poor fit for a data mining workflow, because constantly reformulating your hypotheses invalidates the tests' premises.
Orange version
Expected behavior
A widget should be added to compare model performances, using a paired t-test, the Wilcoxon signed-rank test, etc.
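A minimal sketch of what such a comparison could compute, using SciPy on per-fold scores (the two score arrays below are made-up example values, not real results):

```python
# Sketch: compare two models' per-fold CV scores with a paired t-test
# and the Wilcoxon signed-rank test. Both tests pair scores fold by fold.
import numpy as np
from scipy import stats

# Hypothetical accuracies of models A and B on the same 10 folds.
scores_a = np.array([0.91, 0.89, 0.93, 0.90, 0.88, 0.92, 0.90, 0.91, 0.89, 0.93])
scores_b = np.array([0.88, 0.87, 0.90, 0.89, 0.86, 0.90, 0.88, 0.89, 0.87, 0.91])

t_stat, t_p = stats.ttest_rel(scores_a, scores_b)  # paired t-test
w_stat, w_p = stats.wilcoxon(scores_a, scores_b)   # Wilcoxon signed-rank test

print(f"paired t-test:        t = {t_stat:.3f}, p = {t_p:.4f}")
print(f"Wilcoxon signed-rank: W = {w_stat:.1f}, p = {w_p:.4f}")
```

Note that scores from cross-validation folds are not independent (folds share training data), so the nominal p-values of these tests are optimistic; a widget would likely need a corrected variant for honest comparisons.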
Actual behavior
There is currently no widget available for statistical comparison of model performances.
Steps to reproduce the behavior
Additional info (worksheets, data, screenshots, ...)