Add sample-weight interface point? #177
Dare I say ... tasks? Of course one can also just add it to signatures wherever an X appears, and check whether row lengths agree.
I'm thinking:
As far as I can tell this breaks nothing. If people want to discuss contriving sample-weight support for non-supporting models (oversampling, and so forth), please open a new thread. A test case already exists: the SVM models at https://github.com/alan-turing-institute/MLJModels.jl/blob/master/src/LIBSVM.jl support weights (currently passed as a hyperparameter).
Small comment: I think the default/baseline case for the weighted version should be to ignore the weights. That way, every learner would be able to take weights as additional input, which should make building learners easier and avoid an interface case distinction.
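The uniform-signature idea can be sketched as follows. This is an illustrative Python sketch only, not MLJ's actual API: `Model`, `WeightedModel`, and the toy mean-predicting subclasses are hypothetical names. Every model's `fit` accepts a weight vector; the baseline behaviour is to ignore it (after checking row lengths agree), and weight-aware models override it, defaulting to uniform weights.

```python
class Model:
    """Hypothetical base class: fit always accepts weights, ignores them."""

    def fit(self, X, y, w=None):
        # Baseline: weights are accepted but ignored, so callers never need
        # a case distinction between weighted and unweighted models.
        if w is not None and len(w) != len(X):
            raise ValueError("length of w must match the number of rows in X")
        return self._fit(X, y)

    def _fit(self, X, y):
        raise NotImplementedError


class WeightedModel(Model):
    """Hypothetical base class for models that can use the weights."""

    def fit(self, X, y, w=None):
        if w is None:
            w = [1.0] * len(X)  # default: uniform weights
        if len(w) != len(X):
            raise ValueError("length of w must match the number of rows in X")
        return self._fit_weighted(X, y, w)

    def _fit_weighted(self, X, y, w):
        raise NotImplementedError


class MeanModel(Model):
    """Toy unweighted model: predicts the plain mean of y."""

    def _fit(self, X, y):
        self.mean_ = sum(y) / len(y)
        return self


class WeightedMeanModel(WeightedModel):
    """Toy weighted model: predicts the weighted mean of y."""

    def _fit_weighted(self, X, y, w):
        self.mean_ = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
        return self
```

Note that a caller can pass `w` to either model with identical syntax; the unweighted model simply discards it.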
If you want oversampling (and so forth), I'd consider these reduction strategies, hence wrappers (or more generally, first-order compositors).
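The reduction-via-wrapper idea can likewise be sketched (again a hypothetical Python illustration, not MLJ code): an `OversamplingWrapper` compositor resamples observations with probability proportional to their weights, so a weight-unaware learner is fitted on an approximately weighted sample.

```python
import random


def oversample(X, y, w, n=None, seed=0):
    """Resample (X, y) with probability proportional to the weights w."""
    rng = random.Random(seed)
    n = n if n is not None else len(X)
    idx = rng.choices(range(len(X)), weights=w, k=n)
    return [X[i] for i in idx], [y[i] for i in idx]


class OversamplingWrapper:
    """Hypothetical first-order compositor: wraps an unweighted learner and
    contrives sample-weight support via weighted resampling."""

    def __init__(self, learner, seed=0):
        self.learner = learner
        self.seed = seed

    def fit(self, X, y, w=None):
        if w is not None:
            X, y = oversample(X, y, w, seed=self.seed)
        self.learner.fit(X, y)  # the wrapped learner never sees weights
        return self


class CountingLearner:
    """Toy unweighted learner: records how many rows it was fitted on."""

    def fit(self, X, y):
        self.n_ = len(X)
        return self
```

The design choice here is that weight handling lives entirely in the wrapper, so any existing unweighted learner gains (approximate) weight support without changing its own `fit`.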
@fkiraly I like your suggestion to make the fit signature the same for all cases.
Evaluation (as opposed to training) now supports per-observation weights #206
Resolved a while ago.
Some supervised and unsupervised algorithms allow one to weight instances, and currently there is no obvious interface for passing this information to these algorithms.