Is it possible to implement multi task gradient boosting in xgboost? #680
Comments
Are you mostly interested in the multivariate/multilabel subset of multi-task learning (when all outcomes for an instance are known), or do you need to be able to deal with more general transfer learning problems (when some outcomes are not known for some instances)?
@khotilov Actually both. But the first thing I want to figure out is the multivariate/multilabel subset of multi-task learning (when all outcomes for an instance are known).
Maybe you can try to define your own objective function and use the customized gradient to achieve such a feature. Let us move the discussion to the roadmap issue.
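To make the custom-objective suggestion concrete, here is a minimal sketch of the gradient and Hessian for a summed squared-error loss over K related targets, written in plain NumPy. This only illustrates the math; the function name, the (n_samples, K) array layout, and how you would feed the per-column gradients back into `xgb.train` are assumptions, not an API XGBoost provided at the time (a single tree predicted one value, so in practice people trained one booster per target or reshaped the problem).

```python
import numpy as np

def multitask_squared_error(preds, labels):
    """Gradient and Hessian of a summed squared-error loss over K targets.

    preds, labels: arrays of shape (n_samples, K).
    The overall loss is L = 0.5 * sum over i, k of (preds[i, k] - labels[i, k])**2,
    i.e. each response variable contributes an independent squared-error term,
    and the per-element gradients can be handed to the booster column by column.
    """
    grad = preds - labels          # dL/dpred, elementwise
    hess = np.ones_like(preds)     # d2L/dpred2 = 1 for squared error
    return grad, hess
```

The (grad, hess) return convention matches what XGBoost's custom-objective hook expects for a single output; extending it to shared trees across targets is exactly the open question in this thread.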
Suppose I have several related response variables Y that share the same features. Is it possible to minimize the overall loss function across all the response variables, so that the model can learn the relations between them and make predictions for all of them at the same time, similar to the idea of a multi-task neural network?
Does that make sense?
Thanks