custom loss functions #2522
@jeremyhermann Currently, the Python binding supports custom loss functions. The format looks like this:

```python
def custom_loss(preds, dtrain):
    ... do some gory computation ...
    return grad, hess

# at training time
bst = xgboost.train(params, dtrain, 10, eval_list, obj=custom_loss)
```

Things get tricky when your loss function is not twice differentiable. For now, you should look at #1825. Over the next week or two, I plan to do some self-study on gradient boosting. I will get back to you if I have a better idea about supporting non-smooth functions.
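As a concrete illustration of the `(preds, dtrain)` format described above, here is a sketch of a custom binary logistic objective returning the gradient and Hessian of the log loss. The `_FakeDMatrix` class is a hypothetical stand-in for `xgboost.DMatrix` so the snippet runs without xgboost installed; in real use you would pass an actual `DMatrix` and `obj=logreg_obj` to `xgboost.train`.

```python
import numpy as np

def logreg_obj(preds, dtrain):
    """Custom binary logistic objective in the (preds, dtrain) format.

    preds are raw margin scores; dtrain.get_label() returns 0/1 labels.
    """
    labels = dtrain.get_label()
    p = 1.0 / (1.0 + np.exp(-preds))   # sigmoid of the margin
    grad = p - labels                  # first derivative of log loss w.r.t. margin
    hess = p * (1.0 - p)               # second derivative of log loss w.r.t. margin
    return grad, hess

class _FakeDMatrix:
    # Hypothetical stand-in for xgboost.DMatrix, just for demonstration.
    def __init__(self, labels):
        self._labels = np.asarray(labels, dtype=float)

    def get_label(self):
        return self._labels

preds = np.array([0.0, 2.0, -2.0])
grad, hess = logreg_obj(preds, _FakeDMatrix([0, 1, 0]))
```

Both arrays have one entry per training row, which is exactly what the booster expects back from `obj`.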
It would be really nice if Huber loss were supported natively in xgboost. Implementing it in Python works, but it significantly slows down training.
@adamwlev You may use the pseudo-Huber loss (https://en.wikipedia.org/wiki/Huber_loss), which is smooth and twice differentiable.
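For reference, a minimal NumPy sketch of the gradient and Hessian of the pseudo-Huber loss mentioned above. For simplicity it takes labels directly rather than a `DMatrix` (in a real xgboost objective you would call `dtrain.get_label()`); the function name and `delta` default are my own choices, not an xgboost API.

```python
import numpy as np

def pseudo_huber_obj(preds, labels, delta=1.0):
    """Gradient and Hessian of the pseudo-Huber loss
    L(a) = delta^2 * (sqrt(1 + (a/delta)^2) - 1), with a = preds - labels.
    """
    a = preds - labels
    scale = np.sqrt(1.0 + (a / delta) ** 2)
    grad = a / scale              # dL/dpred: ~a near 0, saturates at +-delta
    hess = 1.0 / scale ** 3      # d2L/dpred2: always positive, so trees stay well-behaved
    return grad, hess

grad, hess = pseudo_huber_obj(np.array([0.0, 10.0]), np.array([0.0, 0.0]))
```

Near zero residual this behaves like squared error (gradient ~ a, Hessian ~ 1); for large residuals the gradient flattens toward delta, giving the robustness of absolute error while keeping a usable second derivative.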
@pommedeterresautee Thanks, I am actually using this and I like it. The only thing is that training is 10-15% slower, since the objective is implemented in Python/NumPy rather than natively in xgboost. Unfortunately, I am intimidated by the xgboost code base and don't have the time to invest in learning enough to be able to make this contribution.
We are assessing the viability of moving a number of our modeling projects to xgboost and wanted to understand the feasibility of supporting several new loss functions, using the existing plugin system and/or making modifications to xgboost source code.
Do you think we can get these to work?
If these require changes to xgboost source, we'd be interested in contributing back.