Seems to be related to #5349, but was not resolved by that fix.
Script to reproduce:

```python
import xgboost as xgb

dtrain = xgb.DMatrix("matrix.part0")
param = {
    'booster': "gbtree",
    'tree_method': "approx",
    'max_depth': 6,
    "objective": "reg:gamma",
    "lambda": 1.0,
    'gamma': 0.0,
    'nthread': 16
}
num_round = 50
bst = xgb.train(param, dtrain, num_round, [(dtrain, "train")])
```
Script output:

```
[0]	train-gamma-nloglik:nan
[1]	train-gamma-nloglik:nan
[2]	train-gamma-nloglik:nan
[3]	train-gamma-nloglik:nan
[4]	train-gamma-nloglik:nan
[5]	train-gamma-nloglik:nan
[6]	train-gamma-nloglik:nan
[7]	train-gamma-nloglik:nan
[8]	train-gamma-nloglik:nan
[9]	train-gamma-nloglik:nan
...
```
Data and script attached: nan_metrics.zip

Tested with 1.2.0 and 1.3.3; same result.
I need to restore the estimation of the dispersion parameter.
For now, please use gamma deviance instead; it should be a good metric, since the objective is actually derived from it with a log link.
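For reference, a minimal sketch of what the mean gamma deviance computes (a pure-Python illustration, not XGBoost's internal implementation). Note it is undefined for non-positive labels, just like the gamma negative log-likelihood:

```python
import math

def gamma_deviance(y_true, y_pred):
    """Mean gamma deviance: 2 * mean(-log(y/mu) + (y - mu)/mu).

    Requires strictly positive labels y; a zero label makes the
    log term blow up, which is the same failure mode as gamma-nloglik.
    """
    return 2.0 * sum(
        -math.log(y / mu) + (y - mu) / mu
        for y, mu in zip(y_true, y_pred)
    ) / len(y_true)

# A perfect fit has zero deviance.
print(gamma_deviance([1.0, 2.0], [1.0, 2.0]))  # 0.0
```

In the reproduction script this would correspond to setting `param["eval_metric"] = "gamma-deviance"` before calling `xgb.train`.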
It didn't help. I need to check the data.
Your data contains zeros. You need to do some pre-processing to remove them.
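A sketch of such a pre-processing step, using made-up toy arrays (the actual data lives in `matrix.part0`, so the names and values here are only for illustration): drop every row whose label is not strictly positive before building the `DMatrix`, since `reg:gamma` requires positive targets.

```python
import numpy as np

def drop_nonpositive(X, y):
    """Keep only rows with strictly positive labels, as reg:gamma requires."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    mask = y > 0  # boolean mask selecting valid rows
    return X[mask], y[mask]

# Toy example: one invalid zero label gets filtered out.
X_clean, y_clean = drop_nonpositive([[1.0], [2.0], [3.0]], [0.0, 1.5, 2.5])
print(y_clean)  # [1.5 2.5]
```

The cleaned arrays can then be passed as `xgb.DMatrix(X_clean, label=y_clean)` in place of loading the raw file.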