[R-package] respect 'verbose' argument in lgb.cv() (fixes #4667) #4903
Conversation
R-package/R/lgb.cv.R (Outdated)
@@ -333,6 +333,7 @@ lgb.cv <- function(params = list()
      booster <- Booster$new(params = params, train_set = dtrain)
      booster$add_valid(data = dtest, name = "valid")
+     booster$reset_parameter(params)
note to reviewers: I don't totally understand why this change fixes #4667. At first I was worried that this function might suffer from a larger problem like "lgb.cv() does not respect values passed to params", but that is not true.
Consider the following example. Setting min_gain_to_split to a large number for the binary objective results in no splits, as expected.
library(lightgbm)
data("agaricus.train")
dtrain <- lgb.Dataset(
data = agaricus.train$data
, label = agaricus.train$label
)
lgb.cv(
params = list(
min_gain_to_split = 100000.0
)
, data = dtrain
, verbose = 0L
, nrounds = 3L
, nfold = 2L
, obj = "binary"
)
The logs are full of the following
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
After changing to min_gain_to_split = 0.0, those warnings go away and training produces trees with multiple leaves.
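The effect of the threshold can be illustrated with a toy sketch (a hypothetical helper of my own, not LightGBM's actual split-finding code):

```python
# Toy sketch, NOT LightGBM's implementation: a split is only taken when its
# gain exceeds min_gain_to_split, so an enormous threshold rejects every
# candidate split and training stops with single-leaf trees.

def should_split(split_gain: float, min_gain_to_split: float) -> bool:
    # hypothetical stand-in for LightGBM's internal gain check
    return split_gain > min_gain_to_split

print(should_split(12.5, 100000.0))  # False -> "Stopped training ..." warning
print(should_split(12.5, 0.0))       # True  -> trees with multiple leaves
```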
Maybe this is related to the lack of corresponding code lines in the R package?
LightGBM/python-package/lightgbm/engine.py, lines 222 to 225 in 2ef3cb8:
    finally:
        train_set._reverse_update_params()
        for valid_set in reduced_valid_sets:
            valid_set._reverse_update_params()
Maybe? I'm not really sure. The code in lgb.train() and lgb.cv() in the R package is very similar, and lgb.train() doesn't have this issue with the parameter verbose.
Anyway, I think this fix is still worth merging even if it isn't fully understood... I don't think adding another $update_params() call with the passed-in params introduces any risk.
> I don't think adding another $update_params() with the passed-in params introduces any risk.

Agree. But this introduces a performance degradation, because this method calls a C API function under the hood and is executed for each fold.
> but this introduces performance degradation

Sure, but I think that in this case the cost of calling this method once per fold (one time at setup, not on each iteration, just to be clear) is worth it in exchange for ensuring that lgb.cv()'s behavior is correct.
OK, I'm letting you decide.
OK, looking into this more closely tonight, I think I figured out:
- why verbose is not being respected
- why it seems like ONLY verbose is not being respected while all other parameters work correctly
Short Summary
Two bugs in the R package:
- Dataset$slice() calls LGBM_DatasetGetSubset(), which will globally set the log level to INFO if verbosity isn't in the parameters passed to it. And verbosity is never passed through.
- When Dataset$update_params(new_params) is called on a Dataset that has already been constructed, it doesn't store new_params in memory in the R object (private$params).
I just pushed 619eaf4, which reverts the addition of booster$reset_parameter() and adds fixes for these two bugs.
Much Longer Summary
I was able to reduce the internals of lgb.cv() to the following code, which helps narrow down the issue.
Sample code:
library(lightgbm)
data("agaricus.train", package = "lightgbm")
params <- list(
max_bin = 123
, verbose = -7L
)
dtrain <- lgb.Dataset(
data = agaricus.train$data
, label = agaricus.train$label
, params = params
)
dtest <- slice(
dataset = dtrain
, idxset = seq_len(100L)
)
# with a validation set, you get logs
booster_w_valid <- lightgbm:::Booster$new(
params = params
, train_set = dtrain
)
booster_w_valid$add_valid(data = dtest, name = "valid")
booster_w_valid$update()
# without a validation set, no logs!
booster_no_valid <- lightgbm:::Booster$new(
params = params
, train_set = dtrain
)
booster_no_valid$update()
Root cause: the use of slice() on a Dataset resets LightGBM's log level globally to the default value (INFO).
More details:
- Dataset$slice() gets params from Dataset$get_params().
  LightGBM/R-package/R/lgb.Dataset.R, line 538 in af5b40e:
      , params = self$get_params()
- Dataset$get_params() filters down to only Dataset-specific parameters.
  LightGBM/R-package/R/lgb.Dataset.R, lines 583 to 592 in af5b40e:
      get_params = function() {
        dataset_params <- unname(unlist(.DATASET_PARAMETERS()))
        ret <- list()
        for (param_key in names(private$params)) {
          if (param_key %in% dataset_params) {
            ret[[param_key]] <- private$params[[param_key]]
          }
        }
        return(ret)
      },
- verbose is not in the Dataset-specific parameters.
  LightGBM/R-package/R/aliases.R, line 9 in af5b40e:
      .DATASET_PARAMETERS <- function() {
- Dataset$slice() eventually calls Dataset$construct(), which calls LGBM_DatasetGetSubset(), which runs config.Set().
  Line 1407 in af5b40e:
      config.Set(param);
- config.Set() calls Log::ResetLogLevel(), which resets the log level globally for the current process.
  Lines 241 to 249 in af5b40e:
      if (verbosity == 1) {
        LightGBM::Log::ResetLogLevel(LightGBM::LogLevel::Info);
      } else if (verbosity == 0) {
        LightGBM::Log::ResetLogLevel(LightGBM::LogLevel::Warning);
      } else if (verbosity >= 2) {
        LightGBM::Log::ResetLogLevel(LightGBM::LogLevel::Debug);
      } else {
        LightGBM::Log::ResetLogLevel(LightGBM::LogLevel::Fatal);
      }
- If verbosity isn't found in params, it defaults to INFO level.
  LightGBM/include/LightGBM/config.h, line 543 in af5b40e:
      int verbosity = 1;
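The verbosity-to-log-level mapping above can be modeled in a few lines of Python (a sketch of the behavior; the function name is mine, not LightGBM's):

```python
# Python model of the C++ verbosity -> LogLevel mapping in config.Set().
# A missing 'verbosity' key behaves like verbosity = 1 (the config.h default),
# which is exactly why slicing a Dataset without passing 'verbosity' turns
# INFO logs back on for the whole process.

def effective_log_level(params: dict) -> str:
    verbosity = params.get("verbosity", 1)  # default from config.h
    if verbosity == 1:
        return "Info"
    elif verbosity == 0:
        return "Warning"
    elif verbosity >= 2:
        return "Debug"
    return "Fatal"

print(effective_log_level({}))                 # Info -- the bug's trigger
print(effective_log_level({"verbosity": -7}))  # Fatal (silent)
```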
Dataset.subset() in the Python package doesn't suffer from this issue because it references self.params (the dictionary of parameters stored as an attribute on the Dataset instance) instead of using Dataset.get_params(), so verbose is not filtered out.
LightGBM/python-package/lightgbm/basic.py, line 1860 in af5b40e:
    params = self.params
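The difference between the two packages reduces to a toy contrast (hypothetical names only; DATASET_PARAMETERS below stands in for .DATASET_PARAMETERS() in the R package and is not the real list):

```python
# Toy contrast of the two behaviors; not LightGBM's actual code.
DATASET_PARAMETERS = {"max_bin", "min_data_in_bin"}  # 'verbosity' not listed

def get_params_filtered(params: dict) -> dict:
    # R-package style: keep only Dataset-specific keys, silently dropping
    # 'verbosity' before the subset is constructed
    return {k: v for k, v in params.items() if k in DATASET_PARAMETERS}

params = {"max_bin": 123, "verbosity": -7}
print(get_params_filtered(params))  # {'max_bin': 123} -- 'verbosity' lost
print(dict(params))                 # Python-package style keeps 'verbosity'
```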
At first, I tried to fix this by adding "verbosity" to the list of parameters returned by Dataset$get_params(). Unfortunately, that can lead to a new type of problem: the value of verbosity stored in a constructed Dataset will take precedence over values passed to lgb.train() / lgb.cv() if that Dataset is re-used multiple times.
LightGBM/R-package/R/lgb.Booster.R, line 42 in af5b40e:
    params <- utils::modifyList(params, train_set$get_params())
Next, I tried changing Dataset$slice() to use private$params instead of Dataset$get_params(). That uncovered another bug: once the training Dataset was constructed, verbosity was locked in to the value from the last time it was constructed.
- When calling Dataset$construct() on a Dataset that has already been constructed, R calls LGBM_DatasetUpdateParamChecking().
  LightGBM/R-package/R/lgb.Dataset.R, line 562 in af5b40e:
      LGBM_DatasetUpdateParamChecking_R
- That call is wrapped in tryCatch(). If it fails, Dataset$construct() does the right thing based on whether or not raw data is still available. However, if that call does NOT fail, private$params is not updated!
  LightGBM/R-package/R/lgb.Dataset.R, lines 559 to 577 in af5b40e:
      } else {
        tryCatch({
          .Call(
            LGBM_DatasetUpdateParamChecking_R
            , lgb.params2str(params = private$params)
            , lgb.params2str(params = params)
          )
        }, error = function(e) {
          # If updating failed but raw data is not available, raise an error because
          # achieving what the user asked for is not possible
          if (is.null(private$raw_data)) {
            stop(e)
          }
          # If updating failed but raw data is available, modify the params
          # on the R side and re-set ("deconstruct") the Dataset
          private$params <- utils::modifyList(private$params, params)
          self$finalize()
        })
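The second bug can be illustrated with a toy model (a hypothetical Python class of my own, not the R package's actual code): when the parameter check succeeds, the new params are never written back to the object.

```python
# Toy model of the Dataset$update_params() bug: new params are only stored
# on the failure path of the try/catch, so after a *successful* parameter
# check the object still reports its old parameters.

class ToyDataset:
    def __init__(self, params):
        self.params = dict(params)

    def update_params_buggy(self, new_params, check_fails=False):
        if check_fails:
            # failure path: params ARE merged before the Dataset is re-set
            self.params.update(new_params)
        # success path: nothing is stored -- this is the bug

ds = ToyDataset({"max_bin": 255})
ds.update_params_buggy({"verbosity": -1})  # check succeeds
print(ds.params)  # {'max_bin': 255} -- the new 'verbosity' was silently lost
```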
Although I'm not fully satisfied by this fix, which causes a performance degradation, and I believe there may be a way of specifying params without resetting them, I don't want to block this PR from merging. Please consider this my approval, just as a comment that allows you to click the "Merge" button. This isn't a "real" review because, like you (#4903 (comment)), I don't understand why this fixes the original issue.
OK, thanks @StrikerRUS! I'm going to try to investigate a bit more before merging this.
Happy to say I found the root cause for this bug, and was able to revert the additional call to booster$reset_parameter(). This taught me something important about LightGBM's internals: any time config.Set() is called (lines 241 to 249 in af5b40e), the log level is reset globally for the current process.
This becomes important considering #4904 (comment): any code creating a Config can reset the process-wide log level.
Brilliant investigation! I'm super excited that you've uncovered a deep underlying bug. Thank you very much!
Co-authored-by: Nikita Titov <nekit94-08@mail.ru>
Fixes #4667.
On master, the tests added in this PR fail and the code below prints INFO- and WARNING-level logs. As of this PR, the tests pass and no logs are produced.