[Python] Allowing custom objective / metric function with "objective" and "metric" parameters #3244
Comments
thanks @daanklijn
Sure.
@daanklijn Will passing a callable metric/loss in the standard params suit your needs?
LightGBM/python-package/lightgbm/sklearn.py, lines 519 to 521 at 78d31d9
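For context, the scikit-learn interface referenced above already accepts callables for the objective and for `eval_metric`. A minimal, hedged sketch of that existing usage (the squared-error gradient/hessian and the MAE metric below are only illustrative choices, not code from this thread):

```python
import numpy as np
import lightgbm as lgb

def l2_objective(y_true, y_pred):
    """Custom objective for the sklearn API: returns (gradient, hessian)."""
    grad = y_pred - y_true
    hess = np.ones_like(y_true)
    return grad, hess

def mae_metric(y_true, y_pred):
    """Custom eval metric for the sklearn API: returns (name, value, is_higher_better)."""
    return 'custom_mae', float(np.mean(np.abs(y_true - y_pred))), False

# The estimator takes the callable objective directly; the callable metric
# is passed to fit() via eval_metric.
model = lgb.LGBMRegressor(objective=l2_objective)
# model.fit(X_train, y_train, eval_set=[(X_valid, y_valid)], eval_metric=mae_metric)
```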
Closed in favor of #2302. We decided to keep all feature requests in one place. Welcome to contribute this feature! Please re-open this issue (or post a comment if you are not the topic starter) if you are actively working on implementing this feature.
@StrikerRUS, @jameslamb I'm open to developing this. I will consider starting with the following: I think a logic similar to the loggers can be used, that is, first register the metric (like `register_logger`) and then retrieve it inside `train`:

```python
fobj = fobj or _retrieve_objective()  # either passed argument or registered callable
if fobj is not None:
    for obj_alias in _ConfigAliases.get("objective"):
        params.pop(obj_alias, None)
    params['objective'] = 'none'
```

What are your thoughts?
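For illustration only, a registry along those lines might look like the following; `register_objective`, `_retrieve_objective`, and `_OBJECTIVE_REGISTRY` are hypothetical names (only `register_logger` actually exists in LightGBM), and this variant looks the registered callable up by the name stored in params:

```python
# Hypothetical registry sketch; none of these names are real LightGBM APIs.
_OBJECTIVE_REGISTRY = {}

def register_objective(name, func):
    """Register a custom objective under a string name, similar in spirit to register_logger."""
    _OBJECTIVE_REGISTRY[name] = func

def _retrieve_objective(params):
    """Return the registered callable if params['objective'] names one, else None."""
    return _OBJECTIVE_REGISTRY.get(params.get("objective"))
```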
@TremaMiguel Sorry, I'm not 100% sold on registering custom functions. I believe that allowing callables to be passed directly to the `objective`/`metric` parameters would be preferable.
I agree completely with @StrikerRUS
Agreed, I think registering custom callables could bring some unnecessary logic. I've found this issue listed on #2302; should it be marked as completed or removed from the list?
I believe we can wait for some other opinions. Or this issue can be transformed into a feature request for allowing custom callables via the `objective`/`metric` parameters.
I agree with @StrikerRUS, I'd support passing callables via these parameters. For more inspiration, the R package also supports passing a mix of strings and custom functions: LightGBM/R-package/tests/testthat/test_basic.R, lines 1412 to 1429 at d31346f
I agree with @jameslamb @StrikerRUS. Allowing a customized function to be passed with the "objective" or "metric" parameter can also fulfill the need described in the motivation of this feature request. I'll change the description in #2302.
@jameslamb I'm missing some context here; as I understand it …
We are saying we'd support a pull request making it possible to do something like the following:

```python
bst = lgb.train(
    ...,
    params = {
        "metric": ["auc", _some_function_i_wrote],
    }
)
```

And the same for `objective`. Like the code linked in #3244 (comment).
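For reference, a custom metric passed this way would keep the usual `feval` signature used by `lgb.train`; a hedged placeholder for `_some_function_i_wrote`:

```python
import numpy as np

def _some_function_i_wrote(preds, eval_data):
    """Custom eval metric: (raw predictions, lgb.Dataset) -> (name, value, is_higher_better)."""
    y_true = eval_data.get_label()
    rmse = float(np.sqrt(np.mean((preds - y_true) ** 2)))
    return 'my_rmse', rmse, False
```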
@jameslamb @StrikerRUS so basically, to achieve

```python
bst = lgb.train(
    ...,
    params = {
        "objective": "regression",  # or a custom function: "objective": custom_objective_function
        "metric": ["auc", _some_function_i_wrote],
    }
)
```

one needs to get the following done: … Is there anything missing?
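Likewise, the callable passed under "objective" would keep the usual `fobj` signature; a minimal illustrative `custom_objective_function` (squared error chosen only as an example):

```python
import numpy as np

def custom_objective_function(preds, train_data):
    """Custom objective: (raw predictions, lgb.Dataset) -> (gradient, hessian)."""
    y_true = train_data.get_label()
    grad = preds - y_true         # gradient of squared error
    hess = np.ones_like(preds)    # hessian of squared error
    return grad, hess
```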
I think we can remove …
Firstly, I guess we can simply transfer this logic (LightGBM/python-package/lightgbm/sklearn.py, lines 668 to 675 at 01568cf) into basic.py.
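Roughly, the referenced sklearn-side logic normalizes `eval_metric` into a list and separates string metrics (kept in `params`) from callables (forwarded as evaluation functions). A hedged sketch of what that could look like once moved into `basic.py`; this is not the actual LightGBM code:

```python
def _split_metric_param(params):
    """Split params['metric'] into built-in metric names and custom callables (sketch only)."""
    metrics = params.get("metric", [])
    if not isinstance(metrics, list):
        metrics = [metrics]
    callables = [m for m in metrics if callable(m)]
    params["metric"] = [m for m in metrics if isinstance(m, str)]
    return params, callables

# Example: params, feval_functions = _split_metric_param({"metric": ["auc", _some_function_i_wrote]})
```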
@StrikerRUS Ohh I see, I think this one is kind of easy, one just needs to handle:

```python
# objective is callable
f_obj = params['objective']
if callable(f_obj):
    fobj = f_obj  # keep the callable before clearing the aliases
    for obj_alias in _ConfigAliases.get("objective"):
        params.pop(obj_alias, None)
    params['objective'] = None
elif isinstance(f_obj, str):
    for obj_alias in _ConfigAliases.get("objective"):
        params.pop(obj_alias, None)
```

and no need to add a list of … P.S.: I think there's an improvement to be made in the docs regarding …
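The merged implementation (see the commit list below) ultimately routed this through `_choose_param_value`, which resolves parameter aliases down to a single key. A hedged sketch of how a callable objective could then be extracted before training; this is not a verbatim copy of LightGBM's code:

```python
from lightgbm.basic import _choose_param_value  # private helper that resolves parameter aliases

def _extract_objective(params):
    """Sketch: pull a callable objective out of params so it can be used like fobj."""
    params = _choose_param_value(main_param_name="objective", params=params, default_value=None)
    fobj = None
    if callable(params["objective"]):
        fobj = params["objective"]
        params["objective"] = "none"  # tell the library not to apply a built-in objective
    return params, fobj
```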
…unction in params (fixes #3244) (#5052)

* feat: support custom metrics in params
* feat: support objective in params
* test: custom objective and metric
* fix: imports are incorrectly sorted
* feat: convert eval metrics str and set to list
* feat: convert single callable eval_metric to list
* test: single callable objective in params
* feat: callable fobj in basic cv function
* test: cv support objective callable
* fix: assert in cv_res
* docs: objective callable in params
* recover test_boost_from_average_with_single_leaf_trees
* linters fail
* remove metrics helper functions
* feat: choose objective through _choose_param_values
* test: test objective through _choose_param_values
* test: test objective is callable in train
* test: parametrize choose_param_value with objective aliases
* test: cv booster metric is none
* fix: if string and callable choose callable
* test train uses custom objective metrics
* test: cv uses custom objective metrics
* refactor: remove fobj parameter in train and cv
* refactor: objective through params in sklearn API
* custom objective function in advanced_example
* fix whitespaces lint
* objective is none not a particular case for predict method
* replace scipy.expit with custom implementation
* test: set num_boost_round value to 20
* fix: custom objective default_value is none
* refactor: remove self._fobj
* custom_objective default value is None
* refactor: variables name reference dummy_obj
* linter errors
* fix: process objective parameter when calling predict
* linter errors
* fix: objective is None during predict call

Signed-off-by: Miguel Trejo <armando.trejo.marrufo@gmail.com>
Summary
It would be nice if one could register custom objective and loss functions, so that these can be passed into LightGBM's `train` function via the `param` argument.

Motivation
I believe that this would be a nice feature, as it allows for easier hyperparameter tuning. Currently one would need to write logic that retrieves the custom function based on its name in a config file and pass it into the train function's `feval` and `fobj` arguments. It would be much easier and cleaner if one could just register these custom functions so that they can be passed in like the standard objective functions and eval metrics.

Description
Currently one would do something like this to use custom objective functions and evaluation metrics:
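The original snippet did not survive here; below is a minimal reconstruction (an assumption, not the author's exact code) of the approach at the time, where the callables go through the dedicated `fobj` and `feval` arguments of `lgb.train`:

```python
import numpy as np
import lightgbm as lgb

# Synthetic data just to make the example self-contained.
X = np.random.rand(100, 5)
y = np.random.rand(100)
train_data = lgb.Dataset(X, label=y)

def my_objective(preds, dtrain):
    """Custom objective: gradient and hessian of squared error."""
    residual = preds - dtrain.get_label()
    return residual, np.ones_like(residual)

def my_metric(preds, dval):
    """Custom eval metric: (name, value, is_higher_better)."""
    rmse = float(np.sqrt(np.mean((preds - dval.get_label()) ** 2)))
    return 'my_rmse', rmse, False

# API as of the time of this issue: custom callables are separate arguments,
# not part of params (fobj was later removed in favor of params["objective"]).
bst = lgb.train(
    params={"learning_rate": 0.05, "verbosity": -1},
    train_set=train_data,
    num_boost_round=10,
    fobj=my_objective,
    feval=my_metric,
    valid_sets=[train_data],
)
```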
I would propose to also allow something similar to this:
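A reconstruction of the proposed usage (again an assumption about the missing snippet): the same callables accepted directly inside `params`, alongside built-in names, which is what would make config-driven hyperparameter tuning simpler. Note this would not run on the LightGBM version current at the time of the request:

```python
import lightgbm as lgb

# Proposed API: callables live in params next to built-in objective/metric names.
# my_objective, my_metric and train_data are reused from the example above.
bst = lgb.train(
    params={
        "objective": my_objective,
        "metric": ["rmse", my_metric],
        "learning_rate": 0.05,
    },
    train_set=train_data,
    num_boost_round=10,
    valid_sets=[train_data],
)
```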