
[Enhancement] Add default ranges for hyperparameter tuning #6034

Open
rohan-gt opened this issue Aug 19, 2020 · 2 comments

Comments

@rohan-gt

Is it possible to add good, conservative default ranges for each hyperparameter to the documentation for tuning purposes? By this I mean something like [0, 100] instead of [0, Infinity], based on results from testing on many different datasets, so that these ranges can be used directly for tuning without having to work out the right values each time.

Alternatively, this information could be embedded into the algorithm class itself so that HPO packages can pick it up directly (see the sketch below).
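
For illustration, a minimal sketch of what embedding ranges in the class might look like, assuming a hypothetical `param_ranges` class attribute. Nothing like this exists in XGBoost today, and the bounds shown are placeholders, not vetted defaults:

```python
# Hypothetical sketch only: illustrates attaching tuning ranges to the
# estimator class so HPO packages can discover them generically.
from xgboost import XGBClassifier


class XGBClassifierWithRanges(XGBClassifier):
    # Conservative default search ranges, keyed by parameter name.
    # Tuples are (lower, upper, log_scale); values here are illustrative.
    param_ranges = {
        "learning_rate": (0.01, 0.3, True),
        "max_depth": (3, 10, False),
        "subsample": (0.5, 1.0, False),
        "min_child_weight": (1, 10, False),
    }


# An HPO package could then build its search space without user input:
for name, (lower, upper, log) in XGBClassifierWithRanges.param_ranges.items():
    print(f"{name}: [{lower}, {upper}]{' (log scale)' if log else ''}")
```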

@trivialfis
Member

There's #4986, can that help?

@rohan-gt
Author

@trivialfis that issue is about static default parameter values. I'm looking for default parameter ranges that can then be tuned with an optimization algorithm like BOHB, for example along the lines of the sketch below.
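
For illustration, such published ranges could be expressed as a ConfigSpace search space, which is the format consumed by HpBandSter's BOHB optimizer. This is a minimal sketch; the parameter bounds are assumptions for demonstration, not recommendations from this issue:

```python
# Minimal sketch: assumed tuning ranges expressed as a ConfigSpace search
# space (the input format for HpBandSter's BOHB). Bounds are illustrative.
from ConfigSpace import ConfigurationSpace
from ConfigSpace.hyperparameters import (
    UniformFloatHyperparameter,
    UniformIntegerHyperparameter,
)

cs = ConfigurationSpace(seed=0)
cs.add_hyperparameters([
    UniformFloatHyperparameter("learning_rate", lower=1e-3, upper=0.3, log=True),
    UniformIntegerHyperparameter("max_depth", lower=2, upper=12),
    UniformFloatHyperparameter("subsample", lower=0.5, upper=1.0),
])

# Sample a configuration the way an optimizer would during a tuning run:
print(cs.sample_configuration())
```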
