Save model on every iteration #5178
Hello @Qashqay, thank you for your interest in LightGBM. This isn't implemented in the library, but since there are callbacks, it's very easy to achieve by implementing your own callback. Every callback takes a single argument (the callback environment):

```python
import lightgbm as lgb
import numpy as np

def save_model_callback(env):
    env.model.save_model(f'booster_{env.iteration}.txt')

ds = lgb.Dataset(np.random.rand(100, 2), np.random.rand(100))
model = lgb.train({'num_leaves': 10}, ds, num_boost_round=5, callbacks=[save_model_callback])
```

After running this you should see 5 files in your current directory (one for each iteration). You can modify the logic of that function to save only every x iterations or to save to a different path. Please let us know if this helps.
Great, thank you. That's exactly what I wanted.
Great. I'm closing this issue but feel free to reopen if you run into some problem using this.
Running on Windows 10, I tried the above code:

```python
def save_model_callback(env):
ds = lgb.Dataset(np.random.rand(100, 2), np.random.rand(100))
```

Error:
On Ubuntu it works!
@gaebw I suspect that maybe you've built a newer version of the Python code (using …). Please try building …
This issue has been automatically locked since there has not been any recent activity since it was closed. To start a new related discussion, open a new issue at https://github.com/microsoft/LightGBM/issues including a reference to this.
Hello,

Is it possible to save an intermediate model on every boosting iteration with the standard `train` function? For example:

and in `/tmp/` we will have 10 files, identical in format to `model.save_model('/tmp/model.txt')`.

Thank you