[AutoParallel] Support save model for auto trainer #8927

Merged
merged 1 commit into from
Aug 21, 2024
13 changes: 13 additions & 0 deletions paddlenlp/trainer/auto_trainer.py
@@ -49,6 +49,7 @@
MODEL_NAME = "model"
OPTIMIZER_NAME = "optimizer"
DIST_CKPT_PATH = "dist_ckpt"
DIST_MODEL_PATH = "dist_model"

Codecov / codecov/patch warning on line 52 in paddlenlp/trainer/auto_trainer.py: added line #L52 was not covered by tests.
FREE_SVAE_LOAD_KEY_PATTERNS = ["learning_rate_", "gradient_merge_", "@GRAD@MERG", "eager_tmp"]


@@ -552,6 +553,18 @@
        with _exec_mode_guard("dynamic"):
            super()._maybe_log_save_evaluate(tr_loss, model, epoch, ignore_keys_for_eval, **kwargs)

    def _save_model(self):
        if not self.args.to_static:
            return
        with _exec_mode_guard("static"):
            output_dir = f"{self.args.output_dir}/{DIST_MODEL_PATH}"
            os.makedirs(output_dir, exist_ok=True)
            logger.info(f"Saving model files into {output_dir}")
            model_file = os.path.join(output_dir, "rank_" + str(paddle.distributed.get_rank()) + ".pd_dist_model")
            if os.path.exists(model_file):
                os.remove(model_file)
            paddle.save(self.model_wrapped.dist_main_program("train"), model_file)

Codecov / codecov/patch warning on line 566 in paddlenlp/trainer/auto_trainer.py: added lines #L556 - #L566 were not covered by tests.

    def _save_checkpoint(self, model, metrics=None):

        # Save model checkpoint