【AutoParallelism】fix dataloader bug and add ci for static (#8014)
* fix dataloader bug and add ci for static

* change dy2st ci

* disable fused_rms_norm

* disable pir-executor

* add loss on V100

* polish

* polish

* delete dy2st ci

* delete dy2st ci
heavyrain-lzy committed Mar 4, 2024
1 parent 77a23d1 commit 17cc169
Showing 1 changed file with 5 additions and 2 deletions.
paddlenlp/trainer/auto_trainer.py (5 additions, 2 deletions)
```diff
@@ -92,9 +92,12 @@ def _get_meshes_for_loader(self):
         def _get_mesh(pp_idx=0):
             return self.global_mesh.get_mesh_with_dim("pp")[pp_idx]

+        # Note(lizhiyu): If the values returned by `DataLoader` don't have the format `[images, labels]`,
+        # error may occurs here.
         meshes = []
-        for pp_idx in range(self.args.pipeline_parallel_degree):
-            meshes.append(_get_mesh(pp_idx))
+        meshes.append(_get_mesh(0))
+        if self.args.pipeline_parallel_degree > 1:
+            meshes.append(_get_mesh(self.args.pipeline_parallel_degree - 1))
         return meshes

     def _wrap_for_dist_loader(self, train_dataloader):
```
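The patch changes the dataloader mesh list from one mesh per pipeline stage to just the first and last stages: inputs are fed on the first `pp` stage and labels on the last, so intermediate stages need no loader mesh. A minimal self-contained sketch of the new selection logic (stand-in names, not PaddleNLP's actual classes):

```python
# Sketch of the patched _get_meshes_for_loader behavior. `get_mesh` stands in
# for `self.global_mesh.get_mesh_with_dim("pp")[pp_idx]` in the real trainer.

def get_loader_meshes(pp_degree, get_mesh):
    """Return meshes the distributed dataloader needs: first stage (inputs),
    and, when pipeline parallelism is on, the last stage (labels)."""
    meshes = [get_mesh(0)]  # inputs always live on the first pp stage
    if pp_degree > 1:
        meshes.append(get_mesh(pp_degree - 1))  # labels on the last stage
    return meshes

# Hypothetical stand-in meshes for a 4-stage pipeline.
fake_meshes = [f"pp_mesh_{i}" for i in range(4)]

print(get_loader_meshes(1, fake_meshes.__getitem__))  # ['pp_mesh_0']
print(get_loader_meshes(4, fake_meshes.__getitem__))  # ['pp_mesh_0', 'pp_mesh_3']
```

With `pp_degree == 1` the two roles collapse onto the single stage, which is why the old per-stage loop was replaced rather than merely shortened.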
