
For runSeed #16

Open
xb-li opened this issue Jan 10, 2024 · 1 comment
xb-li commented Jan 10, 2024

I encountered a problem while running the project. I am not sure how to set runSeed in the train function. I hope to receive help. Thank you!
```python
def train(workerId, nWorker, filename, runSeed, args):
```

```python
if num_processes == 1: train(0, 1, saved_filename)
```

@Yujia-Yan
Owner

Hi, the runSeed only affects the order of the data in each epoch when using torch.distributed (num_processes > 1) for data parallelism. Any integer will work fine.
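To illustrate the maintainer's point, here is a minimal self-contained sketch (not the project's actual code; `epoch_order` and its parameters are hypothetical) of how a run seed typically governs per-epoch data order in data-parallel training: every worker seeds a generator with `runSeed + epoch`, so all workers agree on one shuffled order and then take disjoint shards of it. With a single process, the seed value does not affect correctness, since every sample is still visited exactly once per epoch.

```python
import random

def epoch_order(n_samples, run_seed, epoch, worker_id, n_workers):
    """Return the shard of sample indices this worker sees in a given epoch.

    All workers seed from (run_seed + epoch), so they produce the same
    shuffled order and take disjoint, interleaved shards of it.
    """
    indices = list(range(n_samples))
    random.Random(run_seed + epoch).shuffle(indices)
    return indices[worker_id::n_workers]

# Single worker (num_processes == 1): the shard is the whole epoch,
# so the choice of run_seed only changes the visiting order.
order = epoch_order(10, run_seed=42, epoch=0, worker_id=0, n_workers=1)
assert sorted(order) == list(range(10))
```

This mirrors why any integer is acceptable: the seed only needs to be the same across workers, not any particular value.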
