
training loss is high #5

Open
TongWeiDP opened this issue Jan 7, 2021 · 1 comment
@TongWeiDP

When I use your pre-trained model for training, the initial loss is very high (200+), and it stays high over the first several epochs. The batch_size is 1 and the other parameters are unchanged. What could be causing this? Looking forward to your reply, thanks!
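(For anyone hitting this: one common cause of a high initial loss despite loading a pre-trained model is a checkpoint whose parameter names don't match the model, so the weights are silently left at random initialization. This is only a guess about this repo; the sketch below is framework-agnostic and mimics a strict state-dict key check, with all names purely illustrative.)

```python
def check_checkpoint_keys(model_keys, checkpoint_keys):
    """Return (missing, unexpected) keys, like strict state-dict loading.

    `missing` keys stay at their random init; `unexpected` keys often come
    from a prefix mismatch, e.g. checkpoints saved under DataParallel.
    """
    model_keys = set(model_keys)
    checkpoint_keys = set(checkpoint_keys)
    missing = sorted(model_keys - checkpoint_keys)
    unexpected = sorted(checkpoint_keys - model_keys)
    return missing, unexpected


# Hypothetical example: a checkpoint saved from a wrapped model gains a
# "module." prefix, so *every* weight fails to load and training
# effectively starts from scratch (hence a very high initial loss).
model_keys = ["conv1.weight", "conv1.bias"]
ckpt_keys = ["module.conv1.weight", "module.conv1.bias"]
missing, unexpected = check_checkpoint_keys(model_keys, ckpt_keys)
print(missing)      # -> ['conv1.bias', 'conv1.weight']
print(unexpected)   # -> ['module.conv1.bias', 'module.conv1.weight']
```

If the missing/unexpected lists are non-empty, stripping or adding the offending prefix before loading usually restores the expected starting loss.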

@atztao

atztao commented Jul 20, 2021

Same issue here.
