
Chapter 7 fine-tune code optimization: SSC task training time on CPU drops from 36 hours to 2 hours #12

Open
renmengjie7 opened this issue Mar 27, 2022 · 2 comments
@renmengjie7

Hello, I found a spot in the Chapter 7 code that can be optimized. In the tokenizer function, padding='max_length' can be removed, since it wastes compute. The data_collator parameter of the Trainer provided by transformers defaults to dynamic padding, i.e. padding each batch only to its longest sequence, which saves computation.

Running on my CPU, the training time dropped from 36 hours to 2 hours (the run did not finish; this is the estimate shown by the progress bar).
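For reference, a minimal sketch of the suggested change, assuming a typical transformers fine-tuning setup; this is not the book's actual Chapter 7 script, and the model name, data file, and column name below are placeholders:

```python
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    DataCollatorWithPadding,
    Trainer,
    TrainingArguments,
)

# Placeholder model; the book's Chapter 7 code may use a different checkpoint.
tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-chinese", num_labels=2
)

def tokenize_function(examples):
    # No padding="max_length" here: sequences stay unpadded and are
    # padded per batch by the data collator below.
    return tokenizer(examples["text"], truncation=True)

# Placeholder data file and column name for illustration only.
dataset = load_dataset("csv", data_files={"train": "train.csv"})
tokenized = dataset.map(tokenize_function, batched=True)

# DataCollatorWithPadding pads each batch only to the longest sequence
# in that batch (dynamic padding). When a tokenizer is passed to Trainer
# and no collator is given, this is also the Trainer's default behavior.
data_collator = DataCollatorWithPadding(tokenizer=tokenizer)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="output", per_device_train_batch_size=16),
    train_dataset=tokenized["train"],
    data_collator=data_collator,
    tokenizer=tokenizer,
)
trainer.train()
```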

@renmengjie7
Author

The 36-to-2-hour change refers to the training time on the SSC task.

renmengjie7 changed the title from "Chapter 7 fine-tune code optimization: 36 hours to 2 hours on CPU" to "Chapter 7 fine-tune code optimization: SSC task training time on CPU drops from 36 hours to 2 hours" on Mar 27, 2022
@ymcui
Collaborator

ymcui commented Mar 29, 2022

Thank you for the suggestion; we will take it into account in a future optimization. Thanks!
