Weight merging problem #34
(moss) dk@user-SYS-4029GP-TRT2:~/Downloads/MOSS-RLHF-main$ python merge_weight_en.py recover --path_raw /data/dk_downloads/llama-7b-hf --path_diff /data/dk_downloads/sft_model/diff --path_tuned ./models/moss-rlhf-sft-model-7B-en/recover --model_type sft
The detailed error information is above.
+1, I ran into this too.
Solved it: just install the packages listed in requirements. Run pip install -r requirements.txt from the command line; the flash-attn entry in it may fail to install, so I simply deleted it and hope that won't cause problems later...
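If it helps anyone: dropping flash-attn usually only matters if the code imports it unconditionally. A common way to make it optional is shown in this generic sketch (illustrative only; this is not necessarily how MOSS-RLHF handles it):

```python
# Generic optional-dependency pattern (illustrative; not taken from MOSS-RLHF):
# fall back gracefully when flash-attn is not installed.
try:
    import flash_attn  # noqa: F401  # may be absent if removed from requirements
    HAS_FLASH_ATTN = True
except ImportError:
    HAS_FLASH_ATTN = False

def attention_impl() -> str:
    """Choose an attention implementation based on what is installed."""
    return "flash_attention_2" if HAS_FLASH_ATTN else "eager"
```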
Hello, other users have run into this before; it is indeed likely a package-version problem. Essentially, some layers are loaded along with the model (for example, the rotary position embeddings), but those layers are not needed during the merge.
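To make that concrete, here is a minimal sketch of a diff-based recover step that skips such layers. The function name, the `rotary_emb.inv_freq` key substring, and the paths in the usage comment are illustrative assumptions, not the actual code of merge_weight_en.py:

```python
# Hypothetical helper; names and paths are illustrative, not the script's actual code.
SKIP_SUBSTRINGS = ("rotary_emb.inv_freq",)  # buffers created at load time, not merged

def recover_state_dict(raw_sd, diff_sd):
    """Recover tuned weights as raw + diff, skipping buffers the merge never uses."""
    recovered = {}
    for name, diff in diff_sd.items():
        if any(s in name for s in SKIP_SUBSTRINGS):
            continue  # present after loading the model, but irrelevant to the merge
        recovered[name] = raw_sd[name] + diff
    return recovered

# Usage sketch (paths are placeholders):
#   import torch
#   raw_sd  = torch.load("llama-7b-hf/pytorch_model.bin", map_location="cpu")
#   diff_sd = torch.load("sft_model/diff/pytorch_model.bin", map_location="cpu")
#   tuned_sd = recover_state_dict(raw_sd, diff_sd)
```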
flash-attn needs access to external networks; installation is not very stable on our side either, but retrying a few times usually succeeds.
Merging 7b-hf with the reward model, the SFT model, and the policy model all fail with the same error:
Naive integrity check failed. This could imply that some of the checkpoint files are corrupted.
I downloaded each model twice, so the models themselves should be fine; something else must be going wrong.
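For context, an error with this wording usually comes from a sanity check that sums every recovered parameter and compares the total against a hard-coded reference value. Below is a minimal sketch of such a check, assuming that design (the function name and tolerance are illustrative). Note that a base llama-7b-hf checkpoint from a different source or revision will also shift the sum and trip the check even when no file is corrupted:

```python
import torch

def naive_integrity_check(recovered_sd, expected_sum, atol=1e-2):
    """Sum all recovered parameters and compare against a stored reference value.

    A mismatch does not have to mean corrupted files: a base checkpoint from a
    different source or revision also changes the total.
    """
    total = sum(t.float().sum() for t in recovered_sd.values())
    if not torch.isclose(total, torch.tensor(float(expected_sum)), atol=atol, rtol=0.0):
        raise ValueError(
            "Naive integrity check failed. This could imply that some of the "
            "checkpoint files are corrupted."
        )
```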