
How do I fix this error? #368

Open
cwtcwt123 opened this issue Aug 16, 2024 · 9 comments

Comments

@cwtcwt123

After setting up the whole environment, running web_demo.py fails with: sat.model.transformer.BaseTransformer() got multiple values for keyword argument 'parallel_output'. How do I fix this?
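For background, this TypeError is Python's standard complaint when the same keyword argument reaches a function twice, typically once explicitly and once again through **kwargs. A minimal illustration of the mechanism (hypothetical names, not code from the sat package):

def base_transformer(num_layers, parallel_output=True, **kwargs):
    # stand-in for sat.model.transformer.BaseTransformer.__init__ (illustrative only)
    print(num_layers, parallel_output, kwargs)

args = {"parallel_output": False, "hidden_size": 4096}
# parallel_output arrives both explicitly and inside **args, so the call raises:
# TypeError: base_transformer() got multiple values for keyword argument 'parallel_output'
base_transformer(28, parallel_output=True, **args)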

@corkiyao

After setting up the whole environment, running web_demo.py fails with: sat.model.transformer.BaseTransformer() got multiple values for keyword argument 'parallel_output'. How do I fix this?

Try updating to the latest sat; I ran into this problem before as well.
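If the package was installed from PyPI (rather than a source checkout), upgrading it in place should look roughly like this:

pip install -U SwissArmyTransformer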

@cwtcwt123
Author

It still doesn't work. Whether I install the latest release or install from source, I always end up with the same error.

@corkiyao

It still doesn't work. Whether I install the latest release or install from source, I always end up with the same error.

Could it be that an old cached model was being loaded? After I updated, the problem went away for me.
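If stale cached weights are the culprit, clearing the SAT checkpoint cache forces a fresh download. This assumes SAT caches models under ~/.sat_models (or under whatever the SAT_HOME environment variable points to), and the folder name below is only a guess, so check before deleting:

ls ~/.sat_models                      # confirm where the cached checkpoint actually lives
rm -rf ~/.sat_models/visualglm-6b     # hypothetical folder name; remove the stale copy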

@jiwei08

jiwei08 commented Sep 11, 2024

I'm running into the same problem. Did you (OP) manage to solve it?

@corkiyao

I'm running into the same problem. Did you (OP) manage to solve it?

import os
import torch
import argparse

from SwissArmyTransformer.sat import mpu, get_args, get_tokenizer
from SwissArmyTransformer.sat.training.deepspeed_training import training_main
from model import VisualGLMModel
from SwissArmyTransformer.sat.model.finetune import PTuningV2Mixin
from SwissArmyTransformer.sat.model.finetune.lora2 import LoraMixin

Is this how you do the imports in finetune_visualglm.py? Loading the packages this way works for me without any problem.
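A quick, generic way to see which copy of the library a script actually picks up (the pip-installed sat package vs. a local SwissArmyTransformer source checkout) is to print the imported module's file path; this is just a diagnostic one-liner, not something from the repo:

python -c "import sat; print(sat.__file__)"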

@jiwei08

jiwei08 commented Sep 11, 2024

I just run this directly in the Python interpreter:

from transformers import AutoTokenizer, AutoModel
model = AutoModel.from_pretrained("~/visualglm-6b", trust_remote_code=True).half().cuda()

and it also fails with TypeError: sat.model.transformer.BaseTransformer() got multiple values for keyword argument 'parallel_output'.

@corkiyao

I just run this directly in the Python interpreter:

from transformers import AutoTokenizer, AutoModel
model = AutoModel.from_pretrained("~/visualglm-6b", trust_remote_code=True).half().cuda()

and it also fails with TypeError: sat.model.transformer.BaseTransformer() got multiple values for keyword argument 'parallel_output'.

It seems the maintainers no longer update the Hugging Face weights and only update the SAT model, so after the library was updated the Hugging Face model no longer matches the installed version.

@zykdhr

zykdhr commented Sep 20, 2024

I also hit this error when running the fine-tuning script (error screenshot omitted).

The fix is to roll back to the older version:
pip uninstall SwissArmyTransformer
pip install SwissArmyTransformer==0.4.11
With the older version everything works.
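To confirm which version actually ended up installed after the downgrade, pip's own metadata query is enough:

pip show SwissArmyTransformer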

@corkiyao

I also hit this error when running the fine-tuning script (error screenshot omitted).

The fix is pip uninstall SwissArmyTransformer followed by pip install SwissArmyTransformer==0.4.11; the older version works.

Deleting the corresponding parallel_output argument in the new SAT==0.4.12 also works. Most likely the parameter handling wasn't cleaned up when the version was bumped, so parallel_output ends up being supplied twice when the arguments are forwarded.
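For anyone patching the model code locally, the usual pattern for this kind of clash is to stop forwarding the duplicate keyword, either by deleting the explicit argument or by popping it out of the kwargs dict before the call. A rough sketch of the idea (illustrative class names only, not the actual sat source):

class BaseLikeTransformer:
    def __init__(self, num_layers, parallel_output=True, **kwargs):
        # stand-in for BaseTransformer.__init__
        self.num_layers = num_layers
        self.parallel_output = parallel_output

class VisualModel(BaseLikeTransformer):
    def __init__(self, **kwargs):
        # if parallel_output may also arrive inside **kwargs, drop one copy
        # before passing it explicitly, so it is not supplied twice
        kwargs.pop("parallel_output", None)
        super().__init__(num_layers=28, parallel_output=False, **kwargs)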
