How do I fix this error? #368
Comments
Update to the latest sat; I ran into this problem before as well.
Still doesn't work. Whether I install the latest release or install from source, I end up with the same error.
Could it be that a previously downloaded model cache is being loaded? After I updated, the problem went away.
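If a stale cached checkpoint is the culprit, deleting it forces a fresh download that matches the installed library. A minimal sketch, assuming the default sat cache directory is ~/.sat_models (overridable via the SAT_HOME environment variable) and a hypothetical checkpoint folder name; adjust both to your setup:

import os
import shutil

# Locate the sat model cache (path is an assumption; SAT_HOME may override it).
cache_dir = os.environ.get("SAT_HOME", os.path.expanduser("~/.sat_models"))

if os.path.isdir(cache_dir):
    print("Cached models:", os.listdir(cache_dir))
    # Remove only the VisualGLM checkpoint; "visualglm-6b" is a hypothetical folder name.
    target = os.path.join(cache_dir, "visualglm-6b")
    if os.path.isdir(target):
        shutil.rmtree(target)
        print("Removed", target)
else:
    print("No sat model cache found at", cache_dir)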
I ran into the same problem. Did you manage to solve it?
import os
import torch
import argparse
from SwissArmyTransformer.sat import mpu, get_args, get_tokenizer
from SwissArmyTransformer.sat.training.deepspeed_training import training_main
from model import VisualGLMModel
from SwissArmyTransformer.sat.model.finetune import PTuningV2Mixin
from SwissArmyTransformer.sat.model.finetune.lora2 import LoraMixin
Is this how you load the packages in the import section of finetune_visualglm.py? When I import them this way, I don't get the error.
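Before comparing import styles, it can help to confirm which SwissArmyTransformer release is actually installed in the environment. A minimal sketch using only the standard library; "SwissArmyTransformer" is assumed to be the PyPI distribution name:

from importlib.metadata import version, PackageNotFoundError

# Print the installed SwissArmyTransformer version, if any.
try:
    print("SwissArmyTransformer:", version("SwissArmyTransformer"))
except PackageNotFoundError:
    print("SwissArmyTransformer is not installed in this environment")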
I also get the error when I run it directly from the Python command line.
It looks like the maintainers no longer update the Hugging Face weights, only the SAT model. So after updating, the Hugging Face model probably no longer matches the library version.
After setting up the whole environment and running web_demo.py, I get this error: sat.model.transformer.BaseTransformer() got multiple values for keyword argument 'parallel_output'. How do I fix this?
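For reference, this kind of TypeError is a Python-level symptom rather than something specific to the model: it appears whenever a wrapper forwards arguments into a constructor and the same parameter also arrives by keyword, which is what a version mismatch between the VisualGLM code and the installed sat / Hugging Face weights can produce. A minimal sketch with hypothetical classes (not the actual sat source) showing the mechanism:

class BaseTransformer:
    # Stand-in for sat.model.transformer.BaseTransformer (hypothetical signature).
    def __init__(self, num_layers, parallel_output=True):
        self.num_layers = num_layers
        self.parallel_output = parallel_output

class VisualModel(BaseTransformer):
    # Wrapper that forwards *args/**kwargs and also sets parallel_output explicitly.
    def __init__(self, *args, **kwargs):
        super().__init__(*args, parallel_output=False, **kwargs)

# Works: parallel_output is supplied only once.
VisualModel(24)

# Fails: the caller's keyword collides with the wrapper's explicit keyword.
try:
    VisualModel(24, parallel_output=True)
except TypeError as exc:
    print(exc)  # ... got multiple values for keyword argument 'parallel_output'

In the real setup the fix reported above is to align versions (update sat and/or clear stale cached weights) so the wrapper and BaseTransformer signatures match again.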