Merge adapter not working when trying to merge the fine-tuned model #777
Replies: 6 comments 7 replies
-
Hi. I've been answering your concerns and others' consistently, but if you decide not to look at the docs and just bash, that's your choice. Now, for the final time, read what I'm writing here:
In the end, it's your choice what you want to do and how you want to do it. transformers supports adapter-only models. autotrain is based on transformers and will only support transformers. It will NEVER convert your trained model to GGUF. If you want to convert your trained models to GGUF, use this space: https://huggingface.co/spaces/ggml-org/gguf-my-repo
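For context on what "merging an adapter" actually does: a LoRA adapter stores two small matrices A and B, and merging folds scaling · (B @ A) into the frozen base weight W, so no separate adapter files are needed at inference. A minimal numerical sketch (toy shapes and values, not real model weights):

```python
# Toy demonstration: a merged weight gives the same output as
# base-plus-adapter applied at runtime. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(0)
out_f, in_f, r, alpha = 4, 3, 2, 4
W = rng.normal(size=(out_f, in_f))   # frozen base weight
A = rng.normal(size=(r, in_f))       # LoRA down-projection
B = rng.normal(size=(out_f, r))      # LoRA up-projection
scaling = alpha / r

# The merge: collapse the adapter into one dense weight.
W_merged = W + scaling * (B @ A)

x = rng.normal(size=(in_f,))
y_adapter = W @ x + scaling * (B @ (A @ x))  # adapter kept separate
y_merged = W_merged @ x                      # merged model
assert np.allclose(y_adapter, y_merged)      # identical outputs
```

This is why an adapter-only checkpoint is enough for transformers/peft, but a GGUF converter needs the merged dense weights.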
-
First of all, you didn't answer my earlier question about the solutions. Second, where in the docs can I find all the commands and how to set them up? I have to watch other people's tutorials to understand how to use it. Also, I'm not using a Mac; I'm using WSL on Windows. This is what I found on YouTube that did the trick, of course without knowing about the merge-adapters thing you are mentioning here now: autotrain llm So, my question is: where does the merge-adapters option go here? Adding --merge_adapters True?
-
Are there any docs on how to do it manually after training? If I'm right, I had to set peft to false and then it would do the trick. So, to make it clear: the same command I provided before, just with peft set to false: autotrain llm Is this correct? And again, if the model was already trained without setting it to False, so there is no config.json file, how do I do the merge manually? Appreciate it.
-
Is it not possible to use the space to merge? It always gives errors. And even trying it on my computer (4090 GPU), it isn't enough. Is it possible to make it work properly on the space? Appreciate it.
-
Hi @abhishekkrthakur, hope you're well. Did you make the custom merge space? Appreciate it.
-
Never mind, I've done it myself.
-
Writing here because the questions from February asking about this same error were never answered.
I also came to this space from autotrain, since it is not possible to convert to .gguf format once fine-tuned, and as the autotrain admin says, you have to use the merge-adapter space on Hugging Face: duplicate it and run it on a GPU.
Even following the short description, I just got an error with no further explanation, and I have seen others encountering the same problem with no response on how to solve it.
And it is crucial, as the fine-tuned model needs to be converted into .gguf. Quite a common need.
Is anyone else facing the same issue?