Post Training Usage #606
Unanswered
tushar-31093 asked this question in Q&A
Replies: 1 comment
- Try this: https://huggingface.co/spaces/ggml-org/gguf-my-repo. For QA you can use the LLM SFT or ORPO tasks.
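For a local alternative to the gguf-my-repo Space, the usual route is llama.cpp's converter script plus its quantize tool. A minimal sketch, assuming llama.cpp has been cloned and built; all paths and the helper names are hypothetical:

```python
# Hedged sketch: build the llama.cpp command lines for converting a fine-tuned
# Hugging Face checkpoint to GGUF and then quantizing it. The helper functions
# are illustrative names of my own; the script and binary paths assume a local
# clone of https://github.com/ggml-org/llama.cpp built in the working directory.

def gguf_convert_cmd(model_dir: str, outfile: str, outtype: str = "f16") -> list[str]:
    """Command line for llama.cpp's HF-to-GGUF converter script."""
    return ["python", "llama.cpp/convert_hf_to_gguf.py", model_dir,
            "--outfile", outfile, "--outtype", outtype]

def gguf_quantize_cmd(f16_file: str, out_file: str, qtype: str = "Q4_K_M") -> list[str]:
    """Command line for llama.cpp's llama-quantize binary."""
    return ["llama.cpp/llama-quantize", f16_file, out_file, qtype]

# Once llama.cpp is available, run the two steps with subprocess, e.g.:
# subprocess.run(gguf_convert_cmd("./phi3-finetuned", "phi3-f16.gguf"), check=True)
# subprocess.run(gguf_quantize_cmd("phi3-f16.gguf", "phi3-q4_k_m.gguf"), check=True)
```

Q4_K_M is a common size/quality trade-off for on-device use, but other quant types are available via `llama-quantize --help`.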
-
Hi Team,
I have a quick question. I want to fine-tune Phi-3 Mini on our custom QA data and then port it to an Android variant. We haven't reached that stage yet, but my question concerns the point where fine-tuning is finished. After fine-tuning and pushing the model to the HF Hub, is there a way to automatically convert it to a quantized state or GGUF format? If so, could you recommend how? I'm a bit all over the place with the documentation.
Also, any heads-up on the app porting, in case you know which model format could fit this scope? We will try MediaPipe, as it seems the most explicit option, but how a fine-tuned model comes into the picture is still an open question.
Look forward to your guidance.
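On the MediaPipe side, the LLM Inference guide describes a Python converter that bundles supported checkpoints into a `.task` file for the Android LLM Inference API. A hedged sketch based on that guide; all paths are hypothetical, and the supported `model_type` values are limited (Phi-3 support should be verified against the current MediaPipe docs before relying on this):

```python
# Hedged sketch of MediaPipe's checkpoint conversion path (pip install mediapipe).
# Paths are hypothetical placeholders; model_type="PHI_2" is an assumption drawn
# from the MediaPipe LLM Inference guide — check whether your fine-tuned Phi-3
# variant is actually supported before using this.
from mediapipe.tasks.python.genai import converter

config = converter.ConversionConfig(
    input_ckpt="./phi-finetuned/",              # hypothetical fine-tuned checkpoint dir
    ckpt_format="safetensors",
    model_type="PHI_2",                         # verify support for your model
    backend="gpu",                              # or "cpu"
    output_dir="./intermediate/",
    combine_file_only=False,
    vocab_model_file="./phi-finetuned/",
    output_tflite_file="./phi-finetuned.task",  # bundle loaded by the Android LLM Inference API
)
converter.convert_checkpoint(config)
```

The resulting `.task` file is what the Android `LlmInference` task loads on-device.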