Running minilm model local only #90
-
I have the model data stored locally, and I noticed that infinity seems to load hub data multiple times.
So my thinking was that I need to set the offline environment variables, roughly as sketched below.
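A minimal sketch of what I mean, assuming the standard Hugging Face offline switches (the exact values I set may have differed):

```bash
# Assumed offline switches: keep huggingface_hub / transformers from
# reaching out to the hub at runtime.
export HF_HUB_OFFLINE=1
export TRANSFORMERS_OFFLINE=1
```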
Final docker command:
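Something along these lines (a sketch only: the image tag, model id, container-side cache path, port, and the --model-name-or-path flag are assumptions here, not an exact copy of my command):

```bash
# Sketch: run infinity with the local HF cache mounted into the container
# and the offline switches passed through. Cache path, image tag, model id,
# and CLI flags are assumptions and may differ from my actual setup.
docker run -p 7997:7997 \
  -v /path/to/local/hf-cache:/app/.cache/huggingface \
  -e HF_HUB_OFFLINE=1 \
  -e TRANSFORMERS_OFFLINE=1 \
  michaelfeil/infinity:latest \
  --model-name-or-path sentence-transformers/all-MiniLM-L6-v2 \
  --port 7997
```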
However, I'm presented with the following error:
For some reason it picks the path built at SentenceTransformer#L1062, where it joins two paths, resulting in a path that doesn't exist, after which it calls snapshot_download. What am I doing wrong? I should be able to run this backend in offline mode, right?
-
Hey @gerritvd, thanks for your question. For now I would recommend not using a local_path, as the support for this seems complicated with sentence-transformers 2.3.0. I would resort to one of the following (rough sketches below):
- Use 0.0.20 as the pip package or docker image: https://github.com/michaelfeil/infinity/releases/tag/0.0.20
- Or prime the cache by downloading to the default huggingface snapshot_download folder, without specifying an extra dir.
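Roughly, both options look like this (a sketch: the image tag, model id, and CLI flag are examples, not exact commands):

```bash
# Option 1 (sketch): pin infinity to the 0.0.20 release.
# The exact image tag naming is an assumption; check the release page above.
docker run -p 7997:7997 michaelfeil/infinity:0.0.20 \
  --model-name-or-path sentence-transformers/all-MiniLM-L6-v2

# Option 2 (sketch): prime the default huggingface cache ahead of time with
# snapshot_download, then start infinity without pointing it at a custom dir.
pip install huggingface_hub
python -c "from huggingface_hub import snapshot_download; snapshot_download('sentence-transformers/all-MiniLM-L6-v2')"
```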
-
Thanks for responding so quickly. v0.0.20 did indeed work. What changed between ST implementations?