
Running minilm model local only #90

Answered by michaelfeil
gerritvd asked this question in Q&A
Hey @gerritvd, thanks for your question. For now I would recommend not using a local_path, as support for it seems complicated with sentence-transformers 2.3.0.

I would resort to one of the following instead:

- Use version 0.0.20, either the pip package or the Docker image:
  https://github.com/michaelfeil/infinity/releases/tag/0.0.20
- Or prime the cache by downloading the model into the default Hugging Face snapshot_download folder, without specifying an extra directory.
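Priming the cache could look roughly like this. This is a sketch, assuming the `huggingface_hub` package is installed and that the model in question is `sentence-transformers/all-MiniLM-L6-v2` (the exact repo id is an assumption based on the question title, "minilm"):

```python
from huggingface_hub import snapshot_download

# Download the model into the default Hugging Face cache
# (typically ~/.cache/huggingface/hub) by NOT passing cache_dir
# or local_dir. infinity / sentence-transformers can then load
# it by model id without any local_path workaround.
# Repo id below is an assumption based on the question title.
path = snapshot_download(repo_id="sentence-transformers/all-MiniLM-L6-v2")
print(path)
```

After this runs once, starting the server with the same model id should find the files in the cache instead of re-downloading.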

Replies: 2 comments 5 replies

Answer selected by michaelfeil