ollama not looking for model in remote (VS Code tunnel) #3552
Labels
- area:configuration (Relates to configuration options)
- ide:vscode (Relates specifically to VS Code extension)
- kind:bug (Indicates an unexpected problem or unintended behavior)
- needs-triage
Before submitting your bug report
Relevant environment info
Description
I am currently using a VS Code tunnel to access a remote cluster. However, the ollama tab in the Continue plugin (installed on the cluster) does not connect to the ollama model running on the cluster. When local ollama is on, it connects to my local ollama instead, which is undesirable due to the lack of GPU compute.

Message:

Port info:
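For context, a minimal sketch of the kind of `config.json` model entry I would expect to point Continue at the cluster-side ollama rather than the local one; the title, model name, and port below are placeholders for illustration, not taken from my actual setup:

```json
{
  "models": [
    {
      "title": "Cluster ollama (remote)",
      "provider": "ollama",
      "model": "llama3.1:70b",
      "apiBase": "http://localhost:11434"
    }
  ]
}
```

Since the extension is installed on the cluster, `localhost` in `apiBase` should resolve to the cluster-side ollama, assuming it is serving on its default port.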
To reproduce
No response
Log output
No response