[Bug]: ollama modelfile "MESSAGE" is not loaded for librechat but in ollama #3561
Closed
Labels: bug
What happened?
I am encountering different responses in LibreChat and Ollama.
If I use `ollama run DannyGPT` I get the correct answer, but if I select DannyGPT in LibreChat I don't get a similar answer; I get a totally different one. It seems LibreChat is cutting off all the MESSAGE entries of the Modelfile from Ollama, while Ollama uses them to enhance the knowledge of the loaded model.
(I changed the instructions a bit to comply with the privacy policy of the government I am working for.)
I have configured Ollama in the librechat.yaml file like this:
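The reporter's actual configuration was not captured in this report. As a rough sketch only, a custom Ollama endpoint in librechat.yaml typically looks something like the following; the `name`, `baseURL`, and model list here are assumptions, not the reporter's values:

```yaml
# Hypothetical sketch of an Ollama custom endpoint in librechat.yaml;
# the reporter's real config was not included in the issue.
version: 1.1.4
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"                      # Ollama ignores the key, but the field is required
      baseURL: "http://localhost:11434/v1/" # adjust host/port to your Ollama server
      models:
        default:
          - "DannyGPT"
        fetch: true                         # also list models reported by the server
```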
I have configured the personalized model like this:
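The personalized Modelfile itself was redacted for privacy, so the block below is only an illustrative sketch of the Ollama Modelfile syntax in question; the base model and the MESSAGE contents are placeholders, not the reporter's data:

```
# Hypothetical Modelfile sketch; MESSAGE pairs seed a prior conversation
# that the model can draw on when answering.
FROM llama3
SYSTEM "You are DannyGPT, an assistant for internal questions."
MESSAGE user What are the opening hours of the citizens' office?
MESSAGE assistant The citizens' office is open Monday to Friday, 8:00 to 16:00.
```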
When I load the model in Ollama, I see all the MESSAGE prompts during the loading process. After loading finishes, it is possible to interact with the model about the information stored in the MESSAGE entries.
I think the Modelfile emulates a previous conversation; with this configuration it is possible to access that context in Ollama, but LibreChat somehow suppresses this behavior.
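One way to picture the suspected difference: Ollama's MESSAGE directives amount to a seeded message history, so a chat request that honors them would effectively carry that history before the user's new turn. If a client builds the request only from its own chat state, that seeded context is absent. The JSON below is a speculative sketch of the request shape, not an observed payload from either application:

```json
{
  "model": "DannyGPT",
  "messages": [
    {"role": "user", "content": "seeded MESSAGE user entry from the Modelfile"},
    {"role": "assistant", "content": "seeded MESSAGE assistant entry"},
    {"role": "user", "content": "the question actually typed in the chat"}
  ]
}
```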
When I access the model via LibreChat, only the SYSTEM information is available, but not the information in MESSAGE.
Steps to Reproduce
4. Ask the following sentence in Ollama; you will get a correct answer:
What browsers are you seeing the problem on?
No response
Relevant log output
No response
Screenshots
No response
Code of Conduct