
[Bug]: Ollama Modelfile "MESSAGE" is not loaded by LibreChat, but works in Ollama #3561

Closed

MatthiasBergner opened this issue Aug 6, 2024 · 3 comments

Labels
bug Something isn't working

@MatthiasBergner

What happened?

I am encountering different response behavior between LibreChat and Ollama.
If I use "ollama run DannyGPT" I get the correct answer, but if I select DannyGPT in LibreChat I don't get the same answer as in Ollama, but a totally different one. It seems LibreChat cuts off all the "MESSAGE" entries from the Ollama Modelfile, while Ollama uses them to enhance the loaded model's knowledge.

(I changed the instructions a bit to comply with the privacy policy of the government agency I work for.)
I have configured Ollama in the librechat.yaml file like this:

# Definition of custom endpoints
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"
      baseURL: "http://host.docker.internal:11434/v1/"
      models:
        default: [
          "DannyGPT",
        ]
        fetch: false
      titleConvo: true
      titleModel: "current_model"
      summarize: true
      summaryModel: "current_model"
      forcePrompt: true
      modelDisplayLabel: "Ollama"
      userIdQuery: true
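
A quick way to confirm this baseURL serves the model is to list models through the same OpenAI-compatible endpoint (a sanity check; from the host, Ollama's default address is localhost:11434, while the host.docker.internal URL above applies from inside the LibreChat container):

curl http://localhost:11434/v1/models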

I have configured the personalized model like this:

FROM llama3.1:70b

PARAMETER temperature 0.5

PARAMETER num_ctx 32768
PARAMETER repeat_last_n -1

# =================================== Instructions to the Chatbot ===================================
SYSTEM This chatbot is named DannyGPT and is a helpful chatbot.

MESSAGE system Remember this information:
MESSAGE assistant The name of this chatbot is DannyGPT Version llama3.1:70b. Matthias Bergner made the modification to the model and Danny Avila did an awesome job developing LibreChat.

# =================================== Tailored Information ===================================
MESSAGE system Remember this information about the topic Librechat:
MESSAGE user Who developed LibreChat?
MESSAGE assistant Danny Avila did an awesome job developing LibreChat.

When I load the model in Ollama, I see all the MESSAGE prompts during the loading process. After loading has finished, it is possible to ask the model about the information stored in the MESSAGE entries.

I think the Modelfile emulates a previous conversation, and with this configuration that conversation is accessible through Ollama, but LibreChat somehow suppresses this behavior.

When I access the model via LibreChat, only the SYSTEM information is available, but not the information in the MESSAGE entries.
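
If that theory is right, the behavior should be reproducible outside LibreChat by replaying part of the MESSAGE history explicitly against the same OpenAI-compatible endpoint LibreChat talks to (a sketch, assuming Ollama on localhost:11434; the messages mirror the "Tailored Information" MESSAGE lines in the Modelfile above):

curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "DannyGPT",
    "messages": [
      {"role": "system", "content": "Remember this information about the topic Librechat:"},
      {"role": "user", "content": "Who developed LibreChat?"},
      {"role": "assistant", "content": "Danny Avila did an awesome job developing LibreChat."},
      {"role": "user", "content": "Who developed LibreChat?"}
    ]
  }'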

Steps to Reproduce

  1. Create the Modelfile with the content I provided
  2. Run:
ollama create DannyGPT -f Modelfile
  3. Run:
ollama run DannyGPT
  4. Ask the following question in Ollama; you will get a correct answer:
Who developed LibreChat?
  5. Take the configuration above for the librechat.yaml file and run LibreChat
  6. Select "DannyGPT"
  7. Ask "Who developed LibreChat?" again in LibreChat (incorrect answer)
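
As a sanity check after step 2, Ollama can print the Modelfile stored with the created model, which should still contain the MESSAGE lines (assuming your Ollama version supports the --modelfile flag of ollama show):

ollama show DannyGPT --modelfile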

What browsers are you seeing the problem on?

No response

Relevant log output

No response

Screenshots

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct
MatthiasBergner added the bug (Something isn't working) label on Aug 6, 2024
@danny-avila
Owner

Interesting, thank you for providing an easy reproduction; I will for sure test this.

@MatthiasBergner
Author

It seems that it is working right now! Awesome and thank you very much! :) Did you change something, or was it some side-effect of something else? Best greetings :)

@danny-avila
Owner

> It seems that it is working right now! Awesome and thank you very much! :) Did you change something, or was it some side-effect of something else? Best greetings :)

I changed it so that the system no longer sends default model params; maybe those clashed with your Modelfile when temperature or some other param was sent.
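
For reference, the difference looks roughly like this (a sketch with a hypothetical default of 1.0; as far as I know, options sent with the request take precedence over Modelfile PARAMETER defaults in Ollama):

# Before: a default temperature was sent along, overriding the Modelfile's 0.5
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "DannyGPT", "temperature": 1.0, "messages": [{"role": "user", "content": "Hi"}]}'

# After: no temperature in the request, so PARAMETER temperature 0.5 from the Modelfile applies
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "DannyGPT", "messages": [{"role": "user", "content": "Hi"}]}'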

I must admit that it is a side effect; the changes that might've affected this are: #3663 and #3682

Glad to hear it's working now!
