Improve Copilot + local AI setup instructions #7413
base: master
Conversation
Thank you, @azigler!
@@ -82,7 +82,7 @@ Configure a large language model (LLM) for your Copilot integration by going to

1. Deploy your model, for example, on `Ollama <https://ollama.com/>`_.
2. Select **OpenAI Compatible** in the **AI Service** dropdown.
3. Enter the URL to your AI service from your Mattermost deployment in the **API URL** field.
3. Enter the URL to your AI service from your Mattermost deployment in the **API URL** field. Be sure to include the port, and append `/v1` to the end of the URL. (e.g., `http://localhost:11434/v1` for Ollama)
Suggested change:
3. Enter the URL to your AI service from your Mattermost deployment in the **API URL** field. Be sure to include the port, and append `/v1` to the end of the URL. (e.g., `http://localhost:11434/v1` for Ollama)
3. Enter the URL to your AI service from your Mattermost deployment in the **API URL** field. Be sure to include the port, and append ``/v1`` to the end of the URL. (e.g., ``http://localhost:11434/v1`` for Ollama)
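The URL shape the instructions require can be sketched with a small helper. This is only an illustration of the expected format: the function name and defaults are hypothetical (Ollama serves on port 11434 by default), not part of the Mattermost plugin itself.

```python
def build_api_url(host: str = "localhost", port: int = 11434) -> str:
    """Build the API URL for an OpenAI-compatible service such as Ollama.

    The port must be explicit, and the trailing /v1 path is required so the
    plugin reaches the OpenAI-compatible endpoints (e.g. /v1/chat/completions).
    """
    return f"http://{host}:{port}/v1"

print(build_api_url())  # http://localhost:11434/v1
```

A remote deployment would pass its own host and port, e.g. `build_api_url("ollama.internal", 11434)`.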
Newest code from mattermost has been published to preview environment for Git SHA bfa8ac9
@crspeller Should we include a recommendation for disabling tools? Is that a broad recommendation? If so, should we revisit the option and its default setting? As a minor note, it's a bit confusing that the field is "disable" with a true/false toggle, which makes the radio toggle semantically opposite to the end result and might confuse users. What about renaming it "Enable Tools", with whatever default you think is most appropriate?
I noticed this comment from a user who had trouble configuring Copilot with Ollama, and realized that the documentation does not include the specific URL needed to reach the OpenAI-compatible API. This PR clarifies that instruction, in line with what's taught in the Academy course.