
Tools support in ChatCompletion endpoints #2417

Closed · pamelafox opened this issue Feb 8, 2024 · 10 comments
Labels: compatibility, feature request

Comments

@pamelafox (Contributor)

We'd love tools support so we can use Ollama with our existing apps that use the OpenAI API. Not sure if that's possible across the board with all models.

@carsonkahn-external

+1

@jmorganca added the "feature request" label on Feb 9, 2024
@Tanmaypatil123

Does that mean we should implement something similar to OpenAI's Assistants API, with code interpreter, retrieval, etc.?

@pamelafox (Contributor, Author)

My request is only that the chat completion endpoint support the tools parameter (and related tool_choice):
https://platform.openai.com/docs/api-reference/chat#chat/create-functions

I do see that assistants also support tools, but that would be a much bigger feature.
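
For concreteness, here's a rough sketch of the kind of call we'd like to work, pointing the standard OpenAI Python client at Ollama's OpenAI-compatible endpoint (the base URL, model name, and weather tool below are just illustrative placeholders):

```python
from openai import OpenAI

# Sketch only: Ollama's OpenAI-compatible endpoint; the api_key value is a placeholder
# (it isn't used for authentication, but the client requires something).
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

# Illustrative tool definition in the OpenAI function-calling schema.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="mistral",  # placeholder model name
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
    tool_choice="auto",
)
print(response.choices[0].message.tool_calls)
```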

@maxbaines

> My request is only that the chat completion endpoint support the tools parameter (and related tool_choice): https://platform.openai.com/docs/api-reference/chat#chat/create-functions
>
> I do see that assistants also support tools, but that would be a much bigger feature.

+1. Updated link: https://platform.openai.com/docs/api-reference/chat/create#chat-create-tools

@rh-id commented Mar 7, 2024

+1 Mistral model

https://docs.mistral.ai/guides/function-calling/

@grigio commented Mar 7, 2024

> https://docs.mistral.ai/guides/function-calling/

Is the tools argument defined in the same way in Mistral and OpenAI?

# Mistral
response = client.chat(model=model, messages=messages, tools=tools, tool_choice="auto")

# OpenAI
response = client.chat.completions.create(
    model=model,
    messages=messages,
    tools=tools,
    tool_choice=tool_choice,
)
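
For what it's worth, both docs pages appear to use the same OpenAI-style function schema for each entry in tools, roughly like this (the function here is just an example):

```python
# Sketch of the per-tool schema both sets of docs appear to share (illustrative function).
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_stock_price",
            "description": "Return the latest price for a ticker symbol",
            "parameters": {
                "type": "object",
                "properties": {
                    "symbol": {"type": "string", "description": "Ticker symbol, e.g. AAPL"},
                },
                "required": ["symbol"],
            },
        },
    }
]
```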

@bmizerany self-assigned this on Mar 11, 2024
@Benjoyo commented Mar 14, 2024

New function-calling model:

https://huggingface.co/NousResearch/Hermes-2-Pro-Mistral-7B

I suggest looking at the llama-cpp-python implementation for Functionary (a chat handler).
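
For reference, a rough sketch of how llama-cpp-python exposes this through its high-level API when a function-calling chat format/handler is configured (the model path, chat_format, and tool below are placeholders; the right chat format or handler depends on the model):

```python
from llama_cpp import Llama

# Sketch: llama-cpp-python's high-level API accepts OpenAI-style tools / tool_choice
# when a function-calling chat format (e.g. its generic "chatml-function-calling") is used.
llm = Llama(
    model_path="./Hermes-2-Pro-Mistral-7B.Q4_K_M.gguf",  # placeholder path
    chat_format="chatml-function-calling",
)

response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "What's the weather in Berlin?"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_current_weather",  # illustrative tool
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
    tool_choice="auto",
)
print(response["choices"][0]["message"].get("tool_calls"))
```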

@humcqc commented May 29, 2024

+1

@jboz commented Jun 13, 2024

Any updates? A must-have feature IMO.

@jmorganca (Member)

Thanks for the issue! Merging with #4386!
