Expected Behavior
An OpenAI Python client application calls a Spring AI application. The Spring AI server is configured so that function calling does not invoke a local (JVM) Java function, but instead calls back to the Python client application. Consider how this solution looks when there are advisors that would be invoked after the assistant message is returned and function calling is complete.
Current Behavior
Not sure if the current interfaces accommodate this use case.
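For reference, a rough sketch of what such a proxied round trip could look like on the Spring AI side, assuming the proxyToolCalls option described in the commit message below (taking a boolean) and Spring AI's AssistantMessage / ToolResponseMessage message types. The hand-off to the remote Python client, the weather question, and the JSON result are illustrative only, and a matching function is assumed to be registered with the model elsewhere.

```java
import java.util.List;

import org.springframework.ai.chat.messages.AssistantMessage;
import org.springframework.ai.chat.messages.Message;
import org.springframework.ai.chat.messages.ToolResponseMessage;
import org.springframework.ai.chat.messages.UserMessage;
import org.springframework.ai.chat.model.ChatModel;
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.openai.OpenAiChatOptions;

class ProxiedToolCallSketch {

    // chatModel is expected to be an injected OpenAiChatModel
    // (any model supporting proxyToolCalls would do).
    ChatResponse run(ChatModel chatModel) {

        var options = OpenAiChatOptions.builder()
                .withProxyToolCalls(true) // return tool calls to the caller instead of invoking a JVM function
                .build();

        Message question = new UserMessage("What is the weather in Paris?");
        ChatResponse response = chatModel.call(new Prompt(List.of(question), options));

        AssistantMessage assistantMessage = response.getResult().getOutput();
        if (assistantMessage.getToolCalls().isEmpty()) {
            return response; // no tool call was requested, nothing to proxy
        }

        // In the real flow the raw tool calls would be serialized and handed back to the
        // remote Python client here; it executes the function and returns a JSON result.
        AssistantMessage.ToolCall toolCall = assistantMessage.getToolCalls().get(0);
        String remoteResult = "{\"temperature\": 21}"; // placeholder for the client's response

        ToolResponseMessage toolResponse = new ToolResponseMessage(List.of(
                new ToolResponseMessage.ToolResponse(toolCall.id(), toolCall.name(), remoteResult)));

        // Resume the conversation with the remotely produced tool result; any advisors
        // configured around the chat call would then run after this second exchange.
        return chatModel.call(new Prompt(List.of(question, assistantMessage, toolResponse), options));
    }
}
```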
csterwa changed the title from "Fine-grained client control over function invocation" to "Spring AI should allow function calling to be proxied for a remote invocation by OpenAI clients" on Sep 16, 2024.
This commit introduces a new proxyToolCalls option for various chat
models in the Spring AI project. When enabled, it allows the client to
handle function calls externally instead of having them processed
internally by Spring AI.
The change affects multiple chat model implementations, including:
AnthropicChatModel
AzureOpenAiChatModel
MiniMaxChatModel
MistralAiChatModel
MoonshotChatModel
OllamaChatModel
OpenAiChatModel
VertexAiGeminiChatModel
ZhiPuAiChatModel
The proxyToolCalls option is added to the respective chat options
classes and integrated into the AbstractToolCallSupport class for
consistent handling across different implementations.
The proxyToolCalls option can be set either programmatically, via
the <ModelName>ChatOptions.builder().withProxyToolCalls() method,
or through the spring.ai.<model-name>.chat.options.proxy-tool-calls
application property.
Documentation for the new option is also updated in the relevant
Antora pages.
Resolves spring-projects#1367
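As a minimal illustration of the two configuration paths named in the commit message above (shown for the OpenAI model; the function name is a hypothetical bean registered in the application context, and other models follow the same <ModelName>ChatOptions / property pattern):

```java
import org.springframework.ai.openai.OpenAiChatOptions;

class ProxyToolCallsOptions {

    OpenAiChatOptions proxyingOptions() {
        // Per request, via the chat options builder:
        return OpenAiChatOptions.builder()
                .withFunction("currentWeatherFunction") // hypothetical function bean registered in the context
                .withProxyToolCalls(true)               // hand tool calls back to the caller instead of invoking them
                .build();
    }
}

// Alternatively, globally via the application property, e.g. for the OpenAI model:
// spring.ai.openai.chat.options.proxy-tool-calls=true
```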
Related Issues
#1177, #1032, #656, #652, #368