.Net: Test local model with Semantic Kernel (i.e., Llama via Ollama) #3990
Comments
Doesn't need to test function calling.
I tested our IChatCompletionService with Ollama using Mistral. My conclusion is that our abstractions are good enough to allow this to work, but it does not currently work due to some implementation details in the Azure OpenAI SDK.
Once these issues are fixed in the Azure OpenAI SDK, our IChatCompletionService will support Ollama.
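To make the comment above concrete: Ollama exposes an OpenAI-style chat completions endpoint, so any client built against the IChatCompletionService abstraction ultimately needs to emit a request of roughly the following shape. This is a minimal Python sketch; the endpoint URL (default local Ollama port) and the model name are assumptions for illustration, not details confirmed in this thread.

```python
import json

# Assumed default local Ollama endpoint (OpenAI-compatible route);
# adjust host/port to match your install.
OLLAMA_OPENAI_ENDPOINT = "http://localhost:11434/v1/chat/completions"

def build_chat_request(model: str, user_message: str) -> str:
    """Serialize a minimal OpenAI-style chat completion request body.

    This is the payload a chat-completion client would POST to the
    endpoint above; "mistral" is a hypothetical locally pulled model.
    """
    payload = {
        "model": model,
        "messages": [
            {"role": "user", "content": user_message},
        ],
        "stream": False,
    }
    return json.dumps(payload)

body = build_chat_request("mistral", "Why is the sky blue?")
print(body)
```

If the SDK-side issues mentioned above are fixed, the same request shape should flow through IChatCompletionService unchanged, with only the base endpoint redirected from Azure OpenAI to the local Ollama server.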
Are there issues open on that for Azure.AI.OpenAI? Is anyone working on it? Timeframe?
Hello, any update on this?
Deploy a Llama model locally with Ollama and validate that it works with Semantic Kernel and the existing IChatCompletionService interface.