Custom Endpoint for OpenAI #2145

Closed · victor-v-rad opened this issue Jul 23, 2023 · 9 comments

victor-v-rad commented Jul 23, 2023

Companies create internal API proxies to avoid direct calls to https://api.openai.com/. The Semantic Kernel does not provide the possibility to specify a custom endpoint. Please see the picture.
[image: screenshot attached to the original issue]

Please create this possibility.

victor-v-rad (Author) commented

Created a corresponding issue for the Azure SDK:
Azure/azure-sdk-for-net#37794

gingters commented

Just to second that: if you want to use Semantic Kernel to enrich your Blazor WebAssembly application with AI capabilities, you certainly don't want to ship your API key to the browser ;)

It's easy to add a Yarp.ReverseProxy to your server that authenticates your users with your normal auth mode (cookies, tokens, etc.), forwards all calls to /openai to your Azure OpenAI endpoints, and adds the secret API key to those requests on the server side. But since you can't set the base URL for the pure OpenAI endpoint, it's sadly not possible to use it from your web application at the moment.
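
For reference, a minimal sketch of what that YARP setup could look like (a sketch under assumptions: the "ReverseProxy" config section, the AzureOpenAI:ApiKey key, and the route layout are illustrative, not from this issue):

using Yarp.ReverseProxy.Transforms;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddReverseProxy()
    // Assumes a "ReverseProxy" section in appsettings.json that routes
    // /openai/{**catch-all} to your Azure OpenAI cluster.
    .LoadFromConfig(builder.Configuration.GetSection("ReverseProxy"))
    // Attach the secret key server-side so it never reaches the browser.
    .AddTransforms(ctx =>
        ctx.AddRequestHeader("api-key", builder.Configuration["AzureOpenAI:ApiKey"]!, append: false));

var app = builder.Build();

// Your usual authentication/authorization middleware goes here.
app.MapReverseProxy();
app.Run();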

JadynWong (Contributor) commented Jul 25, 2023

Temporary solution (see #1445 (comment)):

private class ProxyOpenAIHandler : HttpClientHandler
{
    protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
    {
        if (request.RequestUri != null && request.RequestUri.Host.Equals("api.openai.com", StringComparison.OrdinalIgnoreCase))
        {
            // your proxy url (PathAndQuery already includes the leading slash)
            request.RequestUri = new Uri($"https://openai.yourproxy.com{request.RequestUri.PathAndQuery}");
        }
        return base.SendAsync(request, cancellationToken);
    }
}


var kernel = Kernel.Builder
        .WithOpenAIChatCompletionService("<model-id>", "<api-key>", httpClient: new HttpClient(new ProxyOpenAIHandler()))
        .Build();

craigomatic (Contributor) commented

(Quoting @JadynWong's ProxyOpenAIHandler workaround above.)

This is the solution I would also recommend.

Does this unblock both of your scenarios, @victor-v-rad and @gingters?

For Chat Copilot (formerly Copilot Chat; it now lives here) you'd need to either modify the deployment scripts to pass your proxy through as a config value that your code picks up, or leave them as-is and hardcode your proxy. In both cases the code @JadynWong describes above should enable this.
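
A rough sketch of the config-value variant (the AIService:ProxyUrl key and the class name below are hypothetical, not part of Chat Copilot):

using System;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Configuration;

internal sealed class ConfigurableProxyOpenAIHandler : HttpClientHandler
{
    private readonly string? _proxyBaseUrl;

    public ConfigurableProxyOpenAIHandler(IConfiguration configuration)
    {
        // Hypothetical key; when unset, requests go to api.openai.com unchanged.
        _proxyBaseUrl = configuration["AIService:ProxyUrl"];
    }

    protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
    {
        if (!string.IsNullOrEmpty(_proxyBaseUrl)
            && request.RequestUri != null
            && request.RequestUri.Host.Equals("api.openai.com", StringComparison.OrdinalIgnoreCase))
        {
            // PathAndQuery starts with '/', so the configured base URL should not end with one.
            request.RequestUri = new Uri($"{_proxyBaseUrl}{request.RequestUri.PathAndQuery}");
        }
        return base.SendAsync(request, cancellationToken);
    }
}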

victor-v-rad (Author) commented Jul 26, 2023

Thank you, @JadynWong, for sharing the code snippets. That unblocks me. There is also a standard proxy configuration in the Azure SDK:
https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/core/Azure.Core/samples/Configuration.md#configuring-a-proxy
It might be good to mention this in the README.md or here in appsettings.json > AIService.
https://github.com/microsoft/chat-copilot/blob/main/webapi/appsettings.json#L22-L48
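
For anyone going that route, a minimal sketch of that standard Azure SDK proxy configuration, assuming the Azure.AI.OpenAI client is used directly (the URLs below are placeholders):

using System;
using System.Net;
using System.Net.Http;
using Azure;
using Azure.AI.OpenAI;
using Azure.Core.Pipeline;

// Route the client's traffic through a proxy via the standard Azure.Core transport option.
var options = new OpenAIClientOptions
{
    Transport = new HttpClientTransport(new HttpClient(new HttpClientHandler
    {
        Proxy = new WebProxy("https://openai.yourproxy.com") // placeholder proxy URL
    }))
};

var client = new OpenAIClient(
    new Uri("https://<your-resource>.openai.azure.com"),
    new AzureKeyCredential("<api-key>"),
    options);

The linked sample also covers the standard HTTP_PROXY/HTTPS_PROXY environment variables, which Azure.Core honors without any code changes.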

@craigomatic, thank you for sharing the new Chat Copilot repo. It's worth celebrating!
@JadynWong, are you OK if I close the issue?

JadynWong (Contributor) commented Jul 26, 2023

Of course, if that solves your problem.

the-xentropy commented

For anyone searching for a quick and easy way to do this in Python:

import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

kernel = sk.Kernel()  # the kernel the service gets registered on

service = OpenAIChatCompletion(ai_model_id="<MODEL NAME>", api_key="<API KEY>")
service.client.base_url = "<API URL>"  # point the underlying AsyncOpenAI client at your endpoint
kernel.add_chat_service(
    "chat_completion",
    service,
)

The underlying AsyncOpenAI client the semantic-kernel wrapper uses has the option to provide the base url in the initializer, but semantic-kernel doesn't expose it as a keyword parameter in the wrapper init function. You can still set it post-initialization by accessing the wrapper's underlying OpenAI client though.

I think it would be preferable for this keyword argument to simply be exposed in the wrapper's init function, since reaching into the client is prone to breakage if e.g. internal variable names change, but it works and is a 30-second one-line fix, so it's good enough for me for now.

Prakashssv-India commented

(Quoting @the-xentropy's Python snippet above.)

kernel.add_chat_service(
    "chat_completion",
    service,
)

Where is the kernel initialized? Do I need to import any library?

dinesh-coupa commented

Can anyone help with Python code for a custom OpenAI endpoint that uses bearer-token auth instead of an api_key?
