This Cloudflare worker serves as a proxy to the OpenAI API. I created it to address two issues:
- Some countries and territories are still not supported by the OpenAI API.
- Many chatbot clients ask you to enter your API key, even though OpenAI discourages sharing it.

By routing requests through the Cloudflare worker, clients from any country can reach the OpenAI API, and the access key mechanism lets you keep the real API key private.
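The gist of the proxy can be sketched roughly like this. This is a minimal illustration, not the worker's actual source: it assumes the client sends its access key in the standard `Authorization: Bearer ...` header and that the secrets are bound as `OPENAI_API_KEY` and `ACCESS_KEYS`, the names used in the setup steps below.

```js
// Minimal sketch of the proxy idea (not the actual implementation):
// validate the client's access key, then forward the request to
// api.openai.com with the real API key swapped in.
export default {
  async fetch(request, env) {
    // The client sends its access key as "Authorization: Bearer sk-...".
    const auth = request.headers.get("Authorization") || "";
    const accessKey = auth.replace(/^Bearer\s+/i, "");

    // ACCESS_KEYS is a comma-separated list of allowed keys.
    const allowed = (env.ACCESS_KEYS || "").split(",").map((k) => k.trim());
    if (!allowed.includes(accessKey)) {
      return new Response("Unauthorized", { status: 401 });
    }

    // Point the request at the real OpenAI API, keeping the path as-is
    // (the real worker may adjust the path differently).
    const url = new URL(request.url);
    url.protocol = "https:";
    url.hostname = "api.openai.com";
    url.port = "";

    // Copy the request and substitute the real API key.
    const proxied = new Request(url.toString(), request);
    proxied.headers.set("Authorization", `Bearer ${env.OPENAI_API_KEY}`);
    return fetch(proxied);
  },
};
```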
To set up secrets for the Cloudflare worker, follow these steps:
- Run `wrangler secret put OPENAI_API_KEY` to set the OpenAI API key.
- Run `wrangler secret put ACCESS_KEYS` to set the access keys.
`ACCESS_KEYS` is a comma-separated list of access keys. Each access key is a string of the form `sk-{user_name}-{random_string}`, for example `sk-Tom-2Hf3aTUVlG`. You can run `node src/key.mjs user_name` to generate an access key:
```sh
❯ node src/key.mjs Tom
sk-Tom-wMtF9kkGDu
```
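The actual generator lives in `src/key.mjs`; purely as an illustration of the key format, such a generator could look something like the sketch below (the alphabet and length here are assumptions, not necessarily what the script uses).

```js
// Hypothetical sketch of a key generator producing "sk-{user_name}-{random_string}".
import { randomInt } from "node:crypto";

const ALPHABET =
  "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";

function generateKey(userName, length = 10) {
  let random = "";
  for (let i = 0; i < length; i++) {
    random += ALPHABET[randomInt(ALPHABET.length)];
  }
  return `sk-${userName}-${random}`;
}

// Example: node key.mjs Tom  ->  sk-Tom-xxxxxxxxxx
console.log(generateKey(process.argv[2] || "user"));
```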
Run `wrangler deploy` to deploy the worker to Cloudflare.
Now you can use the URL of this worker as the base URL of the OpenAI API, and the access key as the OpenAI API key. For example, with the pre-1.0 `openai` Python package:
```python
import openai

openai.api_key = "sk-Tom-wMtF9kkGDu"
openai.api_base = "https://openai-proxy.yourname-8235.workers.dev"

# create a chat completion
chat_completion = openai.ChatCompletion.create(
    model="gpt-3.5-turbo", messages=[{"role": "user", "content": "Hello world"}]
)

# print the chat completion
print(chat_completion.choices[0].message.content)
```