This repository contains a Delphi implementation of the OpenAI public API.
❗This is an unofficial library. OpenAI does not provide any official library for Delphi.
Coverage
API | Status |
---|---|
Models | 🟢 Done |
Completions (Legacy) | 🟢 Done |
Chat | 🟢 Done |
Chat Vision | 🟢 Done |
Edits | 🟢 Done |
Images | 🟢 Done |
Embeddings | 🟢 Done |
Audio | 🟢 Done |
Files | 🟢 Done |
Fine-tunes (Deprecated) | 🟢 Done |
Fine-tuning | 🟢 Done |
Moderations | 🟢 Done |
Engines (Deprecated) | 🟢 Done |
Assistants | 🟠 In progress |
Threads | 🟠 In progress |
Messages | 🟠 In progress |
Runs | 🟠 In progress |
OpenAI is a non-profit artificial intelligence research organization founded in San Francisco, California, in 2015. It was created to advance digital intelligence in ways that benefit humanity as a whole and promote societal progress. The organization strives to develop AI programs and systems that can think, act and adapt quickly on their own, autonomously. OpenAI's mission is to ensure the safe and responsible use of AI for civic good, economic growth and other public benefits; this includes cutting-edge research into important topics such as general AI safety, natural language processing, applied reinforcement learning methods, machine vision algorithms and more.
The OpenAI API can be applied to virtually any task that involves understanding or generating natural language or code. We offer a spectrum of models with different levels of power suitable for different tasks, as well as the ability to fine-tune your own custom models. These models can be used for everything from content generation to semantic search and classification.
This library provides access to the OpenAI API, the service behind ChatGPT and, for example, text-to-image generation with DALL-E.
You can install the package from GetIt directly in the IDE, or just add the library's root folder to the IDE library path or to your project source path.
To initialize an API instance, you need to obtain an API token from your OpenAI organization.
Once you have a token, you can initialize the TOpenAI class, which is the entry point to the API.
Because a request can have many parameters and not all of them are required, they are configured using an anonymous method.
uses OpenAI;
var OpenAI := TOpenAIComponent.Create(Self, API_TOKEN);
or
uses OpenAI;
var OpenAI: IOpenAI := TOpenAI.Create(API_TOKEN);
Once you have the token and the instance is initialized, you are ready to make requests.
List and describe the various models available in the API. You can refer to the Models documentation to understand what models are available and the differences between them.
var Models := OpenAI.Model.List();
try
  for var Model in Models.Data do
    MemoChat.Lines.Add(Model.Id);
finally
  Models.Free;
end;
Review Models Documentation for more info.
Given a prompt, the model will return one or more predicted completions, and can also return the probabilities of alternative tokens at each position.
var Completions := OpenAI.Completion.Create(
  procedure(Params: TCompletionParams)
  begin
    Params.Prompt(MemoPrompt.Text);
    Params.MaxTokens(2048);
  end);
try
  for var Choice in Completions.Choices do
    MemoChat.Lines.Add(Choice.Index.ToString + ' ' + Choice.Text);
finally
  Completions.Free;
end;
Review Completions Documentation for more info.
Given a chat conversation, the model will return a chat completion response. ChatGPT is powered by gpt-3.5-turbo, OpenAI’s most advanced language model.
Using the OpenAI API, you can build your own applications with gpt-3.5-turbo to do things like:
- Draft an email or other piece of writing
- Write Python code
- Answer questions about a set of documents
- Create conversational agents
- Give your software a natural language interface
- Tutor in a range of subjects
- Translate languages
- Simulate characters for video games and much more
This guide explains how to make an API call for chat-based language models and shares tips for getting good results.
var Chat := OpenAI.Chat.Create(
  procedure(Params: TChatParams)
  begin
    Params.Messages([TChatMessageBuild.Create(TMessageRole.User, Text)]);
    Params.MaxTokens(1024);
  end);
try
  for var Choice in Chat.Choices do
    MemoChat.Lines.Add(Choice.Message.Content);
finally
  Chat.Free;
end;
OpenAI.Chat.CreateStream(
  procedure(Params: TChatParams)
  begin
    Params.Messages([TChatMessageBuild.User(Buf.Text)]);
    Params.MaxTokens(1024);
    Params.Stream;
  end,
  procedure(Chat: TChat; IsDone: Boolean; var Cancel: Boolean)
  begin
    if (not IsDone) and Assigned(Chat) then
      Writeln(Chat.Choices[0].Delta.Content)
    else if IsDone then
      Writeln('DONE!');
    Writeln('-------');
    Sleep(100);
  end);
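For Chat Vision, send the user message as an array of content parts that combines text with a base64-encoded image, and select a vision-capable model: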
var Chat := OpenAI.Chat.Create(
  procedure(Params: TChatParams)
  begin
    Params.Model('gpt-4-vision-preview');
    var Content: TArray<TMessageContent>;
    Content := Content + [TMessageContent.CreateText(Text)];
    Content := Content + [TMessageContent.CreateImage(FileToBase64('file path'))];
    Params.Messages([TChatMessageBuild.User(Content)]);
    Params.MaxTokens(1024);
  end);
try
  for var Choice in Chat.Choices do
    MemoChat.Lines.Add(Choice.Message.Content);
finally
  Chat.Free;
end;
Review Chat Documentation for more info.
Given a prompt and/or an input image, the model will generate a new image.
var Images := OpenAI.Image.Create(
  procedure(Params: TImageCreateParams)
  begin
    Params.Prompt(MemoPrompt.Text);
    Params.ResponseFormat('url');
  end);
try
  for var Image in Images.Data do
    Image1.Bitmap.LoadFromUrl(Image.Url);
finally
  Images.Free;
end;
Review Images Documentation for more info.
In an API call, you can describe functions to gpt-3.5-turbo-0613 and gpt-4-0613, and have the model intelligently choose to output a JSON object containing arguments to call those functions. The Chat Completions API does not call the function; instead, the model generates JSON that you can use to call the function in your code.
The latest models (gpt-3.5-turbo-0613 and gpt-4-0613) have been fine-tuned to both detect when a function should be called (depending on the input) and to respond with JSON that adheres to the function signature. This capability also comes with potential risks. We strongly recommend building in user confirmation flows before taking actions that impact the world on behalf of users (sending an email, posting something online, making a purchase, etc.).
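The examples below pass a Funcs list of type TArray<IChatFunction> to the request. As an illustrative sketch only (not verbatim library code), a function definition could look roughly like the following; it assumes a TChatFunction base class in an OpenAI.Chat.Functions unit with an overridable name, description and JSON-schema parameters, so verify the exact unit and member names in the library sources before using it.

uses OpenAI.Chat.Functions; // assumed unit name - verify against the library sources

type
  // Hypothetical example function: the class name, base class and member names
  // below are assumptions for illustration, not verified library API.
  TWeatherFunction = class(TChatFunction)
  protected
    function GetName: string; override;
    function GetDescription: string; override;
    function GetParameters: string; override; // JSON Schema of the expected arguments
  end;

function TWeatherFunction.GetName: string;
begin
  Result := 'get_current_weather';
end;

function TWeatherFunction.GetDescription: string;
begin
  Result := 'Get the current weather in a given location';
end;

function TWeatherFunction.GetParameters: string;
begin
  Result :=
    '{"type": "object",' +
    ' "properties": {"location": {"type": "string", "description": "City and country"}},' +
    ' "required": ["location"]}';
end;

...

// Build the list that is passed to Params.Functions in the examples below
var Funcs: TArray<IChatFunction> := [TWeatherFunction.Create];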
var Chat := OpenAI.Chat.Create(
  procedure(Params: TChatParams)
  begin
    Params.Functions(Funcs); //list of functions (TArray<IChatFunction>)
    Params.FunctionCall(TFunctionCall.Auto);
    Params.Messages([TChatMessageBuild.User(Text)]);
    Params.MaxTokens(1024);
  end);
try
  for var Choice in Chat.Choices do
    if Choice.FinishReason = TFinishReason.FunctionCall then
      ProcFunction(Choice.Message.FunctionCall) // execute function (send result to chat, and continue)
    else
      MemoChat.Lines.Add(Choice.Message.Content);
finally
  Chat.Free;
end;
...
procedure ProcFunction(Func: TChatFunctionCall);
begin
  var FuncResult := Execute(Func.Name, Func.Arguments); //execute function and get result (json)
  var Chat := OpenAI.Chat.Create(
    procedure(Params: TChatParams)
    begin
      Params.Functions(Funcs); //list of functions (TArray<IChatFunction>)
      Params.FunctionCall(TFunctionCall.Auto);
      Params.Messages([ //need all history
        TChatMessageBuild.User(Text),
        TChatMessageBuild.NewAsistantFunc(Func.Name, Func.Arguments),
        TChatMessageBuild.Func(FuncResult, Func.Name)]);
      Params.MaxTokens(1024);
    end);
  try
    for var Choice in Chat.Choices do
      MemoChat.Lines.Add(Choice.Message.Content);
  finally
    Chat.Free;
  end;
end;
Review Functions Documentation for more info.
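Failed requests raise typed exceptions, so specific error classes can be handled separately from the base OpenAIException: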
try
  var Images := OpenAI.Image.Create(...);
except
  on E: OpenAIExceptionRateLimitError do
    ShowError('OpenAI Limit Error: ' + E.Message);
  on E: OpenAIException do
    ShowError('OpenAI Error: ' + E.Message);
end;
- OpenAIExceptionAPI - errors of the wrapper
- OpenAIException - base exception
- OpenAIExceptionInvalidRequestError
- OpenAIExceptionRateLimitError
- OpenAIExceptionAuthenticationError
- OpenAIExceptionPermissionError
- OpenAIExceptionTryAgain
- OpenAIExceptionInvalidResponse - parse error
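To send requests through a proxy server, assign the proxy settings of the underlying HTTP client: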
OpenAI.API.Client.ProxySettings := TProxySettings.Create(ProxyHost, ProxyPort, ProxyUserName, ProxyPassword);
This library does not require any third-party libraries. It works with recent Delphi versions (10.3+). Although not fully tested, it should also work on all supported platforms (Windows, Linux, macOS, Android, iOS).
Since the library requires your secret API key, it is not recommended to use it in client applications, as your secret key will be exposed, unless you are sure about the security risks.
MIT License
Copyright (c) 2023 HemulGM
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.