
Add local models to non-streaming accept list #14420

Merged · 2 commits · Nov 25, 2024

Conversation

MatthewKhouzam
Contributor

@MatthewKhouzam MatthewKhouzam commented Nov 8, 2024

What it does

Allows local model orchestrators such as GPT4All to serve as a back-end for Theia's AI features by making disableStreaming configurable for them.

Starts to address issues/14413

How to test

  • Configure a custom OpenAI model
  • Add disableStreaming: true
  • Observe (or debug) that the non-streaming API is used

Example OpenAI configuration (using the official OpenAI endpoint):

        {
            "model": "gpt-4o",
            "id": "my-custom-gpt-4o",
            "url": "https://api.openai.com/v1",
            "apiKey": true,
            "disableStreaming": true
        }
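A minimal sketch of how a consumer could honor this flag; the interface and helper names here are illustrative assumptions, not the actual Theia AI API:

```typescript
// Illustrative sketch only: interface and function names are assumptions,
// not the actual Theia code.
interface OpenAiModelDescription {
    model: string;
    id: string;
    url?: string;
    apiKey?: string | true;
    disableStreaming?: boolean;
}

// Streaming stays the default; only an explicit opt-out switches the
// client over to the non-streaming API.
function useStreamingApi(desc: OpenAiModelDescription): boolean {
    return desc.disableStreaming !== true;
}
```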

Follow-ups

We need to support configuring max_token, as different orchestrators stop at different lengths.


Member

@sdirix sdirix left a comment


Thanks for this hotfix.

In general we should make this configurable per (custom) model with an additional attribute.

@JonasHelming
Contributor

It is not only "custom" models; o1-preview also does not support streaming at the moment.
In general it probably makes sense to allow custom parameters per model on a global level.

@sdirix
Member

sdirix commented Nov 11, 2024

Middleground suggestion to get this in quickly:

  • For now we can manually keep the hard-coded non-streaming list of OpenAI models, as we maintain them manually anyway. This keeps the nicer UI for OpenAI models.
  • For all custom models we add a disableStreaming attribute, which is false by default
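The two bullets above could look roughly like this; the set contents (o1-preview comes from the comment earlier in this thread) and the helper name are illustrative assumptions, not the merged implementation:

```typescript
// Sketch of the suggested middle ground; names and list contents are
// illustrative assumptions, not the actual Theia code.
const NON_STREAMING_OPENAI_MODELS = new Set<string>(['o1-preview']);

interface ModelConfig {
    model: string;
    disableStreaming?: boolean;
}

function isStreamingEnabled(cfg: ModelConfig, isCustomModel: boolean): boolean {
    if (!isCustomModel) {
        // Official models: consult the manually maintained list.
        return !NON_STREAMING_OPENAI_MODELS.has(cfg.model);
    }
    // Custom models: disableStreaming defaults to false (streaming on).
    return cfg.disableStreaming !== true;
}
```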

Custom OpenAI models can now be configured with
'disableStreaming: true' to indicate that streaming shall not be used.
This is especially useful for models which do not support streaming at
all.

Co-authored-by: Matthew Khouzam <matthew.khouzam@ericsson.com>
@sdirix
Member

sdirix commented Nov 12, 2024

Adapted the PR. @MatthewKhouzam can you check whether this works for you?

@sdirix
Member

sdirix commented Nov 15, 2024

@MatthewKhouzam Did you have a chance to look at the changes? Can we merge?


@TheMatthew TheMatthew left a comment


I am OK with this... but should it be enableStreaming? I'm imagining tech support saying "yeah, go ahead and enable disableStreaming" and people having a hard time.

@sdirix
Member

sdirix commented Nov 18, 2024

It's a bit more error-prone when a boolean is true by default, as a simple JavaScript check like if (foo.bar) will be falsy when the value does not exist. However, that should likely not dictate our user interface. So I'm fine with switching the naming and default around, i.e. enableStreaming, which defaults to true.
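The pitfall described here fits in a couple of lines of plain TypeScript, independent of any Theia code:

```typescript
interface Config {
    enableStreaming?: boolean; // intended default: true
}

const cfg: Config = {}; // the user never set the flag

// Naive truthiness check: undefined is falsy, so streaming is wrongly off.
const naive = cfg.enableStreaming ? 'stream' : 'no-stream';

// Applying the default explicitly restores the intended behavior.
const safe = (cfg.enableStreaming ?? true) ? 'stream' : 'no-stream';
```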

@MatthewKhouzam
Contributor Author

I want to be clear: I approved on my side. Anything else left to do?

@sdirix sdirix merged commit 09dfb23 into eclipse-theia:master Nov 25, 2024
11 checks passed
@github-actions github-actions bot added this to the 1.56.0 milestone Nov 25, 2024