
[BUG] Streaming chat for ChatOpenAI Custom #1081

Closed
yavisht opened this issue Oct 18, 2023 · 3 comments
Labels
bug (Something isn't working), enhancement (New feature or request)

Comments

yavisht (Contributor) commented Oct 18, 2023

Describe the bug
Streaming chat does not work on ChatOpenAI Custom

To Reproduce
[image attachment]

Expected behavior
Messages should stream back token by token from the chat widget. Instead, the whole message loads at once.

Screenshots

Screen.Recording.2023-10-18.at.12.27.44.pm.mov

Flow
See screenshot

Additional context
If I change the node to ChatOpenAI, messages stream back as expected.

@HenryHengZJ HenryHengZJ added enhancement New feature or request bug Something isn't working labels Dec 7, 2023
billsecond commented

Same here.

lswencn commented Aug 16, 2024

I also encountered this issue. How can it be resolved?

Wowso (Contributor) commented Sep 10, 2024

There is a workaround.

After the prediction API is executed, the Flowise server checks whether the Chat Model supports streaming via the isFlowValidForStream function.

The chatOpenAICustom model is missing from the streamAvailableLLMs variable, so the check fails for that node. Temporarily adding it enables streaming.

I'm not sure why only the chatOpenAICustom model is excluded.

export const isFlowValidForStream = (reactFlowNodes: IReactFlowNode[], endingNodeData: INodeData) => {
    const streamAvailableLLMs = {
        'Chat Models': [
            'azureChatOpenAI',
            'chatOpenAI',
            'chatOpenAI_LlamaIndex',
            'chatOpenAICustom', // Add this entry to enable streaming
            'chatAnthropic',
            'chatAnthropic_LlamaIndex',
            'chatOllama',
            'chatOllama_LlamaIndex',
            'awsChatBedrock',
            'chatMistralAI',
            'chatMistral_LlamaIndex',
            'groqChat',
            'chatGroq_LlamaIndex',
            'chatCohere',
            'chatGoogleGenerativeAI',
            'chatTogetherAI',
            'chatTogetherAI_LlamaIndex',
            'chatFireworks',
            'chatBaiduWenxin'
        ],
        LLMs: ['azureOpenAI', 'openAI', 'ollama']
    }
    // ...rest of the function unchanged
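The workaround above boils down to a membership check: streaming is gated on whether the ending node's name appears in the list. A minimal, self-contained sketch of that lookup follows; the function name isStreamAvailable and the trimmed list are illustrative assumptions, not Flowise's actual code, and the real isFlowValidForStream also inspects the rest of the flow graph.

```typescript
// Hypothetical sketch of the lookup that gates streaming. Only the
// list-membership step is modeled here; names are illustrative.
const streamAvailableLLMs: Record<string, string[]> = {
    'Chat Models': [
        'azureChatOpenAI',
        'chatOpenAI',
        'chatOpenAICustom' // without this entry the check below returns false
    ],
    LLMs: ['azureOpenAI', 'openAI', 'ollama']
}

function isStreamAvailable(category: string, nodeName: string): boolean {
    // Unknown categories fall back to an empty list rather than throwing.
    return (streamAvailableLLMs[category] ?? []).includes(nodeName)
}

console.log(isStreamAvailable('Chat Models', 'chatOpenAICustom')) // true once the entry is added
console.log(isStreamAvailable('Chat Models', 'someOtherNode'))    // false
```

This makes the reported behavior easy to see: before the fix, 'chatOpenAICustom' was simply absent from the array, so the server fell back to a non-streaming response even though the underlying model could stream.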

Wowso pushed a commit to Wowso/Flowise that referenced this issue Sep 12, 2024
* Fixed issue preventing chatOpenAICustom model from streaming
* Updated streamAvailableLLMs to include chatOpenAICustom model
HenryHengZJ pushed a commit that referenced this issue Sep 14, 2024
Bugfix/chatOpenAICustom model not streaming (#1081)
* Fixed issue preventing chatOpenAICustom model from streaming
* Updated streamAvailableLLMs to include chatOpenAICustom model

Co-authored-by: AnJinSu <anjinsu96@handysoft.co.kr>