[BUG] Streaming chat for ChatOpenAI Custom #1081
Comments
Same here.
I also encountered this issue. How can it be resolved?
There is a workaround. After the prediction API is executed, the Flowise server uses the isFlowValidForStream function to check whether the Chat Model supports streaming. The chatOpenAICustom model is missing from the streamAvailableLLMs variable; temporarily adding it enables streaming. I'm not sure why only the chatOpenAICustom model is excluded.
Flowise/packages/server/src/utils/index.ts Line 1188 in 56f9208
const streamAvailableLLMs = {
    'Chat Models': [
        'azureChatOpenAI',
        'chatOpenAI',
        'chatOpenAI_LlamaIndex',
        'chatOpenAICustom', // added so ChatOpenAI Custom is recognised as stream-capable
        'chatAnthropic',
        'chatAnthropic_LlamaIndex',
        'chatOllama',
        'chatOllama_LlamaIndex',
        'awsChatBedrock',
        'chatMistralAI',
        'chatMistral_LlamaIndex',
        'groqChat',
        'chatGroq_LlamaIndex',
        'chatCohere',
        'chatGoogleGenerativeAI',
        'chatTogetherAI',
        'chatTogetherAI_LlamaIndex',
        'chatFireworks',
        'chatBaiduWenxin'
    ],
    LLMs: ['azureOpenAI', 'openAI', 'ollama']
}
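For context, here is a minimal sketch of how such a whitelist check could work. The names streamAvailableLLMs and isFlowValidForStream come from the linked Flowise source; the isStreamAvailable helper and its logic below are my own simplified illustration of the lookup, not the actual implementation (the real isFlowValidForStream also inspects the flow's ending node).

// Hypothetical, simplified version of the streaming check described above.
const streamAvailableLLMs: Record<string, string[]> = {
    'Chat Models': ['azureChatOpenAI', 'chatOpenAI', 'chatOpenAICustom' /* ... */],
    LLMs: ['azureOpenAI', 'openAI', 'ollama']
}

function isStreamAvailable(nodeCategory: string, nodeName: string): boolean {
    // A node can stream only if its name appears in the whitelist for its category.
    // A missing entry silently disables streaming, which is why chatOpenAICustom
    // fell back to returning the whole message at once.
    return (streamAvailableLLMs[nodeCategory] ?? []).includes(nodeName)
}

// Before the fix this returns false; after adding 'chatOpenAICustom' it returns true.
console.log(isStreamAvailable('Chat Models', 'chatOpenAICustom'))

In this sketch the check is a plain whitelist lookup, which matches the behaviour reported above: any node name left out of the list gets non-streaming responses with no error.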
Bugfix/chatOpenAICustom model not streaming (#1081)
* Fixed issue preventing chatOpenAICustom model from streaming
* Updated streamAvailableLLMs to include chatOpenAICustom model
Co-authored-by: AnJinSu <anjinsu96@handysoft.co.kr>
Describe the bug
Streaming chat does not work on ChatOpenAI Custom
To Reproduce
Expected behavior
Messages should stream back from the chat widget token by token; instead, the whole message loads at once.
Screenshots
Screen.Recording.2023-10-18.at.12.27.44.pm.mov
Flow
See screenshot
Additional context
If I change the Chat Model node to ChatOpenAI, messages stream back as expected.