-
We have a Flask + LangChain based app using a locally deployed Langfuse pod, and normally everything works just great.
-
-
This is awesome, increasing threads and buffer really helped!
Hey there, @sternr! I'm here to assist you with any bugs, questions, or contributions. Let's tackle this issue together!
I found a similar closed issue that might be relevant to your situation: "bug: ERROR:langfuse:API error occurred:". The suggested solution in that case was to upgrade LangChain and switch to the recommended import:
from langchain_openai import AzureChatOpenAI
[1]. For handling high concurrency and buffering in Langfuse, you can adjust several configurations in the client library:
Concurrency and Buffering Settings:
LANGFUSE_THREADS: Increase the number of consumer threads to handle more concurrent requests. The default is 1, but you can increase it if you encounter scaling is…