In the case of a non-streaming response, would a long-running LangChain chain run until completion? Say, a 20-minute chain? I'm aware usage is billed on CPU time and not wall time, but I'm curious whether there's a timeout at all. This would be a differentiating factor from others; for example, Supabase Edge Functions IIRC have a wall-clock timeout of 1 minute, though it's configurable if self-hosting.
If Vercel is offering Edge Functions with CPU-time pricing and no limit on wall-clock time, this would be KILLER for OpenAI use cases, even more than people think, since it would be perfect for long chains that don't require streaming, i.e. post-processing with LangChain chains while only paying for CPU time.
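To make the scenario concrete, here is a minimal sketch of the kind of non-streaming edge handler I have in mind. This is not real LangChain code: `runPostProcessingChain` is a hypothetical stand-in for a chain whose time is dominated by awaiting LLM responses (idle wall time) rather than CPU work, which is exactly why CPU-time billing plus no wall-clock cap would matter.

```typescript
// Hypothetical sketch: a non-streaming handler that awaits a long chain
// before returning a single response. `runPostProcessingChain` is a
// placeholder, not a real LangChain API.

export const config = { runtime: "edge" };

// Stand-in for a LangChain chain. Each step mostly waits on the network
// (an LLM call), so wall time is long but billed CPU time stays small.
async function runPostProcessingChain(input: string): Promise<string> {
  const steps = ["summarize", "classify", "extract"];
  let result = input;
  for (const step of steps) {
    // Placeholder for an LLM call; the await is idle wall-clock time.
    await new Promise((resolve) => setTimeout(resolve, 10));
    result = `${step}(${result})`;
  }
  return result;
}

export default async function handler(req: Request): Promise<Response> {
  const { input } = await req.json();
  // No streaming: the connection stays open until the whole chain
  // finishes, so any wall-clock timeout is the limiting factor here.
  const output = await runPostProcessingChain(input);
  return new Response(JSON.stringify({ output }), {
    headers: { "content-type": "application/json" },
  });
}
```

If there is a hard wall-clock limit, a chain like this would be cut off mid-run regardless of how little CPU time it consumed, which is the crux of the question above.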
@jaredpalmer