-
- I am a bit confused about what the OS has to do with this; are you noticing it creating more sockets than you'd expect? To verify whether that is true, it'd be good if you included a reproducible example - I suspect that there may be a misunderstanding of HTTP/2. If your goal is to use fewer connections and by doing so increase RPS, then HTTP/2 is the right choice -- but so is async. The async API is all about maximizing the performance of one thread: while one task is waiting for a response, another can be sending a request or even simultaneously waiting for another response on the same socket (at least in theory of how HTTP/2 works; I don't know enough about HTTPX's internals to say whether that is allowed). I disagree about the async API being hard to debug in Python; in fact, the opposite is true because of cooperative scheduling. Far fewer race conditions are possible in your application when using async. Threading is notoriously difficult to get right: multithreading.mp4
-
- My goal is to use fewer connections while getting more requests per second (or maximizing request speed per connection).
I tried using the normal client to send streaming requests and then reading the responses in other threads, but it looks like the OS does not like this.
I just don't want to use the async API because it is quite hard to debug in Python.