imc-dispatcher dial tcp <IP>: connect: cannot assign requested address #4461
Comments
/assign
Related? cloudevents/sdk-go#98
Looks like by default, the …
Yeah, that's what I was suspecting; my feeling is that we set this parameter too high.
So the actual limit config is:
Which means it can open a maximum of 1000 sockets, which I guess is not enough to leak all the sockets on the machine... Let me try to figure out whether we always use the same HTTP client, or whether we create new ones somewhere.
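For context, here is a minimal sketch of the kind of transport limits being discussed. The snippet the comment refers to was not captured in this thread, so the field choices and values below are assumptions, not the actual eventing configuration:

```go
package transportlimits

import (
	"net/http"
	"time"
)

// newLimitedTransport sketches an http.Transport with explicit connection caps.
// With limits in this range, a single transport cannot by itself exhaust the
// node's ephemeral ports, which is why the discussion moves on to whether
// multiple clients/transports are being created.
func newLimitedTransport() *http.Transport {
	return &http.Transport{
		MaxIdleConns:        1000,             // cap on idle (keep-alive) connections kept across all hosts
		MaxIdleConnsPerHost: 100,              // cap on idle connections kept per destination host
		MaxConnsPerHost:     1000,             // hard cap on total connections per destination host
		IdleConnTimeout:     30 * time.Second, // release idle sockets back to the OS after a while
	}
}
```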
Fix #4461: now every new message sender reuses the same underlying client whenever possible. Commits in the fix PR:
* Fix #4461 Now every new message sender always reuse the same underlying client, whenever possible
* Increase coverage
* Brought back the previous method to avoid breakage
* Now we use nice language
* Removed useless test
* Suggestions
* Imports job
* nit
* Fancy ut
* Copyright
(each commit Signed-off-by: Francesco Guardiani <francescoguard@gmail.com>)
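For readers following along, here is a minimal sketch of the idea behind the fix, not the actual eventing code: share one http.Client so its transport can pool and reuse TCP connections, instead of building a new client per message send. The names sendEvent and sharedClient are illustrative only.

```go
package sender

import (
	"bytes"
	"io"
	"net/http"
)

// One shared client: its underlying Transport keeps a pool of TCP connections
// and reuses them across sends.
var sharedClient = &http.Client{}

// sendEvent posts an event payload using the shared client. Constructing a
// fresh &http.Client{Transport: &http.Transport{}} inside this function on
// every call would give each send its own connection pool, so no connection
// would ever be reused.
func sendEvent(url string, body []byte) error {
	resp, err := sharedClient.Post(url, "application/json", bytes.NewReader(body))
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	// Drain the body so the connection can be returned to the pool for reuse.
	_, err = io.Copy(io.Discard, resp.Body)
	return err
}
```

Note that a plain &http.Client{} with a nil Transport already falls back to the shared http.DefaultTransport; the leak pattern typically comes from constructing a new Transport (or a client wrapping one) per request.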
Describe the bug
After about a day of cluster runtime, with brokers backed by InMemoryChannels deployed (at the time of the errors there were 24 brokers on the cluster, each with a single trigger) and events with replies being sent, I noticed the following errors in the imc-dispatcher logs:
"dial tcp <IP>: connect: cannot assign requested address"
in errors like these:
{"level":"error","ts":"2020-11-04T18:51:54.205Z","logger":"inmemorychannel-dispatcher","caller":"fanout/fanout_message_handler.go:189","msg":"Fanout had an error","error":"failed to forward reply to http://broker-ingress.knative-eventing.svc.cluster.local/foo10/broker: Post \"http://broker-ingress.knative-eventing.svc.cluster.local/foo10/broker\": dial tcp 172.30.107.69:80: connect: cannot assign requested address","stacktrace":"knative.dev/eventing/pkg/channel/fanout.(*MessageHandler).dispatch\n\t/opt/app-root/src/go/src/knative.dev/eventing/pkg/channel/fanout/fanout_message_handler.go:189\nknative.dev/eventing/pkg/channel/fanout.createMessageReceiverFunction.func1.1\n\t/opt/app-root/src/go/src/knative.dev/eventing/pkg/channel/fanout/fanout_message_handler.go:143"}
{"level":"error","ts":"2020-11-04T18:52:02.334Z","logger":"inmemorychannel-dispatcher","caller":"fanout/fanout_message_handler.go:189","msg":"Fanout had an error","error":"unable to complete request to http://broker-filter.knative-eventing.svc.cluster.local/triggers/foo6/counter/6c5ab867-5345-449a-aef7-7023693bf821: Post \"http://broker-filter.knative-eventing.svc.cluster.local/triggers/foo6/counter/6c5ab867-5345-449a-aef7-7023693bf821\": dial tcp 172.30.166.136:80: connect: cannot assign requested address","stacktrace":"knative.dev/eventing/pkg/channel/fanout.(*MessageHandler).dispatch\n\t/opt/app-root/src/go/src/knative.dev/eventing/pkg/channel/fanout/fanout_message_handler.go:189\nknative.dev/eventing/pkg/channel/fanout.createMessageReceiverFunction.func1.1\n\t/opt/app-root/src/go/src/knative.dev/eventing/pkg/channel/fanout/fanout_message_handler.go:143"}
suggesting that sockets may be leaking
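To make the suspected failure mode concrete, here is an illustrative-only Go sketch (not taken from the dispatcher code) of what creating a new Transport per request does: every request opens a brand-new TCP connection, closed connections pile up in TIME_WAIT, and once the ephemeral port range is exhausted, connect() starts failing with exactly this "cannot assign requested address" error. The target URL is copied from the logs above purely for illustration.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Taken from the error logs above, purely for illustration.
	const target = "http://broker-ingress.knative-eventing.svc.cluster.local/foo10/broker"

	for i := 0; i < 100000; i++ {
		// A new Transport per iteration means a new connection pool per request:
		// nothing is ever reused, and old sockets linger until the OS reclaims them.
		client := &http.Client{Transport: &http.Transport{}}

		resp, err := client.Post(target, "application/json", nil)
		if err != nil {
			// After enough iterations this is where
			// "dial tcp <IP>:80: connect: cannot assign requested address" shows up.
			fmt.Println("send failed:", err)
			continue
		}
		io.Copy(io.Discard, resp.Body)
		resp.Body.Close()
	}
}
```

On an affected node, a summary tool such as `ss -s` would likely show a large and growing number of TIME_WAIT sockets while the dispatcher is under load.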
Expected behavior
"cannot assign requested address" errors should not appear
To Reproduce
Currently unknown
Knative release version
0.17.2
Additional context