
Emit event if configured queue length is reached #11207

Open

ansd opened this issue May 10, 2024 · 1 comment

Comments

ansd (Member) commented May 10, 2024

Is your feature request related to a problem? Please describe.

An operator would like to receive a notification when a queue limit (e.g. a configured max-length or max-length-bytes) is reached, and possibly another notification once the queue is emptied again.

Describe the solution you'd like

The queue emits an event to rabbit_event so that these events can be consumed via the event exchange.
To avoid creating an excessive number of events, no event should be emitted per message that is rejected / dropped / dead-lettered. Instead, one event should be emitted when the queue limit is reached, and another when the queue length falls back below some threshold (e.g. 90% of the limit). A consumer of such events could look roughly like the sketch below.
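A minimal sketch of such a consumer, assuming the rabbitmq_event_exchange plugin is enabled; the routing keys queue.limit.reached / queue.limit.cleared are placeholders, since no event names exist yet:

```python
# Hypothetical consumer for the proposed queue-limit events, via the
# amq.rabbitmq.event topic exchange provided by the event exchange plugin.
# The routing keys "queue.limit.reached" / "queue.limit.cleared" are
# placeholders for whatever names an implementation ends up using.
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()

# Bind an exclusive, auto-named queue to the event exchange.
result = channel.queue_declare(queue="", exclusive=True)
event_queue = result.method.queue
channel.queue_bind(exchange="amq.rabbitmq.event",
                   queue=event_queue,
                   routing_key="queue.limit.#")

def on_event(ch, method, properties, body):
    # Event metadata (queue name, vhost, configured limit, ...) would arrive
    # in the message headers, as with the existing queue.created events.
    print(method.routing_key, properties.headers)

channel.basic_consume(queue=event_queue, on_message_callback=on_event, auto_ack=True)
channel.start_consuming()
```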

Describe alternatives you've considered

An alternative is to dead-letter messages with reason maxlen, consume from the dead-letter queue, and have the client app create an alert (see the sketch below). However, this approach doesn't work for the reject-publish overflow behaviour, because rejected messages never enter the queue and are therefore not dead-lettered.
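For illustration, a minimal sketch of that workaround with pika; queue and exchange names are just examples:

```python
# Sketch of the dead-letter workaround: messages dropped because the queue is
# full are dead-lettered with reason "maxlen" and can be consumed from a
# separate queue to raise an alert. Names ("orders", "dlx", ...) are examples.
# Note: with "x-overflow": "reject-publish" nothing is dead-lettered, so this
# approach does not apply there.
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()

# Dead-letter exchange and the queue that collects dropped messages.
channel.exchange_declare(exchange="dlx", exchange_type="fanout")
channel.queue_declare(queue="overflow-alerts")
channel.queue_bind(exchange="dlx", queue="overflow-alerts")

# The limited queue: oldest messages are dropped once x-max-length is
# exceeded and routed to the DLX with reason "maxlen".
channel.queue_declare(
    queue="orders",
    arguments={
        "x-max-length": 10000,
        "x-overflow": "drop-head",
        "x-dead-letter-exchange": "dlx",
    },
)
```

A consumer of overflow-alerts would then inspect the x-death header, whose reason field is maxlen, and raise the alert from there.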

Additional context

It's already possible today to create alerts when a queue's depth reaches a specific threshold, see for example

However, this issue is specifically about alerts for the configured max-length or max-length-bytes being reached.

SimonUnge (Member) commented

@ansd

We have been thinking about something like this too and would be happy to look into it. We have some more alerts we would like to get in, for other limits the user nears / exceeds / drops back below, etc. Will create a little discussion and see what you think!
