
Bound TXN queue lengths #16

Merged
domodwyer merged 5 commits into development from bugfix/jameinel-max-txn-queue-length on Jul 5, 2017

Conversation

domodwyer

See go-mgo#463.

Provides an opt-in failsafe upper bound on the length of the transaction queue. Thanks again to @jameinel for the hard work.
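
For reference, a minimal sketch of how a caller might opt in, assuming the bound is exposed through a RunnerOptions struct with a MaxTxnQueueLength field and a Runner.SetOptions method (the import path, option name, and method name here are assumptions based on the globalsign/mgo fork, not confirmed by this page):

```go
package example

import (
	"github.com/globalsign/mgo"     // assumed fork import path
	"github.com/globalsign/mgo/txn" // transaction runner package
)

// newBoundedRunner builds a txn.Runner whose per-document txn-queue
// is capped more strictly than the default of 1000 entries.
func newBoundedRunner(db *mgo.Database) *txn.Runner {
	runner := txn.NewRunner(db.C("txns"))

	// Assumed API: start from the defaults and tighten the cap.
	opts := txn.DefaultRunnerOptions()
	opts.MaxTxnQueueLength = 200
	runner.SetOptions(opts)

	return runner
}
```

Because the option is opt-in, callers that never touch it keep the existing default behaviour.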

jameinel and others added 4 commits July 4, 2017 12:54
When we have broken transaction data in the database (such as from mongo getting OOM-killed), it can cause a cascading failure where the affected document ends up with too many transactions queued against it.

This can also happen if you have nothing but assert-only transactions against a
single document.

With lots of queued transactions it becomes harder and harder to add new entries, and clearing out a large queue is O(N^2), which makes capping it worthwhile. (A long queue also makes the document grow until it hits the maximum document size.)

The upper bound is still quite large, so it should not be triggered if
everything is operating normally.
The limit still defaults to 1000 without any other configuration, but callers can now choose to be stricter or more lenient.
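
As a purely illustrative sketch of what such a cap amounts to (the helper name, signature, and error text below are hypothetical, not the fork's internals), the transaction preparer only needs to refuse to queue another token once a document's txn-queue has reached the configured limit:

```go
package example

import "fmt"

// checkQueueBound is a hypothetical helper: given the tokens already
// queued against a document and the configured limit (0 means
// unlimited), it reports whether one more transaction may be added.
func checkQueueBound(queue []string, maxTxnQueueLength int) error {
	if maxTxnQueueLength > 0 && len(queue) >= maxTxnQueueLength {
		return fmt.Errorf("txn-queue has too many transactions (%d)", len(queue))
	}
	return nil
}
```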
domodwyer changed the title from Bugfix/jameinel max txn queue length to Bound TXN queue lengths on Jul 5, 2017
domodwyer merged commit 73a9463 into development on Jul 5, 2017
domodwyer deleted the bugfix/jameinel-max-txn-queue-length branch on Jul 5, 2017 at 13:30
domodwyer mentioned this pull request on Jul 26, 2017
libi pushed a commit to libi/mgo that referenced this pull request on Dec 1, 2022:
…txn-queue-length (Bound TXN queue lengths)