Hotfix to solve a backup stream limit Issue #816
Conversation
…to consider header size
CLA Assistant Lite bot: All contributors have signed the CLA ✍️ ✅
Can you please provide a detailed description of what was done in this PR?
I have read the CLA Document and I hereby sign the CLA
recheck
Sorry @vcastellm, I had a problem with my PR and lost permission to update it with comments and a description. I restored this permission yesterday, and today I have provided a description for it.
Codecov Report
@@ Coverage Diff @@
## develop #816 +/- ##
===========================================
+ Coverage 52.45% 52.46% +0.01%
===========================================
Files 132 132
Lines 17520 17520
===========================================
+ Hits 9190 9192 +2
+ Misses 7661 7659 -2
Partials 669 669
LGTM
Description
We're trying to back up our blockchain, but at some point we get the following error:
rpc error: code = ResourceExhausted desc = grpc: received message larger than max (4194312 vs. 4194304)
We ran some tests and found that one interval of blocks, 32000 to 36981, returns this error every single time. Looking through the code, we found that the read limit for a stream is 4MB, but the write limit is 21MB. From the error message, we understand that the backup process is receiving more than 4MB.

We checked the block-appending function; it validates the size of the blocks that will be written to the stream:
This code only checks that the sum of the blocks' sizes is less than 4MB. But if you look at the ExportEvent Protobuf struct, besides the blocks we also send 3 uint64 fields (which I'm calling the header) in the stream. Using the Go documentation, we realized that these 3 fields add 24 bytes (3 * 8) to the stream payload (polygon-edge uses Protobuf, and this is the maximum possible size for these 3 fields). So the previous validation may leave those 24 bytes unaccounted for and accept appending more blocks than the stream will support.

The solution I propose here is to add a const maxHeaderInfoSize with the max size of those fields (the header) and change the validation to use it.

Changes include
Checklist
Testing
Manual tests
The manual test we did was to retry the backup process on our blockchain and check that it finished without problems.
This solution works like a charm!