
always close cache warm chan to prevent blocking #14080

Merged
kasey merged 4 commits into develop from fix-cache-wait-hang on Jun 4, 2024
Conversation

kasey
Contributor

@kasey kasey commented Jun 4, 2024

What type of PR is this?

Bug fix

What does this PR do? Why is it needed?

5.0.4-rc.1 has a bug where invalid blobs in blob storage can cause the cache warmup code to fail to close the channel that other users of the cache wait on as a signal that the cache is ready. As a result, initial-sync (which does not use a timeout when requesting the cache) hangs indefinitely when there are truncated blobs in storage.

This PR fixes the issue by ensuring the sentinel channel is always closed, even if the cache warmup call to the pruning func produces an error.

@kasey kasey requested a review from a team as a code owner June 4, 2024 21:03
@@ -93,13 +93,15 @@ func windowMin(latest, offset primitives.Slot) primitives.Slot {
func (p *blobPruner) warmCache() error {
p.Lock()
defer p.Unlock()
defer func() {
Contributor


Does the order of the defer above, relative to the unlock, matter? I suppose if it unlocks here, this other defer is updating p.warmed without the lock?

Contributor Author


Defers always run LIFO, so this code is correct (the unlock happens after the channel is closed). But I anticipated this might confuse someone and had considered putting the unlock in the other deferred function, so I'll go ahead and do that.
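The LIFO ordering described above can be demonstrated with a small Go sketch (the function and event names are illustrative, not from the PR):

```go
package main

import "fmt"

// order records the sequence in which deferred calls run. Defers execute
// LIFO: the last-registered defer runs first. This mirrors the original
// code, where Unlock was deferred before the channel close, so the close
// still happens while the lock is held.
func order() (events []string) {
	defer func() { events = append(events, "unlock") }() // registered first, runs last
	defer func() { events = append(events, "close") }()  // registered last, runs first
	return
}

func main() {
	fmt.Println(order()) // [close unlock]
}
```

The named return value is important here: it lets the deferred funcs append to the slice that is actually returned.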

prestonvanloon
prestonvanloon previously approved these changes Jun 4, 2024
james-prysm
james-prysm previously approved these changes Jun 4, 2024
@kasey kasey dismissed stale reviews from james-prysm and prestonvanloon via 8a75887 June 4, 2024 21:54
@kasey kasey added this pull request to the merge queue Jun 4, 2024
Merged via the queue into develop with commit ea2624b Jun 4, 2024
17 checks passed
@kasey kasey deleted the fix-cache-wait-hang branch June 4, 2024 22:15
prestonvanloon pushed a commit that referenced this pull request Jun 5, 2024
* always close cache warm chan to prevent blocking

* test that waitForCache does not block

* combine defers to reduce cognitive overhead

* lint

---------

Co-authored-by: Kasey Kirkham <kasey@users.noreply.github.com>