
fix: fix log producer stopping block indefinitely #1783

Closed
wants to merge 1 commit

Conversation

lefinal
Contributor

@lefinal lefinal commented Oct 19, 2023

Fixes an issue where stopping the log producer would block indefinitely if the log producer had already exited.

What does this PR do?

Fixes the indefinite block by closing the stopProducer channel instead of requiring a read on it.
There is no synchronization waiting for the log producer to finish running anyway, so this should be fine.

Why is it important?

If the log producer exits due to an exited container, StopLogProducer will hang indefinitely.

Related issues

Closes #1407
Closes #1669

How to test this PR

Test has been added.

Follow-ups

stopProducer is currently not safe for concurrent use. Some synchronization could be added in the future.
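One possible shape for that follow-up is a sync.Once guard, since closing an already-closed channel panics. This is only a sketch under that assumption; the type and method names are hypothetical, not the library's API.

```go
package main

import (
	"fmt"
	"sync"
)

// logProducer is an illustrative stand-in for the real producer type.
// stopOnce guards close(stop) so concurrent or repeated Stop calls
// cannot panic with "close of closed channel".
type logProducer struct {
	stopOnce sync.Once
	stop     chan struct{}
}

func newLogProducer() *logProducer {
	return &logProducer{stop: make(chan struct{})}
}

// Stop is safe to call from multiple goroutines and more than once.
func (p *logProducer) Stop() {
	p.stopOnce.Do(func() { close(p.stop) })
}

func main() {
	p := newLogProducer()
	var wg sync.WaitGroup
	for i := 0; i < 10; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			p.Stop() // concurrent Stop calls are safe
		}()
	}
	wg.Wait()
	fmt.Println("stopped exactly once, no panic")
}
```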

@lefinal lefinal requested a review from a team as a code owner October 19, 2023 06:28
@netlify

netlify bot commented Oct 19, 2023

Deploy Preview for testcontainers-go ready!

🔨 Latest commit: ca04965
🔍 Latest deploy log: https://app.netlify.com/sites/testcontainers-go/deploys/654abe763e9b53000818d968
😎 Deploy Preview: https://deploy-preview-1783--testcontainers-go.netlify.app

@mdelapenya
Member

Hey @lefinal thanks for this! Do you think it is related to #1777?

@lefinal
Contributor Author

lefinal commented Oct 19, 2023

Hey @lefinal thanks for this! Do you think it is related to #1777?

The mentioned issue regarding panics is not the same. However, the changes from #1777 will accomplish the same thing. The author just uses context.Context and context.CancelFunc instead of closing the channel. Both approaches will avoid StopLogProducer hanging.

@mdelapenya
Member

Hey @lefinal thanks for this! Do you think it is related to #1777?

The mentioned issue regarding panics is not the same. However, the changes from #1777 will accomplish the same thing. The author just uses context.Context and context.CancelFunc instead of closing the channel. Both approaches will avoid StopLogProducer hanging.

So, if you don't mind 🙏 , once we merge that one (already reviewed, tested and commented on by the community), we can close this one.

@lefinal
Contributor Author

lefinal commented Oct 19, 2023

Hey @lefinal thanks for this! Do you think it is related to #1777?

The mentioned issue regarding panics is not the same. However, the changes from #1777 will accomplish the same thing. The author just uses context.Context and context.CancelFunc instead of closing the channel. Both approaches will avoid StopLogProducer hanging.

So, if you don't mind 🙏 , once we merge that one (already reviewed, tested and commented on by the community), we can close this one.

Absolutely! Only wanted to provide a quick fix in case #1777 takes longer to review and merge :)

@mdelapenya
Member

Only wanted to provide a quick fix in case #1777 takes longer to review and merge

Would you mind resolving the conflicts if possible? 🙏 I think this fix is easier to merge than #1777

Fixes an issue where stopping the log producer would block indefinitely if the log producer has exited before.

Closes testcontainers#1407, testcontainers#1669
@lefinal
Contributor Author

lefinal commented Nov 7, 2023

Only wanted to provide a quick fix in case #1777 takes longer to review and merge

Would you mind resolving the conflicts if possible? 🙏 I think this fix is easier to merge than #1777

Done :)

iameli added a commit to livepeer/catalyst that referenced this pull request Nov 27, 2023
We're currently hitting this issue testcontainers/testcontainers-go#1783
and the forked version fixes it. When that fix is upstreamed we may move
back to an official build.
iameli added a commit to livepeer/catalyst that referenced this pull request Nov 27, 2023
* [AUTO-COMMIT] Update `manifest.yaml`

Files changed:
M	manifest.yaml

* video: add recording of video+ prefixes

* Makefile: remove old docker-local dependency

* Dockerfile: download catalyst with curl

* ci: try making box

* ci: no frontend in tests

* go.mod: use upgraded, forked testcontainers-go

We're currently hitting this issue testcontainers/testcontainers-go#1783
and the forked version fixes it. When that fix is upstreamed we may move
back to an official build.

* ci: disable ghcr.io repos

they're only there as a backup and they result in this alllll the time

ERROR: failed to solve: failed to compute cache key: failed to copy: read tcp 172.17.0.2:44614->20.60.63.1:443: read: connection timed out
Error: buildx failed with: ERROR: failed to solve: failed to compute cache key: failed to copy: read tcp 172.17.0.2:44614->20.60.63.1:443: read: connection timed out

* Dockerfile: bypass cloudflare for catalyst pull

---------

Co-authored-by: livepeer-docker <livepeer-docker@users.noreply.github.com>
@jrfeenst

Any updates on this?

@mdelapenya
Member

I think #1971 could be related to this

@mdelapenya
Member

@lefinal is #1971 superseding this one? Could you please verify that? 🙏

@mdelapenya
Member

Hey @lefinal did you check #1783 (comment)?

Please let me know if I can close this one 🙏

Cheers!

@mdelapenya
Member

I'm going to close this one, as it has been superseded by the current state of the main branch and by #2576

Thanks!

@mdelapenya mdelapenya closed this Jun 12, 2024