
New client option DisableLogStderr to disable expensive log stderr from subprocess #316

Open · magodo wants to merge 1 commit into main
Conversation

@magodo commented Aug 4, 2024

I have a lightweight go-plugin client that talks to a Terraform provider, mainly to import and read one or more Terraform resources from the provider.

I expected this to be an I/O-bound task rather than a CPU-intensive one, since the client and the provider spend most of their time waiting on cloud platform API calls. However, it turns out the CPU usage is also non-negligible.

The provider I'm talking to is terraform-provider-azurerm, which is quite verbose on initialization and prints a bunch of logs. After some benchmarking and profiling, I noticed that the goroutine logging stderr from the subprocess consumes about 20% of the CPU time:

(screenshot: CPU profile before the change)

With this PR disabling the stderr logging, especially the expensive parseJSON, we get about a 20% performance improvement:

(screenshot: CPU profile after the change)

Ideally, we would check whether the logger is an hclog.NullLogger and whether config.Stderr is io.Discard, instead of introducing this new option. However, I don't know of a way to do the first check, since the current default behavior for a nil config.Logger is to install a non-noop logger.
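
For illustration only, this is roughly what that implicit opt-out could look like; the isNoopLogger helper is hypothetical, since go-hclog does not expose a reliable way to detect the null logger, which is exactly why this PR falls back to an explicit option instead:

```go
package plugindemo

import (
	"io"

	hclog "github.com/hashicorp/go-hclog"
	plugin "github.com/hashicorp/go-plugin"
)

// isNoopLogger is a hypothetical helper: go-hclog does not currently expose
// a reliable way to tell whether a Logger is the null logger.
func isNoopLogger(l hclog.Logger) bool {
	// No reliable check is available today, so conservatively assume the
	// logger is live.
	return false
}

// shouldSkipStderrLogging sketches the "ideal" implicit opt-out described
// above: skip the stderr parsing only when both the logger and the Stderr
// sink are known to discard everything.
func shouldSkipStderrLogging(config *plugin.ClientConfig) bool {
	return config.Stderr == io.Discard && isNoopLogger(config.Logger)
}
```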

Also note that DisableLogStderr does not stop the goroutine from being spawned at all, since we still need to watch for EOF on stderr to tell whether the subprocess has exited.
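
For reference, here is a minimal sketch of how a client like mine would use the proposed option, assuming it lands as a boolean field on plugin.ClientConfig; the handshake values, plugin map, and provider binary path are placeholders:

```go
package main

import (
	"io"
	"os/exec"

	hclog "github.com/hashicorp/go-hclog"
	plugin "github.com/hashicorp/go-plugin"
)

func main() {
	client := plugin.NewClient(&plugin.ClientConfig{
		// Placeholder handshake; a real client uses the Terraform plugin
		// protocol's magic cookie and version.
		HandshakeConfig: plugin.HandshakeConfig{
			ProtocolVersion:  5,
			MagicCookieKey:   "TF_PLUGIN_MAGIC_COOKIE",
			MagicCookieValue: "<terraform plugin cookie>",
		},
		Plugins:          map[string]plugin.Plugin{ /* provider plugin implementation */ },
		Cmd:              exec.Command("./terraform-provider-azurerm"),
		AllowedProtocols: []plugin.Protocol{plugin.ProtocolGRPC},

		// This client does not need the provider's log output.
		Logger: hclog.NewNullLogger(),
		Stderr: io.Discard,

		// Proposed in this PR: skip the per-line JSON parsing and logging
		// of the subprocess's stderr stream.
		DisableLogStderr: true,
	})
	defer client.Kill()

	// ... Client(), Dispense() and the provider import/read calls as usual.
}
```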
