Splunk Hec Receiver - Memory Leak (Cont of #34886) #35294

Closed
brettplarson opened this issue Sep 19, 2024 · 3 comments · Fixed by #36146
Assignees
atoulme
Labels
bug (Something isn't working), receiver/splunkhec

Comments

@brettplarson

Component(s)

receiver/splunkhec

What happened?

Description

We are still seeing an issue with collector memory after upgrading to 0.109.0 with the fix. The behavior changed: we now see more memory in the stack than in the heap, although the heap still grows slowly over time. As before, removing the HEC receiver from the logs pipeline gets rid of the issue. This is a test cluster where I can reproduce the problem by sending metrics to the HEC receiver.

(Two memory profile screenshots were attached to the original issue.)

One clue is that the retained memory is all under startlogop, while the memory under startmetricsop looks normal. Perhaps the way HEC events are processed (as log events first) causes those operations to never end. Forgive my speculation :)
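For context, here is a minimal sketch (my own illustration using the collector's receiverhelper API, not the splunkhec receiver's actual code) of the obsreport pairing that startlogop/startmetricsop correspond to: each StartLogsOp is expected to be closed by a matching EndLogsOp, and an op that is started but never ended keeps its state alive.

```go
// Illustrative sketch only: the general obsreport pattern a receiver follows.
// The "startlogop" allocations in the profile come from StartLogsOp; they are
// released when the matching EndLogsOp runs.
package sketch

import (
	"context"

	"go.opentelemetry.io/collector/consumer"
	"go.opentelemetry.io/collector/pdata/plog"
	"go.opentelemetry.io/collector/receiver/receiverhelper"
)

func handleLogs(ctx context.Context, obsrecv *receiverhelper.ObsReport, ld plog.Logs, next consumer.Logs) error {
	ctx = obsrecv.StartLogsOp(ctx) // begins the operation seen as "startlogop" in profiles
	err := next.ConsumeLogs(ctx, ld)
	// If this call is skipped on some code path (for example, metric payloads
	// taking the log path), the started op is never ended and its state lingers.
	obsrecv.EndLogsOp(ctx, "splunk_hec", ld.LogRecordCount(), err)
	return err
}
```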

Steps to Reproduce

Send a large volume of metrics to a HEC receiver endpoint and profile the collector's memory (a rough load-generator sketch follows below).
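To make the reproduction concrete, here is a rough load-generator sketch (my own, not from the issue). It assumes a splunk_hec receiver listening on localhost:8088 and uses a placeholder token; adjust the URL, path, and token to match your configuration.

```go
// Rough load generator: floods a splunk_hec receiver with HEC metric events.
// Endpoint, path, and token below are assumptions.
package main

import (
	"bytes"
	"fmt"
	"log"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{Timeout: 5 * time.Second}
	url := "http://localhost:8088/services/collector" // assumed receiver endpoint

	for {
		// Splunk HEC metric event: "event" is the literal string "metric" and
		// the values live under fields as "metric_name:<name>".
		payload := fmt.Sprintf(`{"time":%d,"event":"metric","source":"loadgen","fields":{"metric_name:test.gauge":%d}}`,
			time.Now().Unix(), time.Now().UnixNano()%100)

		req, err := http.NewRequest(http.MethodPost, url, bytes.NewReader([]byte(payload)))
		if err != nil {
			log.Fatal(err)
		}
		req.Header.Set("Authorization", "Splunk 00000000-0000-0000-0000-000000000000") // placeholder token
		req.Header.Set("Content-Type", "application/json")

		resp, err := client.Do(req)
		if err != nil {
			log.Println("post failed:", err)
			time.Sleep(time.Second)
			continue
		}
		resp.Body.Close()
	}
}
```

With the collector's pprof extension enabled (it listens on localhost:1777 by default), the heap can then be captured with `go tool pprof http://<collector>:1777/debug/pprof/heap` to produce profiles like the ones attached above.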

Expected Result

Memory should not be retained this way.

Actual Result

Collector version

0.109.0

Environment information

Environment

OS: (e.g., "Ubuntu 20.04")
Compiler (if manually compiled): (e.g., "go 14.2")

OpenTelemetry Collector configuration

No response

Log output

No response

Additional context

We have also opened Splunk support case 3554107.

brettplarson added the bug (Something isn't working) and needs triage (New item requiring triage) labels on Sep 19, 2024

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@brettplarson (Author)

Just bumping this. Thank you!

atoulme removed the needs triage (New item requiring triage) label on Oct 2, 2024
atoulme self-assigned this on Oct 2, 2024
@atoulme (Contributor) commented on Oct 25, 2024

Looking into it now.

bogdandrutu pushed a commit that referenced this issue on Nov 3, 2024

#### Description
Fix memory leak by changing how we run obsreports for metrics and logs.

#### Link to tracking issue
Fixes #35294
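For reference, one plausible shape of "changing how we run obsreports for metrics and logs" is to account metric payloads with balanced metrics op calls rather than leaving them inside a logs op that never ends. This is a sketch only, not the actual diff from #36146:

```go
// Sketch only (not the code from #36146): metric payloads get their own
// balanced StartMetricsOp/EndMetricsOp instead of riding on a logs op.
package sketch

import (
	"context"

	"go.opentelemetry.io/collector/consumer"
	"go.opentelemetry.io/collector/pdata/pmetric"
	"go.opentelemetry.io/collector/receiver/receiverhelper"
)

func handleMetrics(ctx context.Context, obsrecv *receiverhelper.ObsReport, md pmetric.Metrics, next consumer.Metrics) error {
	ctx = obsrecv.StartMetricsOp(ctx)
	err := next.ConsumeMetrics(ctx, md)
	obsrecv.EndMetricsOp(ctx, "splunk_hec", md.DataPointCount(), err)
	return err
}
```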
sbylica-splunk pushed a commit to sbylica-splunk/opentelemetry-collector-contrib that referenced this issue on Dec 17, 2024 (same change as above; Fixes open-telemetry#35294)