
[processor/k8sattributes] does not immediately enhance logs with k8s.deployment.name #29305

Closed
libormezl opened this issue Nov 14, 2023 · 4 comments
Labels
bug, closed as inactive, priority:p2, processor/k8sattributes, Stale

Comments

@libormezl

Description

I am attempting to enhance logs from the OTLP receiver with metadata using the k8sattributes processor. After the collector restarts, I can see all of the metadata defined in the k8sattributes Helm preset except for k8s.deployment.name. However, after a few minutes (4–10), all of the defined metadata, including k8s.deployment.name, is present.

I am deploying through the Terraform Helm provider in daemonset mode. To activate the processor, I am using the default preset configuration.
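For context, this corresponds roughly to the following Helm values (a minimal sketch, assuming the opentelemetry-collector chart's documented `mode` and `presets.kubernetesAttributes` options; the Terraform helm provider passes these through its `values` argument):

```yaml
# Sketch of values.yaml for the opentelemetry-collector Helm chart.
# Option names follow the chart's documented presets; verify against your chart version.
mode: daemonset

presets:
  # Enables the k8sattributes processor with the chart's default metadata extraction
  kubernetesAttributes:
    enabled: true
```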

Component(s)

processor/k8sattributes

Helm version

0.73.0

Expected Result

To see k8s.deployment.name immediately after logging starts, together with the other metadata extracted by the preset.
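For reference, the preset renders a processor configuration along these lines (an approximation; check the chart's rendered output for the exact metadata list on your version):

```yaml
# Approximate k8sattributes config generated by the kubernetesAttributes preset
processors:
  k8sattributes:
    extract:
      metadata:
        - k8s.namespace.name
        - k8s.deployment.name   # the attribute that arrives late after a restart
        - k8s.statefulset.name
        - k8s.daemonset.name
        - k8s.cronjob.name
        - k8s.job.name
        - k8s.node.name
        - k8s.pod.name
        - k8s.pod.uid
        - k8s.pod.start_time
```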

@TylerHelmuth TylerHelmuth transferred this issue from open-telemetry/opentelemetry-helm-charts Nov 16, 2023
@TylerHelmuth TylerHelmuth changed the title k8sattributes processor does not immediately enhance logs with k8s.deployment.name [processor/k8sattributes] does not immediately enhance logs with k8s.deployment.name Nov 16, 2023
@TylerHelmuth TylerHelmuth added the bug, priority:p2, and processor/k8sattributes labels Nov 16, 2023
@trexx

trexx commented Dec 4, 2023

I also see this behaviour with 0.90.1.


github-actions bot commented Feb 5, 2024

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions github-actions bot added the Stale label Feb 5, 2024

github-actions bot commented Apr 5, 2024

This issue has been closed as inactive because it has been stale for 120 days with no activity.

@github-actions github-actions bot closed this as not planned Apr 5, 2024
@sebastianhaeni

Has this problem been discussed anywhere else? We run into it with metrics as well. When the collector restarts, the workload labels (i.e. k8s.deployment.name, k8s.statefulset.name, ...) are missing. Missing labels on metrics mean the time series identity changes, and sums of data points show large anomalies, leading to odd-looking graphs and false positives in alerting. (We map some k8s resource attributes to data point attributes, which may not be a common setup.)
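For illustration, such a resource-to-data-point mapping might look like this with the transform processor (a sketch; the specific attributes are examples, not an exact configuration):

```yaml
# Sketch: copy k8s workload resource attributes onto each metric data point,
# which makes them part of the time series identity.
processors:
  transform:
    metric_statements:
      - context: datapoint
        statements:
          - set(attributes["k8s.deployment.name"], resource.attributes["k8s.deployment.name"])
          - set(attributes["k8s.statefulset.name"], resource.attributes["k8s.statefulset.name"])
```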

I guess this is because the metadata is not yet populated in the processor's local in-memory cache. I also don't know whether this is easy to solve, as the processor presumably does not want to hold signals until the k8s API has answered.
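If holding signals at startup were acceptable, more recent releases of the k8sattributes processor appear to document a `wait_for_metadata` option that delays collector startup until the metadata caches have synced. Availability depends on the collector version, so treat the following as a sketch to verify against the processor README:

```yaml
# Sketch, assuming a collector version whose k8sattributes processor supports
# wait_for_metadata (check the processor README for your release).
processors:
  k8sattributes:
    # Block startup until the k8s metadata informers have synced, so early
    # signals are not emitted without workload attributes.
    wait_for_metadata: true
    wait_for_metadata_timeout: 10s
```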

Nonetheless, it would be nice to have a statement on whether this is a known limitation or a bug that will be fixed. As of now, processor/k8sattributes is unusable for workload attributes that are mapped into data point attributes.
