
chore: update OpenTelemetry Collector to 0.94.0-sumo-2 #3576

Closed
kasia-kujawa wants to merge 3 commits into main from kkujawa-update-otelcol

Conversation

kasia-kujawa
Contributor

chore: update OpenTelemetry Collector to 0.94.0-sumo-2

Checklist

  • Changelog updated or skip changelog label added
  • Documentation updated
  • Template tests added for new features
  • Integration tests added or modified for major features

Base automatically changed from kkujawa-remove-clear-log-timestamp to main February 23, 2024 14:26
@kasia-kujawa kasia-kujawa force-pushed the kkujawa-update-otelcol branch from 70557c6 to 9ad9183 Compare February 23, 2024 14:56
@kasia-kujawa kasia-kujawa marked this pull request as ready for review February 23, 2024 14:57
@kasia-kujawa kasia-kujawa requested a review from a team as a code owner February 23, 2024 14:57
@kasia-kujawa kasia-kujawa force-pushed the kkujawa-update-otelcol branch from 9ad9183 to fab9332 Compare February 27, 2024 14:52
@kasia-kujawa kasia-kujawa marked this pull request as draft March 5, 2024 10:37
@kasia-kujawa kasia-kujawa force-pushed the kkujawa-update-otelcol branch from fab9332 to 802a1be Compare March 7, 2024 11:42
@kasia-kujawa kasia-kujawa force-pushed the kkujawa-update-otelcol branch from 802a1be to 5345718 Compare March 7, 2024 11:56
@swiatekm

swiatekm commented Mar 8, 2024

Metrics E2E tests fail because we have some new metrics. We should probably drop the buckets and keep the rest. Maybe it's fine to keep them all? It depends on how many time series this adds in practice per Pod.
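
If we go with dropping just the bucket series and keeping the `_sum`/`_count` series, one option is a scrape-time drop rule. This is a minimal sketch, assuming the collector's internal metrics are scraped with a Prometheus-style scrape config; the job name is hypothetical and not taken from this chart:

```yaml
scrape_configs:
  - job_name: otelcol-internal-metrics   # hypothetical job name
    metric_relabel_configs:
      # Drop only the new histogram bucket series; _sum and _count are kept.
      - source_labels: [__name__]
        regex: otelcol_http_client_duration_bucket
        action: drop
```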

Logs E2E tests fail due to open-telemetry/opentelemetry-collector-contrib#30797 changing the behaviour of the recombine operator. This is more serious and can't easily be worked around.
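
For context, the recombine operator is the stanza/filelog operator that merges multi-line log entries, and the contrib change referenced above alters its behaviour. A typical configuration looks roughly like this (an illustrative sketch with field names from the contrib docs, not the exact operator config used in this chart):

```yaml
receivers:
  filelog:
    operators:
      # Merge continuation lines (lines starting with whitespace) into the
      # previous entry, grouping lines per source file.
      - type: recombine
        combine_field: body
        is_first_entry: 'body matches "^\S"'
        source_identifier: attributes["log.file.path"]
```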

@kasia-kujawa
Contributor Author

A note about the new metrics exposed by otelcol, as reported by the tests:

Test_Helm_Default_OT_NamespaceOverride/metrics/expected_metrics_are_present 2024-03-11T09:57:43Z retry.go:103
WaitUntilExpectedMetricsPresent() returned an error: found the following unexpected metrics: 
[otelcol_http_client_response_size
 otelcol_http_client_duration_count 
 otelcol_http_server_response_size 
 otelcol_http_client_request_size 
 otelcol_http_client_duration_bucket 
 otelcol_http_server_request_size 
 otelcol_http_client_duration_sum]. 
Sleeping for 3s and will try again.

@kasia-kujawa
Contributor Author

The otelcol_http_client* metrics are exposed by both the logs and metrics collectors; in the default configuration I see an additional 20 time series for these metrics:

  • otelcol_http_client_response_size
  • otelcol_http_client_request_size
  • otelcol_http_client_duration_count
  • otelcol_http_client_duration_bucket
  • otelcol_http_client_duration_sum
[Screenshots: 2024-03-11 at 15:16:52, 15:17:12, and 15:18:31]

The new otelcol_http_server* metrics are exposed by otelcol logs and otelcol metrics:

  • otelcol_http_server_request_size
  • otelcol_http_server_response_size
[Screenshot: 2024-03-11 at 15:31:53]

@swiatekm-sumo in the default configuration the otelcol_http_server_duration_bucket metric is collected (and accepted in the tests), so why not collect otelcol_http_client_duration_bucket as well? 🤔
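
If we end up deciding not to keep the new client metrics at all, they could also be dropped inside the pipeline rather than at scrape time. A rough sketch using the contrib filter processor; the processor name is illustrative, and where these metrics actually flow depends on how the collector's own telemetry is collected, so the processor would still need to be wired into the appropriate metrics pipeline:

```yaml
processors:
  filter/drop-otelcol-http-client:
    metrics:
      exclude:
        match_type: regexp
        metric_names:
          # Drops everything matching the otelcol_http_client* prefix.
          - otelcol_http_client_.*
```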


@github-actions github-actions bot left a comment


This pull request contains invalid labels. Please remove all of the following labels: ['do-not-merge/hold']
