
[receiver/prometheus] Summary metrics without quantiles cause scrape errors #23995

Closed
swiatekm opened this issue Jul 5, 2023 · 4 comments
Labels
bug (Something isn't working), closed as inactive, receiver/prometheus (Prometheus receiver), Stale

Comments

swiatekm (Contributor) commented Jul 5, 2023

Component(s)

receiver/prometheus

What happened?

Description

Scraping a Summary type metric without a quantile label causes the Prometheus receiver to error and fail the whole scrape.
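For reference, a summary in this shape looks like the following in the Prometheus text exposition format (the metric name is illustrative, not taken from the issue). Note that only the `_sum` and `_count` series are present, with no `{quantile="..."}` samples:

```
# HELP http_request_duration_seconds Request latency.
# TYPE http_request_duration_seconds summary
http_request_duration_seconds_sum 128.5
http_request_duration_seconds_count 42
```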

Expected Result

The metric should be ingested as-is.

Collector version

0.80.0

Additional context

Originally discussed in #22070

@swiatekm swiatekm added bug Something isn't working needs triage New item requiring triage labels Jul 5, 2023
@github-actions github-actions bot added the receiver/prometheus Prometheus receiver label Jul 5, 2023
github-actions bot commented Jul 5, 2023

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@swiatekm swiatekm changed the title from "[receiver/prometheus] Histograms without buckets are dropped" to "[receiver/prometheus] Summary metrics without quantiles cause scrape errors" Jul 5, 2023
@dashpole dashpole removed the needs triage New item requiring triage label Jul 26, 2023
swiatekm (Contributor, Author) commented Aug 1, 2023

@dashpole I think we originally had a misunderstanding about this: the problem for me was not just the _sum and _count parts, but all of the samples. That is clearly against the OTel spec, so the receiver is correct to reject the metric. #24030 helps in that the valid samples in a given scrape can still be collected, but this still makes migrating from Prometheus to OTel a bit more difficult.
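As a hypothetical sketch (not the receiver's actual code), detecting summary families that carry no quantile samples in a text exposition could look like this:

```python
def summaries_without_quantiles(exposition: str) -> list[str]:
    """Return names of summary metric families with no quantile samples.

    A minimal sketch over the Prometheus text exposition format; it is not
    a full parser (it ignores HELP lines, escaping, and timestamps).
    """
    types: dict[str, str] = {}
    has_quantile: dict[str, bool] = {}
    for line in exposition.splitlines():
        line = line.strip()
        if line.startswith("# TYPE"):
            # e.g. '# TYPE http_request_duration_seconds summary'
            _, _, name, mtype = line.split(maxsplit=3)
            types[name] = mtype
            has_quantile.setdefault(name, False)
        elif line and not line.startswith("#"):
            # Strip labels and value to get the sample name, then map
            # _sum/_count back to the family name.
            metric = line.split("{", 1)[0].split()[0]
            base = metric
            for suffix in ("_sum", "_count"):
                if base.endswith(suffix):
                    base = base[: -len(suffix)]
            if 'quantile="' in line:
                has_quantile[base] = True
    return [
        name for name, mtype in types.items()
        if mtype == "summary" and not has_quantile.get(name, False)
    ]
```

With a valid summary (quantiles present) and one exposing only `_sum`/`_count`, only the latter is flagged.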

If we want to consider trying to accept metrics which aren't up to spec (perhaps as type Unknown?), then we can discuss that in a separate issue and close this one. Is that ok with you?

github-actions bot commented Oct 2, 2023

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions github-actions bot added the Stale label Oct 2, 2023
github-actions bot commented Dec 1, 2023

This issue has been closed as inactive because it has been stale for 120 days with no activity.

@github-actions github-actions bot closed this as not planned (won't fix, can't repro, duplicate, stale) Dec 1, 2023