
Make the batch processor limit data points rather than metrics. #3141

Merged

Conversation

kisieland
Contributor

Description:
Changes batch processor logic to limit data points rather than metrics.

Link to tracking Issue: #2754

Testing:
Ran `go test ./...`.
I've added some edge-case tests and altered the existing tests.
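The core of the change can be sketched as follows. This is a minimal, self-contained illustration, not the actual implementation: the real batch processor splits `pdata.Metrics`, and the `Metric` type and `splitByDataPoints` function below are hypothetical stand-ins showing only the counting logic (the limit applies to total data points, and a metric is split across batches when it exceeds the remaining capacity):

```go
package main

import "fmt"

// Metric is a simplified, hypothetical stand-in for pdata's metric type.
type Metric struct {
	Name       string
	DataPoints []float64
}

// splitByDataPoints splits metrics into batches holding at most limit
// data points each, counting points rather than metrics. A metric with
// more points than the remaining capacity is split across batches.
func splitByDataPoints(metrics []Metric, limit int) [][]Metric {
	var batches [][]Metric
	var cur []Metric
	remaining := limit
	for _, m := range metrics {
		points := m.DataPoints
		for len(points) > 0 {
			n := len(points)
			if n > remaining {
				n = remaining
			}
			// Copy up to n points into the current batch.
			cur = append(cur, Metric{Name: m.Name, DataPoints: points[:n]})
			points = points[n:]
			remaining -= n
			if remaining == 0 {
				batches = append(batches, cur)
				cur = nil
				remaining = limit
			}
		}
	}
	if len(cur) > 0 {
		batches = append(batches, cur)
	}
	return batches
}

func main() {
	ms := []Metric{
		{Name: "a", DataPoints: []float64{1, 2, 3}},
		{Name: "b", DataPoints: []float64{4, 5}},
	}
	// Two metrics but five data points, so a limit of 2 yields three batches.
	for i, b := range splitByDataPoints(ms, 2) {
		fmt.Printf("batch %d: %v\n", i, b)
	}
}
```

Under the old metric-count limit, the two metrics above would have fit in one batch of 2; counting data points gives batches whose size actually reflects the payload.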

@kisieland kisieland requested a review from a team May 11, 2021 15:19
@kisieland
Contributor Author

@dashpole @bogdandrutu can you take a look?

@dashpole
Contributor

This was approved during the 3/24 sig meeting: #2754 (comment)

@dashpole
Contributor

I think this needs a release note in the CHANGELOG.md under breaking changes

@kisieland
Contributor Author

kisieland commented May 11, 2021

> I think this needs a release note in the CHANGELOG.md under breaking changes

Done.
I guess my editor removed trailing spaces from other lines.

CHANGELOG.md (outdated, resolved)
consumer/pdata/metric.go (outdated, resolved)
processor/batchprocessor/README.md (outdated, resolved)
@bogdandrutu
Member

@dashpole please help us ensure we don't expose unnecessary public APIs if possible; any public API will cost us in the future.

@kisieland kisieland force-pushed the change-batch-processing branch 2 times, most recently from 88ffead to 3f4a67b Compare May 12, 2021 07:50
@tigrannajaryan
Member

Is there any measurable performance impact due to this change?

@kisieland
Contributor Author

kisieland commented May 13, 2021

@tigrannajaryan I've added a new version of the benchmark. It looks like there is about a 20% increase in the time it takes to process the data. But since the batch processor does not take a significant share of the pipeline's time, we can live with this, WDYT?
(Feel free to re-run and verify it)
On main branch I got:

$ GOMAXPROCS=12 go test -bench=BenchmarkBatchMetricProcessor -benchtime=20000x
goos: linux
goarch: amd64
pkg: go.opentelemetry.io/collector/processor/batchprocessor
cpu: Intel(R) Xeon(R) CPU @ 2.30GHz
BenchmarkBatchMetricProcessor-12           20000             78924 ns/op
PASS
ok      go.opentelemetry.io/collector/processor/batchprocessor  22.330s

On change-batch-processing I've got:

$ GOMAXPROCS=12 go test -bench=BenchmarkBatchMetricProcessor -benchtime=20000x
goos: linux
goarch: amd64
pkg: go.opentelemetry.io/collector/processor/batchprocessor
cpu: Intel(R) Xeon(R) CPU @ 2.30GHz
BenchmarkBatchMetricProcessor-12           20000             94004 ns/op
PASS
ok      go.opentelemetry.io/collector/processor/batchprocessor  22.502s

NOTE: This comment was edited after I updated the benchmark to exclude data initialisation. With data initialisation included there was no significant difference.

@dashpole
Contributor

@jrcamp @Aneurysm9 It would be helpful to have your reviews as well, if you have time.

CHANGELOG.md (resolved)
processor/batchprocessor/README.md (outdated, resolved)
@kisieland kisieland requested a review from Aneurysm9 May 20, 2021 10:16
@kisieland
Contributor Author

@bogdandrutu @Aneurysm9 please take a look

CHANGELOG.md (resolved)
processor/batchprocessor/README.md (outdated, resolved)
@tigrannajaryan
Member

Please resolve the merge conflicts.

@kisieland
Contributor Author

Merge conflicts resolved.
@tigrannajaryan @bogdandrutu please take a look; I would appreciate it if this PR were included in the next release.


@tigrannajaryan tigrannajaryan left a comment


LGTM, thanks @kisieland

@tigrannajaryan tigrannajaryan dismissed bogdandrutu’s stale review May 31, 2021 15:17

All requested changes addressed

@tigrannajaryan
Member

@bogdandrutu PTAL, I believe all your comments are addressed and we can merge.

@tigrannajaryan
Member

@kisieland please squash the commits and add a descriptive commit message. Please also include the results of before/after benchmarking in the commit message.

@kisieland
Contributor Author

@tigrannajaryan Done

@kisieland
Contributor Author

@bogdandrutu PTAL

…f metrics.

This change introduces a BenchmarkBatchMetricProcessor that stress tests batching logic.
Results before:
`BenchmarkBatchMetricProcessor-12           20000             80614 ns/op`
Results after the change:
`BenchmarkBatchMetricProcessor-12           20000             96184 ns/op`
@tigrannajaryan
Member

@kisieland please fix the changelog conflict one more time (sorry).

@kisieland
Contributor Author

@tigrannajaryan I don't see any conflicts; I resolved them twice today.


@bogdandrutu bogdandrutu left a comment


Small optimization: avoid creating a new metric and then copying it to the destination; instead, copy the metadata and points directly into the destination metric.

processor/batchprocessor/splitmetrics.go (outdated, resolved)
@kisieland
Contributor Author

@bogdandrutu Please merge if everything is ok (merging is blocked for me)

@tigrannajaryan tigrannajaryan merged commit 6d44f0d into open-telemetry:main Jun 2, 2021
@tigrannajaryan
Member

@kisieland Thank you for the contribution and for the patience.

@kisieland kisieland deleted the change-batch-processing branch June 9, 2021 10:44