I'm working on Advent of Code 2020, Day 15, part 2, and trying to benchmark solutions that run in the one-second range. A single run takes:
runner: 1.159830989s
With the default Criterion config, that produces an estimate of:
Benchmarking Day15 - Part2/alt: Collecting 100 samples in estimated 6081.4 s (5050 iterations)
That is far too long to wait. Ideally, we would be able to annotate the individual benchmark to override the default Criterion config.
I'm able to work around this issue by editing target/aoc/aoc-autobench/benches/aoc_benchmark.rs and changing:
criterion_group!(benches, aoc_benchmark);
into
criterion_group! {
    name = benches;
    config = Criterion::default().significance_level(0.05).sample_size(15);
    targets = aoc_benchmark
}
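As an alternative to overriding the whole group's config, Criterion also lets each benchmark function tune its own settings through `Criterion::benchmark_group`. The sketch below shows roughly what the generated `aoc_benchmark` function could look like with a per-group override; the `part2()` call is a hypothetical stand-in for the actual Day 15 solution, and the exact numbers are only illustrative.

```rust
use std::time::Duration;

use criterion::{criterion_group, criterion_main, Criterion};

// Hypothetical stand-in for the real Day 15, part 2 solver.
fn part2() -> u64 {
    0
}

fn aoc_benchmark(c: &mut Criterion) {
    let mut group = c.benchmark_group("Day15 - Part2");
    // Fewer samples and a shorter measurement window, so a ~1 s
    // benchmark doesn't take thousands of seconds to collect.
    group.sample_size(15);
    group.measurement_time(Duration::from_secs(30));
    group.significance_level(0.05);
    group.bench_function("alt", |b| b.iter(part2));
    group.finish();
}

criterion_group!(benches, aoc_benchmark);
criterion_main!(benches);
```

This keeps the default `criterion_group!` invocation intact and scopes the override to the one slow benchmark, which is closer to the per-test annotation the issue asks for.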