
Significant performance regression on the encoding benchmark #69197

Closed
ecstatic-morse opened this issue Feb 15, 2020 · 12 comments
Labels
I-compiletime — Issue: Problems and improvements with respect to compile times.
P-high — High priority
regression-from-stable-to-nightly — Performance or correctness regression from stable to nightly.
T-compiler — Relevant to the compiler team, which will review and decide on the PR/issue.

Comments

ecstatic-morse (Contributor) commented Feb 15, 2020

#69144 (a rollup) slowed down check builds of the encoding benchmark considerably; see the task-clock measurements. The degradation has been consistent over the last half-dozen perf runs, so it is not spurious. The following candidates have non-trivial changes to the parts of the compiler that run during a check build.

I doubt it is #68938 or #69126, so cc @Centril @cjgillot.

jonas-schievink added the I-compiletime, regression-from-stable-to-nightly, and T-compiler labels Feb 15, 2020
Centril (Contributor) commented Feb 15, 2020

#69057 is mostly a pure refactoring, so it seems unlikely?

Centril (Contributor) commented Feb 15, 2020

Besides changes in https://github.com/rust-lang/rust/pull/68728/files#diff-f5c049612f32b4eda9adb5019ab10a83, nothing pops out in the parsing PR either.

bors added a commit that referenced this issue Feb 15, 2020
Revert #69108

... to see if it caused the regression in #69197.

@bors try
@rust-timer queue
ecstatic-morse (Contributor, Author) commented:

Ruled out #69108 via the perf results from #69199, so it's likely the big one (#68728). cc @Centril @petrochenkov.

ecstatic-morse (Contributor, Author) commented Feb 17, 2020

Some perf builds around the regression were stuck in the queue when I posted this, so it may not be #69144. The regression occurs in early_lint_checks.

pnkfelix (Member) commented Feb 20, 2020

Nominating for discussion at the T-compiler meeting for prioritization.

pnkfelix (Member) commented:

Discussed at the T-compiler meeting and marked P-high. Removing the nomination.

pnkfelix added the P-high label and removed the I-nominated label Feb 20, 2020
ecstatic-morse (Contributor, Author) commented Feb 26, 2020

A significant portion of the runtime for check builds of encoding is spent doing system calls (around 20% on the latest nightly). I believe this is because error messages are written directly to io::Stderr, which does not buffer; the perf results in #69227 support this. Although there's a lot of variance, older nightlies appear to spend less time doing system calls than current ones, which suggests that changes to diagnostics or diagnostic reporting could be the root cause of this issue.

Roughly the same number of errors are reported in old (end of 2019) nightlies as in recent ones (~1900 lines written to stderr). However, the earlier nightlies use many more calls to write(2, ...) to accomplish this (3500 vs 720). Given this, I would have expected the earlier nightlies to be slower than the current ones on the encoding benchmark, not faster. Perhaps someone else is more knowledgeable.
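
A minimal sketch, purely for illustration and assuming the hypothesis above: buffering stderr coalesces many small diagnostic writes into far fewer write(2) calls. The loop count and message text below are stand-ins, and this is not claimed to be the actual change made in #69227.

```rust
use std::io::{self, BufWriter, Write};

fn main() -> io::Result<()> {
    let stderr = io::stderr();

    // Writing line-by-line to the raw, unbuffered handle issues roughly one
    // write(2) per line. Wrapping the locked handle in a BufWriter coalesces
    // the ~1900 lines into a handful of larger writes.
    let mut out = BufWriter::new(stderr.lock());
    for i in 0..1900 {
        // Hypothetical stand-in for the diagnostics the benchmark emits.
        writeln!(out, "error: example diagnostic #{}", i)?;
    }
    out.flush()?; // one final flush instead of a syscall per line
    Ok(())
}
```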

ecstatic-morse (Contributor, Author) commented:

After #69227, the task-clock measurements for encoding-check are back to normal. I think we should close this, although I wish I had a better idea of when and why the regression occurred.

mati865 (Contributor) commented Mar 2, 2020

ecstatic-morse (Contributor, Author) commented:

@mati865 How would that PR cause regressions in check builds? And why would the regression appear in only one crate?

mati865 (Contributor) commented Mar 3, 2020

No idea, that's what perf shows 🤷‍♂

pnkfelix (Member) commented Mar 5, 2020

Closing as resolved by PR #69227.

pnkfelix closed this as completed Mar 5, 2020