vrl-parser is a dependency of vrl-core and takes 130 seconds to build in a 150 second build #9547
Out of an abundance of curiosity I applied this diff to the project:
and find that the generated …

This makes me curious: what are those lines doing?

A good deal of them seem to be lookup tables, massive match statements, or lookup structures of some other sort.

One thing that has crept in over time is that we added more …

I'll assign myself to this issue so that I can work on it somewhere in the next couple of weeks.
Here's an example of how we can solve this (or at least how we can shrink the generated code; I'm unsure how significant the difference will be, but it is listed as a possible solution to code-generation bloat):

```lalrpop
pub Top: Top = {
    "StartGrammar" <Grammar> => Top::Grammar(<>),
    "StartPattern" <Pattern> => Top::Pattern(<>),
    "StartMatchMapping" <MatchMapping> => Top::MatchMapping(<>),
    "StartTypeRef" <TypeRef> => Top::TypeRef(<>),
    "StartGrammarWhereClauses" <GrammarWhereClauses> => Top::GrammarWhereClauses(<>),
};
```
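The grammar above collapses several public entry points into one `Top` nonterminal, so a synthetic "StartX" token selects the sub-grammar and the generator emits one set of parse tables instead of several largely duplicated ones. A minimal, self-contained Rust sketch of that dispatch idea (toy types and names, not the real vrl-parser API or LALRPOP's actual output):

```rust
// Toy illustration of the single-entry-point technique: one parse
// function dispatches on a synthetic start token, instead of exposing
// a separate public parser (with its own tables) per nonterminal.
// All names here are illustrative, not taken from vrl-parser.

#[derive(Debug, PartialEq)]
enum Top {
    Grammar(String),
    Pattern(String),
}

fn parse_top(input: &str) -> Option<Top> {
    // The leading "StartX" token picks which sub-grammar applies.
    if let Some(rest) = input.strip_prefix("StartGrammar ") {
        Some(Top::Grammar(rest.to_string()))
    } else if let Some(rest) = input.strip_prefix("StartPattern ") {
        Some(Top::Pattern(rest.to_string()))
    } else {
        None
    }
}

fn main() {
    // A single entry point handles every "StartX"-prefixed input.
    assert_eq!(
        parse_top("StartPattern a+b"),
        Some(Top::Pattern("a+b".to_string()))
    );
    assert!(parse_top("no start token").is_none());
}
```

The caller pays a one-token cost per parse, but the generator only has to build and ship one state machine.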
Also note that I already applied this technique for our testing infrastructure: `vector/lib/vrl/parser/src/parser.lalrpop`, lines 99 to 131 at 9774570.

We basically have to do this for all of our …
Also, just to make sure we're on the same page: this should only happen on the first build. The parser shouldn't be re-generated on incremental builds as long as the grammar isn't updated, although I realize this isn't much of a solution when you want a fresh build (or when you have no choice, such as in some CI set-ups).
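The reason regeneration is limited to grammar changes is Cargo's build-script re-run rules. A sketch of how LALRPOP's build-script integration typically looks (the grammar path is an assumption about the crate layout, not copied from the repo):

```rust
// Hypothetical build.rs sketch. Cargo re-runs this script, and thus
// parser generation, only when the listed grammar file changes, so
// incremental builds that don't touch the grammar skip the expensive
// generation step entirely.

/// The Cargo directive that scopes build-script re-runs to the
/// grammar file alone (path is an assumption for illustration).
fn rerun_directive(grammar: &str) -> String {
    format!("cargo:rerun-if-changed={}", grammar)
}

fn main() {
    println!("{}", rerun_directive("src/parser.lalrpop"));
    // In the real crate, LALRPOP would be invoked here, e.g.:
    // lalrpop::process_root().unwrap();
}
```

Fresh builds, CI without a warm `target/` directory, and feature-flag changes all invalidate this, which is the scenario discussed below.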
That's true, but in practice we hit conditions that make rebuilds likely, primarily because of the dependency relationships in the project, I think. As you say, clean builds and CI builds often rebuild vrl-parser, but so does any build that changes feature flags. It's typical for core work to be done with a small set of flags on top of no-default-features to shave overall build time, and the soak-test revamp will follow the same pattern.

I'm hopeful that with caching we can get some incremental builds going across PRs, but I expect differing feature-flag sets will make that unlikely; hence the concern over dependencies and build times I've raised recently.

I appreciate you both looking at this. Getting the parser build time shaved down would be a big help.
> On Tue, Oct 12, 2021, at 03:32, Jean Mertz wrote:
> Also, just to make sure we're on the same page; this should only happen on the *first build*, it shouldn't re-generate the parser on incremental builds as long as the grammar isn't updated, although I realize this isn't much of a solution for situations when you *want* to do a fresh build (or when you have no choice, such as in some CI set-ups).
This commit introduces a matrixed workflow to build images for soak tests. It is not dynamic but does function. Build times are on the order of 10 minutes per soak and we might want to consider whether caching between soaks can be improved, though they all have different feature flags in play. Improving #9547 would also help quite a bit here.

Closes #9531

Signed-off-by: Brian L. Troutwine <brian@troutwine.us>
These are both VRL dependencies, which are no longer in this repo. The compilation time for VRL has also improved since then, so I'm going to close this.
While working on #9531 I've found several opportunities to either remove dependencies or shuffle them around in the build order to improve build speed. See that ticket for details. When building Vector with

```shell
cargo +nightly build -Z timings --no-default-features
```

from 89e2b08, I find that the total build time on my system is 150 seconds, and 130 seconds of that is `vrl-parser`. I haven't investigated why this crate is slow to build -- though lalrpop/lalrpop#65 and lalrpop/lalrpop#260 probably give an indication -- but it dominates build time.

Top-level vector requires `vector-core/vrl`, which in turn depends on `vrl-core`. The dependency tree of vrl has a diamond present in it.

Reordering VRL internals to remove this diamond, or flagging out pieces that are not necessary unless calling code is actually parsing VRL, would substantially improve our build speed. We might also avoid whatever makes the parser so slow to build, but that's probably a larger project considering the open tickets linked above.