Slow compilation and segfault with recursion on highly nested struct #30744
Wow, this seems to involve a type so large we basically can't (or shouldn't) generate specialized code for it. Is this supposed to be a pathological case, or is it fairly normal transducer code?
This is a useful example that I hope I can support. But indeed I'm a bit lazy in Transducers.jl internals, not trying hard enough to make the struct shallower 😅 (I'm planning on doing that). I tried using a flat tuple instead of a "linked list"-style nested struct, but it seemed that inference gave up on determining a concrete type when I was constantly re-creating tuples. Actually, I have a question: is it better to use the "flat" style?
Just to clarify a bit more, a call chain (top to bottom) of the flat style would be:

next(::Nodes{Tuple{Node1{T1}, Node2{T2}, Node3{T3}}}, ...)
next(::Nodes{Tuple{Node2{T2}, Node3{T3}}}, ...)
next(::Nodes{Tuple{Node3{T3}}}, ...)

i.e., each call receives the tail of the previous tuple, so the wrapper struct has to be re-constructed at every step. On the other hand, a call chain of the linked-list style would be:

next(::Node1{T1, Node2{T2, Node3{T3, Nothing}}}, ...)
next(::Node2{T2, Node3{T3, Nothing}}, ...)
next(::Node3{T3, Nothing}, ...)

It's nicer since I don't have to re-construct the structs. But I worry that it may not be compiler-friendly.
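The two styles being compared can be sketched as follows. This is only an illustration with made-up names (`LNode`, `FNodes`, `lnext`, `fnext`), not Transducers.jl's actual internals:

```julia
# Linked-list style: each node carries its tail as a type parameter,
# so composing N nodes yields a type nested N levels deep.
struct LNode{T, Inner}
    x::T
    inner::Inner
end

lnext(n::LNode, acc) = lnext(n.inner, acc + n.x)
lnext(::Nothing, acc) = acc

# Flat style: one wrapper around a tuple; each call peels off the
# head and re-wraps the tail, so the tuple type shrinks by one.
struct FNodes{T<:Tuple}
    xs::T
end

fnext(n::FNodes, acc) =
    isempty(n.xs) ? acc : fnext(FNodes(Base.tail(n.xs)), acc + first(n.xs))

linked = LNode(1, LNode(2, LNode(3, nothing)))
flat = FNodes((1, 2, 3))

lnext(linked, 0)  # 6
fnext(flat, 0)    # 6
```

Both traversals do the same work; the difference is only in how the composition is encoded in the type, which is what the inference/specialization discussion below is about.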
The flat style is probably better. We have moved to that in Base.Iterators, for example, for Zip and Product.
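The tuple-recursion idiom that makes the flat style inference-friendly is to peel one element off per call with `Base.tail`, so each recursive call sees a strictly smaller concrete tuple type (a hedged sketch; `tmap` is a made-up name, not a Base function):

```julia
# Recursively apply `f` across a heterogeneous tuple. Each recursive
# call operates on a smaller concrete Tuple type, so inference can
# unroll the whole recursion at compile time.
tmap(f, t::Tuple{}) = ()
tmap(f, t::Tuple) = (f(first(t)), tmap(f, Base.tail(t))...)

tmap(x -> x + 1, (1, 2.0, 3))  # (2, 3.0, 4)
```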
Thanks. Hmm... So I guess I have to try the flat style again at some point. But it looked to me like inference in the linked-list style was much better than in the flat one. Or maybe the compiler was "forced" to continue inference because there is no limit? Or maybe my code simply had performance bugs... Anyway, I suppose the plan for this issue is to keep it open until there is some inference/inlining limit. (So I'm not touching the close button ATM.)
It's quite possible there are things we can improve in inference to make the "flat" approach work better. As always, the answer is to file more issues :)
I can do that 👍 (But, to be honest, I would personally prefer it if Julia handled recursive data structures as well as tuples.)
Continuing from #30125 (comment)

@JeffBezanson Thanks a lot for looking into this. Actually, it turned out the script eventually segfaults if I wait long enough (3 to 5 minutes, at least on some machines; I'll try to find the difference in setup). This should reproduce the issue:

If I run julia with --compile=min, or more precisely with

julia --startup-file=no --compiled-modules=no --compile=min -e 'using Transducers; include(joinpath(dirname(dirname(pathof(Transducers))), "examples", "tutorial_missings.jl"))'

the output is something like [...], but without --compile=min the script should hang after printing ans = (2, 3.0).

Only the last section of the script is needed to demonstrate the slow compilation, so a smaller example than tutorial_missings.jl would be: [...]

When it segfaults, the output is something like: [...]
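The actual reduced example did not survive in this excerpt. As a rough illustration of the shape of the problem only (hypothetical `Wrap` type, not Transducers.jl code): each level of composition adds one layer of nesting to the type, so the method signature the compiler must specialize on grows with the nesting depth.

```julia
# A parametric struct whose type records its full nesting:
# nest(3, 42) has type Wrap{Wrap{Wrap{Int}}}.
struct Wrap{Inner}
    inner::Inner
end

nest(n, x) = n == 0 ? x : nest(n - 1, Wrap(x))

unwrap(w::Wrap) = unwrap(w.inner)
unwrap(x) = x

w = nest(50, 42)   # value with a 50-level-deep type
unwrap(w)          # 42; compiling `unwrap` specializes on that deep type
```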