problem with the unreliable approximation of Core.Compiler.return_type #35800

Open
stev47 opened this issue May 8, 2020 · 19 comments
Assignees: aviatesk
Labels: bug (Indicates an unexpected problem or unintended behavior), compiler:inference (Type inference)

Comments
@stev47 (Contributor) commented May 8, 2020

Using Julia master (newer than 301db97), the following does not infer the correct type (see also the discussion in #34048):

using Test, LinearAlgebra
# @inferred mapreduce(norm, +, [rand(1)]);
@inferred mapreduce(norm, +, [rand(1)]; init = 0.);

Note that the third line infers correctly when the second line is left commented out (starting from a fresh Julia session), but fails once the second line is included, which makes this issue difficult to test.
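One way to poke at this without @inferred throwing is to ask inference directly (a small sketch; the check wrapper below is a hypothetical helper, not part of the original report):

using LinearAlgebra

# Hypothetical wrapper so the keyword call is inferred as a whole:
check(v) = mapreduce(norm, +, v; init = 0.)
Base.return_types(check, (Vector{Vector{Float64}},))  # an imprecise result (e.g. Any) reproduces the problem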

@tkf added the compiler:inference label May 8, 2020
@mbauman (Member) commented May 8, 2020

Possibly related to #35813?

@JeffBezanson (Member)

Reduced:

julia> using LinearAlgebra

julia> X = [randn(n,n) for n = 1:5];

julia> @code_typed Base.mapfoldl(norm, +, X, init=0.)
CodeInfo(
1 ─ %1 = invoke Base.foldl_impl($(QuoteNode(Base.MappingRF{typeof(norm),Base.BottomRF{typeof(+)}}(LinearAlgebra.norm, Base.BottomRF{typeof(+)}(+))))::Base.MappingRF{typeof(norm),Base.BottomRF{typeof(+)}}, _2::NamedTuple{(:init,),Tuple{Float64}}, _6::Array{Array{Float64,2},1})::Any
└──      return %1
) => Any

julia> code_typed(Base.foldl_impl, (Base.MappingRF{typeof(norm),Base.BottomRF{typeof(+)}}, NamedTuple{(:init,),Tuple{Float64}}, Array{Array{Float64,2},1}))
1-element Array{Any,1}:
 CodeInfo(
1 ─ %1 = Base.getfield(nt, :init)::Float64
│   %2 = invoke Base._foldl_impl(_2::Base.MappingRF{typeof(norm),Base.BottomRF{typeof(+)}}, %1::Float64, _4::Array{Array{Float64,2},1})::Float64
└──      goto #3 if not false
2 ─      nothing::Nothing
3 ┄      return %2
) => Float64

So mapfoldl infers as Any, but the method it calls, when inferred on its own, comes back as Float64.

@JeffBezanson (Member)

> Possibly related to #35813?

I don't know what's going on yet, but I would guess not, since specialization is supposed to be orthogonal to inference; it only affects what we generate code for, not what we know about types.

@tkf (Member) commented May 8, 2020

I think a bigger problem is the context dependency of the @inferred result. Maybe some kind of cache problem? I thought it was considered a bug if type inference is not deterministic.

(I'm just mentioning it since, IIUC, Cthulhu.jl had a similar "bug" and at some point it was fixed.)

@JeffBezanson (Member)

Yes, everybody agrees this is a bug.

@tkf (Member) commented May 8, 2020

Thanks. I was commenting because your comment #35800 (comment) seems to focus on a single call of @code_typed rather than on @inferred mapreduce(norm, +, [rand(1)]); followed by @inferred mapreduce(norm, +, [rand(1)]; init = 0.);.

@tkf (Member) commented May 9, 2020

Another "context-dependent" behavior is reported in #35537.

@baggepinnen (Contributor)

Another very strange inference problem has popped up in JuliaSIMD/LoopVectorization.jl#114; it only appears on some architectures, and for Float32 but not Float64.

@chriselrod (Contributor) commented May 12, 2020

In case it helps, that inference failure was solved by replacing this:

@generated function vzero(::Type{Vec{W,T}}) where {W,T}
    typ = llvmtype(T)
    vtyp = "<$W x $typ>"
    instrs = """
    ret $vtyp zeroinitializer
    """
    quote
        $(Expr(:meta,:inline))
        Base.llvmcall($instrs, Vec{$W,$T}, Tuple{}, )
    end
end
@inline vzero(::Val{W}, ::Type{T}) where {W,T} = SVec(vzero(Vec{W,T}))

with

@generated function vzero(::Val{W}, ::Type{T}) where {W,T}
    typ = llvmtype(T)
    vtyp = "<$W x $typ>"
    instrs = """
    ret $vtyp zeroinitializer
    """
    quote
        $(Expr(:meta,:inline))
        SVec(Base.llvmcall($instrs, Vec{$W,$T}, Tuple{}, ))
    end
end

EDIT:
What would be easier on inference: generated functions like the above, or a bunch of different methods on concrete types (e.g., created in a loop with @eval, as sketched below)?
Would it help to have generated functions as fallbacks and make concrete versions for the most common use cases?
It seems like that may be better at avoiding the heuristics that make inference give up.
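A rough sketch of that @eval alternative (not from the original comment; vzero, Vec, SVec, and llvmtype are assumed to be the definitions shown above): emit ordinary concrete methods for the common width/type combinations ahead of time, and keep the @generated method as the fallback.

# Sketch only: Vec, SVec, and llvmtype come from the snippets above.
for W in (2, 4, 8, 16), T in (Float32, Float64)
    typ  = llvmtype(T)
    vtyp = "<$W x $typ>"
    instrs = "ret $vtyp zeroinitializer"
    @eval @inline vzero(::Val{$W}, ::Type{$T}) =
        SVec(Base.llvmcall($instrs, Vec{$W,$T}, Tuple{}, ))
end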

@JeffBezanson (Member)

A bit more information:

First this is called:

  foldl_impl(Base.MappingRF{typeof(LinearAlgebra.norm),
                            Base.BottomRF{typeof(Base.:(+))}}, Float64, Array{Array{Float64, 2}, 1})

which eventually calls norm, which calls

  foldl_impl(Base.MappingRF{getfield(Base, Symbol("#208#209")){getfield(Base, Symbol("#62#63")){typeof(Base.iszero)}},
                            Base.BottomRF{typeof(Base.add_sum)}}, Int64, Array{Float64, 2})

which appears to be recursion, so it's widened to

  foldl_impl(Base.MappingRF{F, T} where T where F, Int64, Array{Float64, 2})

which infers to Any. However, the un-widened signature does not lead to recursion (since the recursion is via norm and not just foldl_impl), so if you infer that on its own it's fine.
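As a standalone toy illustration of that widening (not the actual norm/foldl_impl call graph): a self-call whose argument type grows on each step trips the recursion heuristics, so the inferred return type is widened even though any single concrete call would infer fine.

# Toy example of recursion-driven widening: the argument type grows each call.
nest(x, n) = n <= 0 ? x : nest((x,), n - 1)
Base.return_types(nest, (Int, Int))  # typically widened, e.g. Any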

@JeffBezanson (Member)

The order dependence happens because if we have already inferred e.g. norm, then we just use the cached result instead of looking through and seeing the recursion.
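Tying that back to the original report (a sketch; each variant assumes a fresh session), which results happen to be cached already determines whether the keyword call passes @inferred:

using Test, LinearAlgebra

# Fresh session, variant A: the keyword call on its own infers precisely.
@inferred mapreduce(norm, +, [rand(1)]; init = 0.)

# Fresh session, variant B: infer the non-keyword call first (the commented-out
# line from the original report), and the keyword call no longer passes
# @inferred, plausibly because a widened result cached along the way is reused.
#   @inferred mapreduce(norm, +, [rand(1)])
#   @inferred mapreduce(norm, +, [rand(1)]; init = 0.)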

@JeffBezanson added the bug label Aug 18, 2020
JeffBezanson added a commit that referenced this issue Aug 18, 2020
@KristofferC mentioned this issue Aug 19, 2020
KristofferC pushed a commit that referenced this issue Aug 19, 2020
@JeffBezanson (Member)

This is a fairly deep issue, and there is no way to promise fixing it within the 1.6 timeframe.

andreasnoack added a commit to andreasnoack/SciMLBase.jl that referenced this issue Mar 31, 2021
(commit message excerpt: "JuliaLang/julia#35800 … which can cause very slow compile times in Julia 1.6.")
@timholy (Member) commented Jan 30, 2022

This seems to work properly on current master. If others agree, perhaps we can close.

@JeffBezanson (Member)

Well, if we're "lucky" it might work, but the underlying issue is not fixed 😄

@aviatesk self-assigned this Apr 13, 2022
@aviatesk (Member) commented Apr 13, 2022

The root problem here is the unreliability of the Core.Compiler.return_type approximation.
When inferring Core.Compiler.return_type, we need to avoid inference cycles and may therefore end up with a looser return type. Nevertheless, inference may derive more precise information later (e.g. when inference runs again on the same call graph after the previous cache has been invalidated) if an actual runtime execution has populated a cache for a previously poorly-inferred call site, since the new Core.Compiler.return_type query can then use the more precise information kept in that cache, derived by dynamic execution.

For those who are interested in this issue, this reproducer (adapted from #35537) is a nice one to inspect:

julia> x = [ones(3) for _ in 1:2];

julia> s1(x) = x ./ sum(x);

julia> testf(x) = s1.(x);

julia> Base.return_types(testf, (typeof(x),)) # poor inference
1-element Vector{Any}:
 Any

julia> method = only(methods(Base.Broadcast.combine_eltypes))
combine_eltypes(f, args::Tuple) in Base.Broadcast at broadcast.jl:717

julia> mi = Core.Compiler.specialize_method(method, Tuple{typeof(Base.Broadcast.combine_eltypes),typeof(s1),Tuple{Vector{Vector{Float64}}}}, Core.svec())
MethodInstance for Base.Broadcast.combine_eltypes(::typeof(s1), ::Tuple{Vector{Vector{Float64}}})

julia> mi.cache # no cache yet
ERROR: UndefRefError: access to undefined reference
Stacktrace:
 [1] getproperty(x::Core.MethodInstance, f::Symbol)
   @ Base ./Base.jl:37
 [2] top-level scope
   @ REPL[16]:1

julia> testf(x); # actual execution, create internal caches with precise type information

julia> s1(x) = x ./ sum(x); # invalidate the poorly inferred cache

julia> Base.return_types(testf, (typeof(x),)) # woa!
1-element Vector{Any}:
 Vector{Vector{Float64}} (alias for Array{Array{Float64, 1}, 1})

julia> mi.cache # now we have a cache
Core.CodeInstance(MethodInstance for Base.Broadcast.combine_eltypes(::typeof(s1), ::Tuple{Vector{Vector{Float64}}}), Core.CodeInstance(MethodInstance for Base.Broadcast.combine_eltypes(::typeof(s1), ::Tuple{Vector{Vector{Float64}}}), #undef, 0x0000000000007f6d, 0x0000000000007f6e, Type{Vector{Float64}}, Vector{Float64}, nothing, 0x000000aa, 0x000000aa, nothing, false, false, Ptr{Nothing} @0x000000010165db10, Ptr{Nothing} @0x0000000000000000, 0x00), 0x0000000000007f6f, 0xffffffffffffffff, Type{Vector{Float64}}, Vector{Float64}, nothing, 0x000000aa, 0x000000aa, nothing, false, false, Ptr{Nothing} @0x000000010165db10, Ptr{Nothing} @0x0000000000000000, 0x00)

julia> mi.cache.rettype
Type{Vector{Float64}}
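For reference, the combine_eltypes call above reaches Core.Compiler.return_type through Base.promote_op (at least on the Julia versions discussed here), so the same approximation can also be probed directly using the definitions from the reproducer:

# Probing the same approximation directly (s1 as defined above):
Base.promote_op(s1, Vector{Float64})                    # what broadcasting asks inference for
Core.Compiler.return_type(s1, Tuple{Vector{Float64}})   # the underlying query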

@aviatesk changed the title from "sporadic type instability with mapreduce / norm" to "problem with the unreliable approximation of Core.Compiler.return_type" Apr 13, 2022
@gabrevaya

Hi! I found something strange (at least to me) and hope it helps towards finding the cause of the issue. I apologize if this turns out to be something trivial and not useful (this is the first time I have dug into the inference system).

I get the following results for the reproducer only when using Julia 1.8.0-beta3 and loading DifferentialEquations.jl (which by chance I had loaded when I first ran this reproducer). If I don't load DifferentialEquations.jl, or if I use Julia 1.6.6 or 1.7.2, I get the same results @aviatesk showed.

julia> using DifferentialEquations

julia> x = [ones(3) for _ in 1:2];

julia> s1(x) = x ./ sum(x);

julia> testf(x) = s1.(x);

julia> Base.return_types(testf, (typeof(x),)) # in this case the inference seems to work
1-element Vector{Any}:
 Vector{Vector{Float64}} (alias for Array{Array{Float64, 1}, 1})

julia> method = only(methods(Base.Broadcast.combine_eltypes))
combine_eltypes(f, args::Tuple) in Base.Broadcast at broadcast.jl:717

julia> mi = Core.Compiler.specialize_method(method, Tuple{typeof(Base.Broadcast.combine_eltypes),typeof(s1),Tuple{Vector{Vector{Float64}}}}, Core.svec())
MethodInstance for Base.Broadcast.combine_eltypes(::typeof(s1), ::Tuple{Vector{Vector{Float64}}})

julia> mi.cache
Core.CodeInstance(MethodInstance for Base.Broadcast.combine_eltypes(::typeof(s1), ::Tuple{Vector{Vector{Float64}}}), #undef, 0x0000000000007f0e, 0xffffffffffffffff, Type{Vector{Float64}}, Vector{Float64}, nothing, 0x00000180, 0x00000180, nothing, false, false, Ptr{Nothing} @0x000000011027f900, Ptr{Nothing} @0x0000000000000000, 0x00)

julia> mi.cache.rettype
Type{Vector{Float64}}

julia> testf(x); # actual execution, create internal caches with precise type information

julia> s1(x) = x ./ sum(x); # invalidate the previous inferred cache

julia> Base.return_types(testf, (typeof(x),))
1-element Vector{Any}:
 Vector{Vector{Float64}} (alias for Array{Array{Float64, 1}, 1})

julia> mi.cache # notice that this cache is different from the previous one
Core.CodeInstance(MethodInstance for Base.Broadcast.combine_eltypes(::typeof(s1), ::Tuple{Vector{Vector{Float64}}}), Core.CodeInstance(MethodInstance for Base.Broadcast.combine_eltypes(::typeof(s1), ::Tuple{Vector{Vector{Float64}}}), #undef, 0x0000000000007f0e, 0x0000000000007f0f, Type{Vector{Float64}}, Vector{Float64}, nothing, 0x00000180, 0x00000180, nothing, false, false, Ptr{Nothing} @0x000000011027f900, Ptr{Nothing} @0x0000000000000000, 0x00), 0x0000000000007f10, 0xffffffffffffffff, Type{Vector{Float64}}, Vector{Float64}, nothing, 0x00000180, 0x00000180, nothing, false, false, Ptr{Nothing} @0x000000011027f900, Ptr{Nothing} @0x0000000000000000, 0x00)

julia> mi.cache.rettype
Type{Vector{Float64}}

julia> VERSION
v"1.8.0-beta3"

(test_return_type) pkg> st
Status `~/test_return_type/Project.toml`
  [0c46a032] DifferentialEquations v7.1.0

@gabrevaya

> Well, if we're "lucky" it might work, but the underlying issue is not fixed 😄 (#35800 (comment))

Hi! I also tried most of the examples from this issue on Julia 1.8.0-rc1 and they seem to work well. I don't know if that is still just luck or if the bug has already been fixed. If not, what are the prospects for a fix? This bug might be preventing me from going ahead with some research (without having to go back to Python, which I'm trying to resist as hard as I can 😄).
