Make promote_op rely on Core.Inference.return_type #17389
Conversation
@martinholters After this, you should be able to use
```diff
     for (iF, iB) in zip(eachindex(F), eachindex(B))
         @inbounds F[iF] = ($f)(A, B[iB])
     end
     return F
 end
 function ($f){T}(A::AbstractArray{T}, B::Number)
-    F = similar(A, promote_array_type($f,typeof(A),typeof(B)))
+    F = similar(A, promote_op($f, T, typeof(B)))
```
`promote_array_type` is responsible for e.g. `typeof(Float32[1,2,3]*1.0) == Vector{Float32}`, even though `Base.promote_op(*, Float32, typeof(1.0)) == Float64` and `typeof(Float32(1)*1.0) == Float64`. If you look at the above machinery more closely, you'll note that the result of `promote_op` is only used in a few cases, and that's for a reason.
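For concreteness, a minimal sketch of the scalar-promotion side of this (runs on current Julia; `promote_array_type` itself was 0.5-era internal API and is not shown):

```julia
# Scalar arithmetic promotes Float32 with Float64 to Float64.
x = Float32(1) * 1.0
println(typeof(x))                              # Float64

# Base.promote_op predicts that scalar result type from the argument
# types alone, without evaluating the operation.
println(Base.promote_op(*, Float32, Float64))   # Float64
```

The tension described above is that the array method deliberately did *not* follow this scalar promotion for `Array`-with-`Number` operations.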
How is that? The reason for not using
Sorry, I overlooked that in #17313.
Well, this should make
There's been resistance in the past to making program behavior (as opposed to just performance) depend on the results of inference.
I understand that sentiment, but something needs to be done about
The ideal situation is to use the inference result when it is concrete, and otherwise use the types of the actual values. Then there is only a problem in the empty case. Using the inferred type there is deemed acceptable for now.
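The strategy described here — trust inference only when it yields a concrete type, otherwise decide from the values actually produced — could be sketched roughly as follows (hypothetical helper, not the PR's actual code; uses the modern name `isconcretetype`):

```julia
# Hypothetical sketch of the strategy: prefer the inferred element type
# when it is concrete; otherwise let the runtime values decide.
function collect_by_strategy(f, xs)
    T = Base.promote_op(f, eltype(xs))   # what inference predicts
    if isconcretetype(T)
        return T[f(x) for x in xs]       # concrete: safe to rely on inference
    else
        return [f(x) for x in xs]        # fall back to the actual values
    end
end
```

For an empty input the inferred type is the only information available, which is the compromise accepted above.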
@JeffBezanson Does the current proposal address your concerns (at least provisionally, until we have a better mechanism)? Tests are now passing on AV (Travis currently has a very long queue), and I don't think this should affect performance in general. Of course, that should be checked.
```diff
@@ -164,8 +164,6 @@ m = [1:2;]'
 @test @inferred([0,1.2].+reshape([0,-2],1,1,2)) == reshape([0 -2; 1.2 -0.8],2,1,2)
 rt = Base.return_types(.+, Tuple{Array{Float64, 3}, Array{Int, 1}})
 @test length(rt) == 1 && rt[1] == Array{Float64, 3}
-rt = Base.return_types(broadcast, Tuple{typeof(.+), Array{Float64, 3}, Array{Int, 3}})
-@test length(rt) == 1 && rt[1] == Array{Float64, 3}
```
why is this deleted?
With these changes `broadcast` wouldn't be inferable (like `map`), but it will return the correct type nevertheless. I guess I should change it to at least verify that. <- Done
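What the reinstated check verifies, roughly (shown here with modern dot-broadcast syntax; the 0.5-era test used `.+` as a first-class function):

```julia
# broadcast's return *value* has the expected array type even when
# inference alone could not be relied upon to prove it in advance.
A = zeros(Float64, 2, 1, 2)
B = fill(1, 2, 1, 2)        # Array{Int,3}
C = A .+ B
println(typeof(C))          # Array{Float64,3}
```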
This isn't doing the latter, it's returning
@tkelman You're right, but I believe the cases where the type returned by
Actually, this is more in analogy with the
If the first statement above is not accurate, I can change the PR to accommodate Jeff's initial comment.
It'll be funny if we introduce another type-inference-dependent behavior for non-empty cases right after we worked really hard to fix one....
I don't think so. What was the comment about?
I could have sworn that a few days ago I said I didn't think we should merge this until after we branch, though that was before seeing the issues raised in the last few days, and I need to look at the pkgeval results.
This #16986 (comment)? I guess you misplaced it.
I was thinking of #16986, never mind, sorry.
Sorry for the delay, it took me a while to get in front of a proper keyboard. This turns out to be hugely breaking in packages: https://gist.github.com/8f19fefaf5021dbc076cc47b3ff8abd1 Could use a hand digging through those JSON logs (jq and/or JSON.jl are useful for this) to see if they're all the same failure, though; maybe a re-run now that DataArrays has been tagged might fix them all?
With DataArrays tagged it's a much smaller number of failures now: https://gist.github.com/5049d028f493ef2a9163a269f9a634fc (Reactive has flaky tests, and apparently this PR actually fixes SampledSignals: http://pkg.julialang.org/detail/SampledSignals.html)
Deprecation warnings? I've fixed the deprecations now, so maybe try again (with the newest commit).
PkgEval runs on tagged package versions.
OK, I'll tag then.
The NearestNeighbors one is fixed (locally) for me by fixing the deprecations, as @KristofferC pointed out. On the other hand, I'm not sure how this is affecting GaussianMixtures: it passed the first time, but errored the second (on a function that has nothing to do with DataArrays). Also, it passes the tests on my machine. I suspect the memory used fluctuates around the limits of the testing machine's memory, but I'm not sure.
Ah right, it's an
I don't. But you already merged it, so I guess there is no problem.
There might be a problem, but I'll take one fixed package (plus 2 about-to-be-fixed packages) in exchange for one broken one. cc @davidavdav
That might have been fixed by JuliaGizmos/Reactive.jl@d7a2bd3, which was tagged just a couple of days ago.
```julia
if isdefined(Core, :Inference)
    function _promote_op(op, T::Type)
        G = Tuple{Generator{Tuple{T},typeof(op)}}
        R = Core.Inference.return_type(first, G)
```
Does `return_type(op, Tuple{T})` not work? This seems roundabout.
That was the first thing I tried; the problem is that `Test.@inferred` does not work over `_promote_op` that way, but it works with this.
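For reference, a minimal reproduction of the `Generator` trick from the snippet above. On current Julia the inference module is `Core.Compiler` rather than `Core.Inference`, so this is an approximation of the 0.5-era code, not the code itself:

```julia
# Ask inference for the return type of `first` applied to a one-element
# Generator, rather than of `op` directly; the extra indirection is what
# keeps Test.@inferred-style checks working, per the comment above.
op = sin
T  = Float64
G  = Tuple{Base.Generator{Tuple{T}, typeof(op)}}
R  = Core.Compiler.return_type(first, G)
println(R)
```

Since `first` of the generator just applies `op` to the tuple's element, inference should recover the same result type as asking about `op` directly.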
[release-0.5] revert #17389 for 0.5.0-rc1
This should fix #17314 and the problem found in #17394
EDIT: This also removes all specialized `promote_op` methods, since the new ones should handle all previous cases. CC @JeffBezanson, @nalimilan, @tkelman