
update for interface change in CoordinateTransformations #88

Closed
Cody-G wants to merge 2 commits

Conversation

Cody-G commented Aug 21, 2018

CoordinateTransformations.jl recently went through some name changes that broke our code. The fix was easy, and I was hoping to get it merged before the 0.7 upgrade (#87). However, an affine transform optimization test is now failing for me locally. I don't see how that could be related to this PR, but it's probably worth figuring out what changed before merging. Maybe a change in a dependency is to blame? The only other dependencies I can see are QuadDIRECT, StaticArrays, and Images. In any case, the Travis failure from this PR may be a useful reference.

Cody-G commented Aug 21, 2018

...and I just noticed that the test in question fails on the server but passes on my laptop, so I should be able to figure out which dependency is responsible. I'll do that ASAP.

Cody-G commented Aug 21, 2018

Ah, Travis didn't get far enough to trigger the error I mentioned. ImageAxes was recently updated, and it seems there's no tag for Julia 0.6. We can either ask them to make a tag or pin ImageAxes at this commit: JuliaImages/ImageAxes.jl@d30a726

Cody-G commented Aug 21, 2018

I did find a set of package versions that passes tests, but things are so chaotic in the package ecosystem right now due to the 0.7 upgrade that I'll hold off on further changes for now. It would be nice if we could get 0.6 tests passing again before pursuing #87, but that may not be practical. We may instead just want to work on the 0.6 and 0.7 versions of BlockRegistration separately.

timholy commented Aug 21, 2018

Yes, I think the best bet is to stick with an older version that works. Should we just put an upper bound on the version of ImageAxes and/or CoordinateTransformations in BlockRegistration? That way anyone who has BlockRegistration installed will effectively be pinned at the working version of those packages.

Cody-G commented Aug 21, 2018

The CoordinateTransformations change is straightforward and should apply to 0.6 or to 0.7, so I don't think we need to pin it (as of now, but that may change). But ImageAxes doesn't currently have a version that works for us. If we go back to the last tagged 0.6 version we'll still have the axes naming conflict. They merged my fix for that but then also merged the 0.7 stuff without tagging. Or maybe you mean we should just pin to the specific commit? I'm not sure how to specify that in REQUIRE.
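For reference, bounds in a Julia 0.6-era REQUIRE file are written as a package name followed by a lower bound and an optional exclusive upper bound; a sketch of how a pin might look (the version numbers below are illustrative, not the actual bounds we'd want):

```
# REQUIRE (Julia 0.6 package format)
julia 0.6
# "name lower upper": lower bound is inclusive, upper bound is exclusive,
# so this keeps ImageAxes below the hypothetical breaking 0.6.0 release.
ImageAxes 0.5.0 0.6.0
```

Note that REQUIRE can only constrain to tagged versions, not to an arbitrary commit; pinning at a specific commit would have to happen in each user's local package directory rather than in REQUIRE.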

timholy commented Aug 21, 2018

I'll tag that 0.6 version of ImageAxes pronto.

timholy commented Aug 21, 2018

I tagged the commit that merged your PR. See JuliaLang/METADATA.jl#17177.

Does Compat really export axes? I thought that got dropped.

Cody-G commented Aug 21, 2018

Great, thanks! Compat was exporting axes when I first opened that PR; it stayed open for many months. I'll update REQUIRE and we'll see how Travis does.

Cody-G force-pushed the cjg/coord_tfms branch 3 times, most recently from 4398c89 to b8c8a80 on August 22, 2018

Cody-G commented Aug 22, 2018

This has taken a strange turn. I think I've pinned the packages correctly for 0.6 and you can see in the latest Travis run that we're getting to the failed test:

Running coarse step
Initial minimum (9 evaluations): Box0.0068573233718794225@[1.0, NaN, NaN, NaN]
minimum (123 evaluations): Box0.001237559632100242@[0.995733, -0.00289185, 0.00907339, 1.0083]
minimum (332 evaluations): Box0.0006967098199769907@[0.994128, -0.00878037, 0.00904372, 1.00651]
minimum (486 evaluations): Box0.0006938394122306331@[0.995316, -0.00844993, 0.00904372, 1.00677]
minimum (792 evaluations): Box0.0006370141609837249@[0.993946, -0.00896139, 0.00983786, 1.00617]
minimum (1008 evaluations): Box0.0005991275049755131@[0.994641, -0.00823588, 0.00994661, 1.00518]
Final minimum (1008 evaluations): Box0.0005991275049755131@[0.994641, -0.00823588, 0.00994661, 1.00518]
257 evaluations for initialization+sweeps, 751 for Quasi-Newton
Running fine step
Initial minimum (13 evaluations): Box0.0005962022301096294@[0.0, NaN, NaN, NaN, NaN, NaN]
minimum (259 evaluations): Box0.00020850778229556535@[-0.380967, 0.168179, 0.999481, -0.000751459, 0.0, 0.999737]
minimum (470 evaluations): Box0.00020299070820096983@[-0.348485, 0.178377, 0.999688, -0.000737133, 1.05085e-5, 0.999868]
Final minimum (526 evaluations): Box0.00017704258682729818@[-0.407661, 0.206002, 1.00009, -0.00138503, 9.28211e-6, 0.999961]
107 evaluations for initialization+sweeps, 419 for Quasi-Newton
QuadDIRECT tests with standard images: Test Failed
  Expression: all(abs.(comp.translation) .< vtol)
Stacktrace:
 [1] tfmtest(::CoordinateTransformations.AffineMap{Array{Float64,2},StaticArrays.SArray{Tuple{2},Float64,1,2}}, ::CoordinateTransformations.AffineMap{StaticArrays.SArray{Tuple{2,2},Float64,2,4},StaticArrays.SArray{Tuple{2},Float64,1,2}}) at /home/travis/.julia/v0.6/BlockRegistration/test/register_affine.jl:200
 [2] macro expansion at /home/travis/.julia/v0.6/BlockRegistration/test/register_affine.jl:251 [inlined]
 [3] macro expansion at ./test.jl:860 [inlined]
 [4] anonymous at ./<missing>:?
Test Summary:                         | Pass  Fail  Total
QuadDIRECT tests with standard images |   14     1     15
ERROR: LoadError: LoadError: Some tests did not pass: 14 passed, 1 failed, 0 errored, 0 broken.
while loading /home/travis/.julia/v0.6/BlockRegistration/test/register_affine.jl, in expression starting on line 207
while loading /home/travis/.julia/v0.6/BlockRegistration/test/runtests.jl, in expression starting on line 21

But the weird thing is that when I clone this branch into a fresh package directory on my laptop, build everything, and run the tests, all of the tests pass. I tried the same thing on our server, with a new package directory and the same Julia version, and it fails just like Travis. So the only thing that differs between my laptop and our server / the Travis machine should be the hardware (EDIT: or the BLAS build?). Any idea what could be causing that?

Here's my laptop output for that final test, which you can compare to the Travis output above. Notice that the optimization results are identical after 9 evaluations but begin to diverge by 123 evaluations.

Running coarse step
Initial minimum (9 evaluations): Box0.0068573233718794225@[1.0, NaN, NaN, NaN]
minimum (123 evaluations): Box0.0012375596321004274@[0.995733, -0.00289185, 0.00907339, 1.0083]
minimum (322 evaluations): Box0.0008278534345169503@[0.995733, -0.0055504, 0.0090646, 1.00677]
minimum (507 evaluations): Box0.000633038972588164@[0.99552, -0.0091432, 0.0100532, 1.00532]
minimum (691 evaluations): Box0.0006001053804005178@[0.994574, -0.00883421, 0.0100532, 1.00524]
minimum (845 evaluations): Box0.0006001053804005178@[0.994574, -0.00883421, 0.0100532, 1.00524]
minimum (994 evaluations): Box0.0005976585560834609@[0.994728, -0.00854357, 0.00980402, 1.00535]
Final minimum (1044 evaluations): Box0.0005976585560834609@[0.994728, -0.00854357, 0.00980402, 1.00535]
349 evaluations for initialization+sweeps, 695 for Quasi-Newton
Running fine step
Initial minimum (13 evaluations): Box0.0005947779840206616@[0.0, NaN, NaN, NaN, NaN, NaN]
minimum (149 evaluations): Box0.00019508910815755403@[-0.378596, 0.172151, 0.999553, -0.00128453, 1.03279e-5, 0.999714]
Final minimum (149 evaluations): Box0.00019508910815755403@[-0.378596, 0.172151, 0.999553, -0.00128453, 1.03279e-5, 0.999714]
55 evaluations for initialization+sweeps, 94 for Quasi-Newton
Test Summary:                         | Pass  Total
QuadDIRECT tests with standard images |   15     15
Base.Test.DefaultTestSet("QuadDIRECT tests with standard images", Any[], 15, false)

And here's versioninfo:

Server (fails):

julia> versioninfo()
Julia Version 0.6.4
Commit 9d11f62bcb (2018-07-09 19:09 UTC)
Platform Info:
  OS: Linux (x86_64-pc-linux-gnu)
  CPU: Intel(R) Xeon(R) CPU E5-2680 0 @ 2.70GHz
  WORD_SIZE: 64
  BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Sandybridge MAX_THREADS=16)
  LAPACK: libopenblas64_
  LIBM: libopenlibm
  LLVM: libLLVM-3.9.1 (ORCJIT, sandybridge)

Laptop (passes):

julia> versioninfo()
Julia Version 0.6.4
Commit 9d11f62bcb (2018-07-09 19:09 UTC)
Platform Info:
  OS: Linux (x86_64-linux-gnu)
  CPU: Intel(R) Core(TM) i7-7820HQ CPU @ 2.90GHz
  WORD_SIZE: 64
  BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Haswell MAX_THREADS=16)
  LAPACK: libopenblas64_
  LIBM: libopenlibm
  LLVM: libLLVM-3.9.1 (ORCJIT, broadwell)
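One plausible explanation, consistent with the versioninfo above: OpenBLAS is dispatching different kernels on the two machines (Sandybridge on the server, Haswell on the laptop), and kernels that accumulate sums in a different order round differently, because floating-point addition is not associative. A minimal sketch of that underlying effect (plain Python, nothing BlockRegistration- or OpenBLAS-specific):

```python
# IEEE-754 addition is not associative: regrouping the same three
# terms changes which intermediate result gets rounded, so two BLAS
# kernels summing in different orders can return slightly different
# numbers from identical inputs.
a, b, c = 0.1, 0.2, 0.3

left = (a + b) + c   # rounds 0.1 + 0.2 first
right = a + (b + c)  # rounds 0.2 + 0.3 first

print(left)          # 0.6000000000000001
print(right)         # 0.6
print(left == right) # False
```

A one-ulp difference like this early in an optimization run can steer a search algorithm down a different path, which would match the logs diverging only after the first ~100 evaluations.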

Cody-G commented Aug 23, 2018

Also I just noticed #89 while running tests locally

Cody-G commented Aug 23, 2018

Out of self-interest I've decided not to update to the new CoordinateTransformations for the julia0.6 branch. One reason is that it causes problems when loading older datasets saved with JLD.jl, because the reconstructed types don't match the new types. So I'll close this PR for now, but we may want to grab this commit for future 0.7 work. The weird machine-specific test failure still stands and seems unrelated to this PR.

timholy commented Aug 25, 2018

This is great debugging. My inclination is very much to "freeze" julia 0.6, so I think you're going in the right direction here. Is there stuff you need from me or are you figuring it all out?

Cody-G commented Aug 25, 2018

Sounds good. #90 (first described in this thread) still baffles me, but it's not holding me back so I'm not working on it right now.
