
MRG: add resampling and smoothing functions #255

Merged (8 commits) on Feb 25, 2016
Conversation

matthew-brett (Member)

This pull request adds some simple processing functions, available when scipy is installed. The functions are:

  • resample_from_to - resample first image into space of second
  • resample_to_output - resample image into output (world) space
  • smooth_image - smooth image over voxel axes

I also added utility functions for converting between FWHM and sigma.
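For reference, the FWHM/sigma relationship for a Gaussian is fixed: FWHM = sigma * sqrt(8 ln 2), about 2.3548 * sigma. A minimal sketch of such a conversion pair (the function names here are illustrative; the PR's actual names may differ):

```python
import math

# For a Gaussian, FWHM = sigma * sqrt(8 * ln 2) ~= 2.3548 * sigma.
# Hypothetical helper names; the PR's actual function names may differ.
FWHM_PER_SIGMA = math.sqrt(8 * math.log(2))

def sigma2fwhm(sigma):
    """Convert Gaussian standard deviation(s) to FWHM."""
    return sigma * FWHM_PER_SIGMA

def fwhm2sigma(fwhm):
    """Convert FWHM value(s) to Gaussian standard deviation."""
    return fwhm / FWHM_PER_SIGMA

print(sigma2fwhm(1.0))  # ~2.3548
```

The two functions are exact inverses, so round-tripping a value through both returns the original.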

@matthew-brett (Member, Author)

I would very much like a review here - anyone?

Commit: Smoothing and resampling routines need scipy

* resampling
* converting sd to and from FWHM
Contributor (review comment):

Should any docs be updated to mention this? Not sure what nibabel doc building / standards are like.

Member Author (reply):

I would like to leave this out of the docs for now, to avoid people hitting the problem with unspecified axes.

Contributor (reply):

You mean leave the resampling functions out entirely, until support is more complete? Fine by me.

@larsoner (Contributor)

Other than my comments and the fact that this needs a rebase LGTM. You might want to have someone else take a look, though, since I'm not yet too familiar with these operations.

@matthew-brett (Member, Author)

My remaining problem with this PR is that it assumes that the fourth dimension of the image is special, or rather, that the first three dimensions are spatial. This isn't true for MINC, for example, and I haven't decided yet whether this should be a requirement for nibabel images.

  • For: matches the nifti model, simplifies API for anyone used to making that assumption
  • Against: users of MINC will have their images rearranged in ways they might not expect.

I have thought about adding an axis_types field to the image class, where nifti-like images would always return `('space', 'space', 'space', 'time')` for 4D images. In that case this PR would need to use that information to know which axes to treat as spatial.
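A sketch of what such an axis_types attribute might look like. This is entirely hypothetical API (the class, property, and helper below do not exist in nibabel); it only illustrates how resampling code could pick out the spatial axes instead of assuming they come first:

```python
# Hypothetical sketch of an ``axis_types`` attribute for nifti-like images.
# Nothing here is real nibabel API; it only shows how a resampler could ask
# the image which axes are spatial rather than assuming axes 0-2.

class FakeNiftiLikeImage:
    def __init__(self, shape):
        self.shape = shape

    @property
    def axis_types(self):
        # Nifti-like rule: first three axes are spatial, a fourth is time.
        types = ['space'] * min(len(self.shape), 3)
        if len(self.shape) > 3:
            types.append('time')
        return tuple(types)

def spatial_axes(img):
    """Indices of the axes a resampler should treat as spatial."""
    return tuple(i for i, t in enumerate(img.axis_types) if t == 'space')

img = FakeNiftiLikeImage((33, 41, 25, 100))
print(img.axis_types)     # ('space', 'space', 'space', 'time')
print(spatial_axes(img))  # (0, 1, 2)
```

A MINC-like image could return the same labels in a different order, and the resampler would follow the labels rather than the axis positions.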

@larsoner (Contributor)

That seems reasonable. Might be even more useful to have something like ('R', 'A', 'S', 'time') or so, so that the orientation can be inferred as well.

@larsoner (Contributor)

@matthew-brett do you think you'll have time to take these up again soon? It doesn't have to be now if there are other priorities, but my memory traces from the visualization PR / branch get increasingly lost in my head as time passes :)

@matthew-brett (Member, Author)

Thanks for the reminder. Still stalled for lack of courage for thinking about the axis_types API.

I guess it is time for a BIAP: https://github.com/nipy/nibabel/wiki

@bcipolli (Contributor)

I believe this was covered here: https://github.com/nipy/nibabel/wiki/BIAP6

I think the general solution is best: https://github.com/nipy/nibabel/wiki/BIAP6#general-solution-associating-axes-and-labels . There are a number of other image formats people are interested in integrating, and this solution seems to give the most flexibility for dealing with the implementation-specific details of new formats.

@nibotmi (Contributor) commented Feb 7, 2016

☔ The latest upstream changes (presumably #402) made this pull request unmergeable. Please resolve the merge conflicts.

Commits:

* Change append_diag to use AffineError
* Move the parameters to a function, to allow using the same parameters for testing the resample_to_output function (coming soon to a repo near you)
* Add: resampling images to images / mapped voxels; resampling images to output space; smoothing over voxel axes; FWHM / sigma conversion
* Fix docstring giving wrong explanation for ND interpolation
@matthew-brett (Member, Author)

This PR is old now, and I think it's time to merge and see what folks make of it.

Any other comments here before I go ahead?

The axis stuff should get done before a release, surely, but let's get this in for now.

@larsoner (Contributor)

Other than the idea of comparing to the output of another program for validation LGTM. That could be a separate PR, too, if necessary.

@matthew-brett (Member, Author)

Eric - yes - sorry - that is a good idea. For convenience, I'm planning to use the anatomical and functional images from here: https://github.com/nipy/nipy/tree/master/nipy/testing - and reslice them with an arbitrary transformation using SPM. Sound reasonable?

@larsoner (Contributor) commented Feb 16, 2016 via email

@matthew-brett (Member, Author)

Hmm - this is proving unpleasant. If you're interested, you can have a look at the failing test on this branch here : https://github.com/matthew-brett/nibabel/tree/tmp-processing .

There are some voxels that are way off on the boundary of the image. I have a vague memory that Omar Ocuegueda hit this from the scipy resample routines. I'm afraid this may need some serious thought.

@matthew-brett (Member, Author)
The discussion I was thinking of is:

dipy/dipy#489

scipy/scipy#4075

@matthew-brett (Member, Author)

We're getting almost exactly the same as nilearn, not surprisingly, as they are using scipy.ndimage.affine_transform as well.

@matthew-brett (Member, Author)

It is a problem of interpolating outside the image boundary, but in fact I think this isn't a serious problem for us (though it makes it harder to test properly).

Here's the script that reproduces the problem: https://gist.github.com/77a3dd12998755ad3259

The points giving the big differences have the following coordinates in the resampling-from image:

```python
In [3]: np.set_printoptions(suppress=True, precision=4)

In [4]: bad_points_transformed
Out[4]:
array([[ -0.0318,  37.3074,  10.817 ],
       [ 27.2729,  40.047 ,   5.2542]])

In [5]: from_img.shape
Out[5]: (33, 41, 25)

In [6]: spnd.map_coordinates(from_img.dataobj, bad_points_transformed.T)
Out[6]: array([ 0.,  0.])
```

So both points are sampling just outside the volume (< 0 on the first axis, > 40 on the second), and the sampling is therefore returning 0.

Clamping these values to 0 and 40 brings the points back to the boundary and gives real image values:

```python
In [7]: less_bad_points = bad_points_transformed.copy()

In [8]: less_bad_points[0, 0] = 0

In [9]: less_bad_points[1, 1] = 40

In [10]: less_bad_points
Out[10]:
array([[  0.    ,  37.3074,  10.817 ],
       [ 27.2729,  40.    ,   5.2542]])

In [11]: spnd.map_coordinates(from_img.dataobj, less_bad_points.T)
Out[11]: array([ 8704.9648,  8384.2805])
```

Even a tiny step back outside the volume flips the resampled value to 0:

```python
In [12]: less_bad_points[0, 0] = -0.00001

In [13]: spnd.map_coordinates(from_img.dataobj, less_bad_points.T)
Out[13]: array([    0.    ,  8384.2805])
```
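This edge behaviour is easy to reproduce on a toy 1D array. With the default mode='constant', scipy's map_coordinates fills with cval (default 0) as soon as a coordinate falls outside the input, with no blending at the edge; mode='nearest' instead clamps to the boundary, which is the "move to the edge" fix used above:

```python
import numpy as np
from scipy.ndimage import map_coordinates

# Toy 1D "image" with non-zero values everywhere.
data = np.array([10., 20., 30., 40.])

# One coordinate exactly on the boundary, one just outside it.
coords = np.array([[0.0, -0.001]])

fell_off = map_coordinates(data, coords, order=1)  # default mode='constant', cval=0
clamped = map_coordinates(data, coords, order=1, mode='nearest')

print(fell_off)  # the outside point falls back to cval=0
print(clamped)   # clamping to the edge recovers the boundary value
```

So the sharp flip from ~8700 to 0 over a 1e-5 change in coordinate is the documented 'constant' behaviour, not a bug.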

Now going over to SPM - we load the points (corrected for 1-based indexing):

```matlab
>> load('bad_transformed.mat')
>> bad_transformed

bad_transformed =

    0.9682   38.3074   11.8170
   28.2729   41.0470    6.2542
```

We get the SPM equivalent of an image object, and sample at these points:

```matlab
>> V = spm_vol('anat_moved.nii');
>> spm_sample_vol(V, bad_transformed(:, 1), bad_transformed(:, 2), bad_transformed(:, 3), 3)

ans =

   1.0e+03 *

    8.5141
    8.1095
```

This is the difference between the nibabel and SPM resampling.

In fact, what is happening is that the resampling goes to zero when any of the coordinates are > 0.05 voxel units outside the image border:

```matlab
>> worse_transformed = bad_transformed;
>> worse_transformed(1, 1) = 0.95;
>> worse_transformed

worse_transformed =

    0.9500   38.3074   11.8170
   28.2729   41.0470    6.2542

>> spm_sample_vol(V, worse_transformed(:, 1), worse_transformed(:, 2), worse_transformed(:, 3), 3)

ans =

   1.0e+03 *

    8.3741
    8.1095

>> worse_transformed(1, 1) = 0.94;
>> spm_sample_vol(V, worse_transformed(:, 1), worse_transformed(:, 2), worse_transformed(:, 3), 3)

ans =

   1.0e+03 *

         0
    8.1095

>> worse_transformed(2, 2) = 41.05;
>> spm_sample_vol(V, worse_transformed(:, 1), worse_transformed(:, 2), worse_transformed(:, 3), 3)

ans =

   1.0e+03 *

         0
    8.0864

>> worse_transformed(2, 2) = 41.06;
>> spm_sample_vol(V, worse_transformed(:, 1), worse_transformed(:, 2), worse_transformed(:, 3), 3)

ans =

     0
     0
```

I think this comes from these lines of the SPM code: https://bitbucket.org/matthewbrett/spm-versions/src/ac51a22a558771288241e54b48d3e8420d2585ca/spm12/src/spm_vol_utils.c?at=master&fileviewer=file-view-default#spm_vol_utils.c-403

The code checks whether the voxel is more than TINY units outside the boundary, where TINY turns out to be 0.05 (https://bitbucket.org/matthewbrett/spm-versions/src/ac51a22a558771288241e54b48d3e8420d2585ca/spm12/src/spm_vol_utils.c?at=master&fileviewer=file-view-default#spm_vol_utils.c-6)
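In other words, SPM's bounds check clamps coordinates that are within TINY = 0.05 voxels of the edge and treats anything further out as zero. A paraphrase in Python (names and structure are mine, not SPM's; this is just the rule the C code appears to implement, stated over 0-based coordinates):

```python
# Paraphrase of the bounds check in SPM's spm_vol_utils.c, as described above.
# Coordinates up to TINY = 0.05 voxels outside the volume are clamped to the
# edge; anything further out samples as 0. Names are mine, not SPM's.
TINY = 0.05

def spm_style_clamp(coord, size):
    """Return the clamped 0-based coordinate, or None if out of bounds."""
    if -TINY <= coord < 0:
        return 0.0                  # just below the low edge: clamp to 0
    if size - 1 <= coord <= size - 1 + TINY:
        return float(size - 1)      # just above the high edge: clamp to it
    if 0 <= coord <= size - 1:
        return coord                # inside the volume: unchanged
    return None                     # too far out: sampled value would be 0

# The thread's examples, converted back to 0-based indexing (size 33 and 41):
print(spm_style_clamp(-0.03, 33))  # clamped to the low edge
print(spm_style_clamp(-0.06, 33))  # None: beyond the tolerance
print(spm_style_clamp(40.03, 41))  # clamped to the high edge
print(spm_style_clamp(40.06, 41))  # None
```

This matches the transcripts above: 1-based 0.95 (0-based -0.05) still samples, 0.94 does not, and likewise 41.05 versus 41.06 on the second axis.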

So, I think these differences are an artefact of how SPM does its interpolation, and we're probably OK.

@larsoner (Contributor)

Looks like you have a good handle on the differences in implementation. Maybe it's good enough to note in the docs that there can be serious differences in the results at the boundaries?

@matthew-brett (Member, Author)

Thinking about it with the benefit of a bit more sleep, I think I will use the calculations above to allow for differences at the points that go very slightly off the edge. PR soon; I'll probably finish this off on Wednesday.

Commits:

* Allow the user to specify a scalar, meaning that voxel sizes for each dimension of the image should be the same
* Allow rtol, atol for wrapper around ``allclose``
@matthew-brett (Member, Author)

OK - I added the tests against SPM resampling, ready for review.

Commit: Build some test images by resampling with SPM, and test our resampling against SPM's resampling
@larsoner (Contributor)

LGTM

@matthew-brett (Member, Author)

Thanks for taking a look. There's an unsolved problem with resampling of images with different dimensions, noted in #411, but let's merge this one now. On to #404 ...

matthew-brett added a commit that referenced this pull request Feb 25, 2016: MRG: add resampling and smoothing functions
@matthew-brett matthew-brett merged commit 798d4a5 into master Feb 25, 2016
grlee77 pushed a commit to grlee77/nibabel that referenced this pull request Mar 15, 2016: MRG: add resampling and smoothing functions
@effigies effigies deleted the add-processing branch April 25, 2019 17:33