Properly handle coordinate systems on import #170
Conversation
By the way, I'll need to do some hackery to make sure this doesn't break serialisation on existing prefabs/scenes 😅
Hello @mlavik1 I have quickly looked at it and it is looking good. All my DICOM and NRRD datasets are now displayed correctly without mirroring. When you changed the overall parenting and scaling, you mentioned that it can now show "real" scales; does that mean this was not the case before? In the previous version I noticed in some script that pixel spacing (the 0028,0030 tag) was already taken into account, so I thought scales were already handled correctly. Was there some other correction added that I am missing? Also, when I changed the VolumeRenderedObject scale to 1 (to show the real scale), the dataset is absolutely huge now; by default it has a scale of something like 0.003 on each axis, which is why it becomes so huge. Is this the correct behaviour?
But the downscale only happens if the dataset is larger than the VolumeContainer, so that it fits into it, right? So if it fits into it by default, the scale should correspond to the real scanned values? Or am I missing something? Edit: I am probably mixing up downscaling and downsampling, sorry :D Also, I noticed that when you changed the planar cutout matrix, it works fine, but the same change must also be made for the box and sphere cutout options; they are broken now. Also, while we are at it, talking about the slice renderer: when you spawn a slice on any axis and then start rotating the slice, it is all shifted and not correct. In this branch of yours there is a temporary fix commit that fixes this issue, so the slice can be rotated around freely and shows accurate values. Is there something wrong with that commit that made you not want to merge it into the main branch, or did you simply forget about it?
Hi @SitronX ! And yes, you're right. The pixel spacing was already taken into account to make sure the relative scaling of each axis is correct (so the datasets get the correct shape), but then this line of code scales it down to make sure the whole dataset fits into a box with size (1,1,1). So if you load two datasets, their size relative to each other may be wrong, though the shape of each dataset will be correct. Maybe I should add a setting for disabling this normalised scale? And yes, by "downscaling" I don't mean "downsampling" 😁 It's just code for making the whole volume small enough to fit into a small box.
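The "normalised scale" described above (shrink the whole volume so it fits into a (1,1,1) box while preserving its shape) can be sketched as follows. This is a hypothetical illustration in Python, not code from this repository; the function name `normalised_scale` is made up:

```python
def normalised_scale(size):
    """Scale factors that shrink a volume so its largest axis becomes 1.0,
    preserving the aspect ratio (shape) of the dataset."""
    largest = max(size)
    return tuple(s / largest for s in size)

# A dataset with physical extent 358.4 x 358.4 x 500.0 (mm) keeps its shape,
# but its largest axis is normalised to exactly 1.0.
scale = normalised_scale((358.4, 358.4, 500.0))
print(scale)  # largest component is 1.0, the others keep the aspect ratio
```

This also explains the observation later in the thread that the largest axis of every rendered box ends up being exactly 1.0, so the relative sizes of two independently normalised datasets are lost.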
Oh, thanks for letting me know! I forgot about this :D I'll have a look at them then!
I honestly can't remember 😅
Hmm, regarding the slicing plane.. I think part of the problem with the implementation is that the same shader is used for rendering the plane in the scene/game view AND in the UI. I should probably either create two separate shaders, or maybe even better: render the slice to a texture, and then just use that texture in both places.
Update: Fixed the remaining issues. I'll test some more and will probably merge this in tomorrow if neither of us finds any issues :)
Hmm, I would expect the real scaling to come close to the classic 1 Unity unit = 1 meter in real life. But as I said, when I set it to 1, it is absolutely huge (and I don't think a human chest is typically 1 kilometer long :D). I don't think that is right. Shouldn't the real scaling reflect real life more closely? I bet we could use some DICOM tag to scale it properly so that 1 Unity unit = 1 meter in real life. The (1,1,1) local range would still be there, but the object would just be scaled properly. I am still surprised, though, because during my interaction with the library, everything scaled as I would expect (a head scan was obviously smaller than a whole chest, and the relative proportions were roughly where I would expect them). Am I misunderstanding something? :D
Hmm, yes, I see; the texture is probably better.
Haha, I actually noticed that issue a while ago and used that helpful branch as a simple fix back then; only now, when you mentioned the slice renderer, did it remind me :D
Hmm.. So the DICOM spacing is in mm. We now multiply the pixel spacing by the dimension to get the size (scale). So that would make a 1 m long dataset 1000 units long.
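The arithmetic above can be made concrete with a small sketch. This is illustrative only; the function names and the example spacing values are made up, not taken from the library:

```python
def physical_size_mm(dim_voxels, spacing_mm):
    """Physical extent of the volume along each axis, in millimetres:
    voxel count times voxel spacing (DICOM PixelSpacing is given in mm)."""
    return tuple(d * s for d, s in zip(dim_voxels, spacing_mm))

def physical_size_meters(dim_voxels, spacing_mm):
    """Same extent divided by 1000, i.e. 1 Unity unit = 1 m."""
    return tuple(mm / 1000.0 for mm in physical_size_mm(dim_voxels, spacing_mm))

# e.g. a 512x512x200 CT scan with 0.7 mm in-plane spacing and 2.5 mm slices:
print(physical_size_mm((512, 512, 200), (0.7, 0.7, 2.5)))   # (358.4, 358.4, 500.0)
print(physical_size_meters((512, 512, 200), (0.7, 0.7, 2.5)))  # ~0.36 x 0.36 x 0.5 m
```

Using the mm sizes directly as Unity units is what makes a chest scan appear hundreds of units long, hence the suggestion to divide by 1000.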
Ah yes, I just tested with the dataset you sent me. It looks like the dimensions of the datasets are all the same? I suppose they have masked out the different parts of the body from the main scan, rather than taking out the heart and scanning it individually 😅 However, if you try to load two datasets of different body parts where the whole body part covers the whole dataset, then the relative sizes may be wrong. You'll notice that the largest axis of each of the boxes used to render the datasets will be exactly the same (they will all be 1.0 because of this). Here's an example: I don't have any such datasets available now, but if you load a full body scan of a human and a scan of a small lobster taken with a much smaller CT scanner, then you'd probably notice the problem more.. Anyway, 1000 units for 1 m is a bit overkill 😅 So maybe I should change it to meters instead and divide by 1000. But if you don't change the scale of the outer objects, are the sizes of your datasets still the same as before this PR?
Oh right, maybe I should make sure it stays more similar to how it was before (with head pointing up). Thanks for the feedback!
…moved 90 degree rotation from outer object - do this in each importer instead.
So I've made the full-scale size smaller, and fixed the rotation so that it is the same as it used to be (at least on all datasets I've tested with): 0e5c3e8 |
Merged this, and will do the openDICOM fix next. Feel free to let me know if you have any other issues. I'm still not going to make a new release until I have a few more changes in and have done some more testing.
Hi @SitronX
Here is my first PR for fixing the coordinate systems. I still need to test more thoroughly, and maybe clean up a few things, but feel free to try it out on your datasets when you have time!
Changes:
Explanation of coordinate system transformation:
Converting from LPS to Unity's coordinate system can be done by the matrix

```
-1  0  0
 0  0 -1
 0  1  0
```
Which is basically the same as scaling by (-1, 1, 1) (mirroring along the x axis) and rotating by (90, 0, 0) degrees.
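As a sanity check of this decomposition, here is a small numpy sketch (illustrative only, not code from this project) confirming that the matrix above equals an x-axis mirror followed by a 90-degree rotation about x:

```python
import numpy as np

# LPS -> Unity conversion matrix from the description above
lps_to_unity = np.array([
    [-1, 0,  0],
    [ 0, 0, -1],
    [ 0, 1,  0],
])

# Equivalent decomposition: mirror the x axis, then rotate 90 degrees about x
mirror_x = np.diag([-1.0, 1.0, 1.0])
theta = np.radians(90)
rot_x_90 = np.array([
    [1, 0, 0],
    [0, np.cos(theta), -np.sin(theta)],
    [0, np.sin(theta),  np.cos(theta)],
])
assert np.allclose(rot_x_90 @ mirror_x, lps_to_unity)

# A point 10 mm left, 20 mm posterior, 30 mm superior in LPS...
p_lps = np.array([10.0, 20.0, 30.0])
# ...lands at (-10, -30, 20) in Unity's left-handed, y-up system
print(lps_to_unity @ p_lps)
```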
Changes to the GameObject transforms
We have the following hierarchy:
VolumeRenderedObject
-> VolumeContainer
Previously the VolumeContainer would have a scale of 1 and no rotation, and we would apply all scaling to the parent VolumeRenderedObject.
Now the scaling is done on the VolumeContainer instead, so if you want "real" coordinates you can simply set the scale of the VolumeRenderedObject to 1. This is useful if you want to import and compare several related datasets (I've received some questions about that before..): you can import two datasets, and their sizes relative to each other should be correct. By default, though, we downscale the VolumeRenderedObject to a "normalised" size so that it doesn't become too big.
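The split between the two transforms can be sketched like this. This is a hypothetical Python illustration of the idea (Unity itself composes transforms for you); the class and function names are made up:

```python
class Transform:
    """Toy stand-in for a Unity transform: just a per-axis local scale."""
    def __init__(self, local_scale):
        self.local_scale = local_scale

def world_scale(parent, child):
    """World scale of the child = component-wise product of the scales."""
    return tuple(p * c for p, c in zip(parent.local_scale, child.local_scale))

# Per-axis scaling (aspect ratio and physical size) now lives on the
# VolumeContainer child...
container = Transform(local_scale=(0.3584, 0.3584, 0.5))

# ...so setting the parent VolumeRenderedObject to (1, 1, 1) yields the
# real-world size,
volume_rendered_object = Transform(local_scale=(1.0, 1.0, 1.0))
print(world_scale(volume_rendered_object, container))  # (0.3584, 0.3584, 0.5)

# while a "normalised" parent scale shrinks the whole volume uniformly
# without distorting its shape.
volume_rendered_object.local_scale = (0.5, 0.5, 0.5)
print(world_scale(volume_rendered_object, container))
```

Because the parent scale is uniform, changing it never distorts the dataset's shape; only the container encodes the per-axis (anisotropic) voxel spacing.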
#166