[WIP] add plot_anat_landmarks function #824
base: main
Conversation
add plot_anat_landmarks function
Codecov Report
@@            Coverage Diff             @@
##             main     #824      +/-   ##
==========================================
- Coverage   95.20%   94.67%   -0.54%
==========================================
  Files          24       25       +1
  Lines        3860     3887      +27
==========================================
+ Hits         3675     3680       +5
- Misses        185      207      +22
OK, this one should be good for review. See the rendered doc: https://4812-89170358-gh.circle-artifacts.com/0/dev/auto_examples/convert_mri_and_trans.html I will add a what's new entry at the end.
examples/convert_mri_and_trans.py
Outdated
print_dir_tree(output_path)

plot_anat_landmarks(t1w_bids_path, vmax=160)
plt.suptitle('T1 MRI')
Can this be moved into plot_anat_landmarks? It would be nice if the figure came with a sensible title by default.
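A minimal sketch of what that could look like inside the function; the title keyword and the default wording are assumptions, not the current API:

import matplotlib.pyplot as plt

def plot_anat_landmarks(bids_path, vmax=None, show=True, title=None):
    fig = plt.figure()
    # ... slice plotting and landmark markers would go here ...
    if title is None:
        # Hypothetical default: derive a title from the BIDS basename
        title = f'Anatomical landmarks ({bids_path.basename})'
    fig.suptitle(title)
    return fig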
Would also be good to briefly explain what vmax is doing here?
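For example, a short comment in the example script could make it explicit (the exact wording is only a suggestion):

# vmax is passed through to the image plot: intensities above this value are
# clipped to the top of the grayscale colormap, which usually improves the
# contrast of brain tissue.
plot_anat_landmarks(t1w_bids_path, vmax=160)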
examples/convert_mri_and_trans.py
Outdated
# landmarks Nasion, LPA, and RPA onto the brain image. For that, we can
# extract the location of Nasion, LPA, and RPA from the MEG file, apply our
# transformation matrix :code:`trans`, and plot the results.
Wondering if we should start off this paragraph differently or put it into a new subsection. The first sentence made me think, "umm, but we just DID that??" until I got to the part that says "... from the MEG file".
vmax : float
    Maximum colormap value.
Should we limit the acceptable range, e.g. to the interval [0, 255]?
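If a range check is wanted, a minimal sketch (the [0, 255] bounds are just the interval suggested above, and the message is a placeholder):

if vmax is not None and not 0 <= vmax <= 255:
    raise ValueError(f'vmax must be in the interval [0, 255], got {vmax}')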
mne_bids/viz.py
Outdated
fig : matplotlib.figure.Figure | None
    The figure object containing the plot. None if no landmarks
    are available.
Shouldn't we raise an exception if there are no landmarks?
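If we go that route, it could look roughly like this (a sketch; `landmarks` stands for whatever collection the function builds internally, and the exception type and message are up for discussion):

if not landmarks:
    raise ValueError(
        f'No anatomical landmarks found for {bids_path.basename}, '
        'nothing to plot.'
    )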
mne_bids/viz.py
Outdated
)

if show:
    plt.show()
Should this be fig.show() instead?
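For reference, plt.show() displays all open pyplot figures (and may block), whereas Figure.show() only raises this particular figure window; the suggested change would be:

if show:
    fig.show()  # raises only this figure, unlike plt.show()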
Co-authored-by: Richard Höchenberger <richard.hoechenberger@gmail.com>
…into plot_landmarks
Why do we need this? Is nilearn's plot_anat not sufficient?
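For comparison, roughly what one would have to do by hand with nilearn alone; a sketch assuming `t1_fname` and `landmarks_mri` (a mapping from fiducial name to (x, y, z) world coordinates in mm) are already computed:

from nilearn.plotting import plot_anat

display = plot_anat(t1_fname, vmax=160, title='T1 MRI')
for name, coord in landmarks_mri.items():
    # add_markers expects a list of world (scanner/MRI) coordinates in mm
    display.add_markers([coord], marker_size=30)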
@sappelhoff it's OK with me if we don't merge this. It was a very convenient tool for debugging recently, and I see it as potentially useful for checking existing datasets. But it's fine if we don't merge it; maybe it's not useful in the end...
Okay, I'll leave the decision to you and @hoechenberger - I am 50-50.
I would say let's wait... it also adds new dependencies on nilearn and matplotlib, which is not great...
Let me know what you think.