Print optimal number of maPCA components and plot optimization curves #839
Changes from 11 commits
@@ -29,7 +29,7 @@
 REQUIRES = [
     "bokeh<2.3.0",
-    "mapca~=0.0.1",
+    "mapca>=0.0.2",
     "matplotlib",
     "nibabel>=2.5.1",
     "nilearn>=0.7",

Review comment: Unfortunately we'll need to resolve the conflict with main, where we've moved everything into a different setup organization (in particular, …).
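For context on the specifier change above (this note is not from the PR itself): `~=0.0.1` is a PEP 440 "compatible release" clause, equivalent to `>=0.0.1, ==0.0.*`, so it admits 0.0.x releases only, while `>=0.0.2` raises the minimum and also permits future 0.1+ releases. A minimal hand-rolled check, for illustration only (real resolution is done by pip):

```python
def satisfies_compatible_release(version, spec):
    """Approximate PEP 440 '~=' for simple numeric versions.

    '~=X.Y.Z' means '>=X.Y.Z' together with '==X.Y.*': the last
    component may float upward, everything before it stays pinned.
    """
    v = tuple(int(p) for p in version.split("."))
    s = tuple(int(p) for p in spec.split("."))
    return v >= s and v[: len(s) - 1] == s[:-1]


# "mapca~=0.0.1" still admits 0.0.2, but rules out 0.1.0;
# "mapca>=0.0.2" admits both.
print(satisfies_compatible_release("0.0.2", "0.0.1"))  # True
print(satisfies_compatible_release("0.1.0", "0.0.1"))  # False
```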
@@ -288,3 +288,144 @@ def comp_figures(ts, mask, comptable, mmix, io_generator, png_cmap):
    compplot_name = os.path.join(io_generator.out_dir, "figures", plot_name)
    plt.savefig(compplot_name)
    plt.close()


def pca_results(criteria, n_components, all_varex, io_generator):

Review comment: This function has a large amount of repetition. Could we break it into smaller functions, parametrized perhaps by something like the input data and the label only, and, if matplotlib requires this, the figure itself?

    """
    Plot the PCA optimization curve for each criterion, and the variance explained curve.

    Parameters
    ----------
    criteria : array-like
        AIC, KIC, and MDL optimization values for increasing number of components.
    n_components : array-like
        Number of optimal components given by each criterion.
    all_varex : array-like
        Variance explained for increasing number of components.
    io_generator : object
        An object containing all the information needed to generate the output.
    """
    # Plot the PCA optimization curve for each criterion
    plt.figure(figsize=(10, 9))
    plt.title("PCA Criteria")
    plt.xlabel("PCA components")
    plt.ylabel("Arbitrary Units")

    # AIC curve
    plt.plot(criteria[0, :], color="tab:blue", label="AIC")
    # KIC curve
    plt.plot(criteria[1, :], color="tab:orange", label="KIC")
    # MDL curve
    plt.plot(criteria[2, :], color="tab:green", label="MDL")

    # Vertical line depicting the optimal number of components given by AIC
    plt.vlines(
        n_components[0],
        ymin=np.min(criteria),
        ymax=np.max(criteria),
        color="tab:blue",
        linestyles="dashed",
    )
    # Vertical line depicting the optimal number of components given by KIC
    plt.vlines(
        n_components[1],
        ymin=np.min(criteria),
        ymax=np.max(criteria),
        color="tab:orange",
        linestyles="dashed",
    )
    # Vertical line depicting the optimal number of components given by MDL
    plt.vlines(
        n_components[2],
        ymin=np.min(criteria),
        ymax=np.max(criteria),
        color="tab:green",
        linestyles="dashed",
    )
    # Vertical line depicting the optimal number of components for 90% variance explained
    plt.vlines(
        n_components[3],
        ymin=np.min(criteria),
        ymax=np.max(criteria),
        color="tab:red",
        linestyles="dashed",
        label="90% varexp",
    )
    # Vertical line depicting the optimal number of components for 95% variance explained
    plt.vlines(
        n_components[4],
        ymin=np.min(criteria),
        ymax=np.max(criteria),
        color="tab:purple",
        linestyles="dashed",
        label="95% varexp",
    )

    plt.legend()

    # Save the plot
    plot_name = "pca_criteria.png"
    pca_criteria_name = os.path.join(io_generator.out_dir, "figures", plot_name)
    plt.savefig(pca_criteria_name)
    plt.close()

    # Plot the variance explained curve
    plt.figure(figsize=(10, 9))
    plt.title("Variance Explained")
    plt.xlabel("PCA components")
    plt.ylabel("Variance Explained")

    plt.plot(all_varex, color="black", label="Variance Explained")

    # Vertical line depicting the optimal number of components given by AIC
    plt.vlines(
        n_components[0],
        ymin=np.min(all_varex),
        ymax=np.max(all_varex),
        color="tab:blue",
        linestyles="dashed",
        label="AIC",
    )
    # Vertical line depicting the optimal number of components given by KIC
    plt.vlines(
        n_components[1],
        ymin=np.min(all_varex),
        ymax=np.max(all_varex),
        color="tab:orange",
        linestyles="dashed",
        label="KIC",
    )
    # Vertical line depicting the optimal number of components given by MDL
    plt.vlines(
        n_components[2],
        ymin=np.min(all_varex),
        ymax=np.max(all_varex),
        color="tab:green",
        linestyles="dashed",
        label="MDL",
    )
    # Vertical line depicting the optimal number of components for 90% variance explained
    plt.vlines(
        n_components[3],
        ymin=np.min(all_varex),
        ymax=np.max(all_varex),
        color="tab:red",
        linestyles="dashed",
        label="90% varexp",
    )
    # Vertical line depicting the optimal number of components for 95% variance explained
    plt.vlines(
        n_components[4],
        ymin=np.min(all_varex),
        ymax=np.max(all_varex),
        color="tab:purple",
        linestyles="dashed",
        label="95% varexp",
    )

    plt.legend()

    # Save the plot
    plot_name = "pca_variance_explained.png"
    pca_variance_explained_name = os.path.join(io_generator.out_dir, "figures", plot_name)
    plt.savefig(pca_variance_explained_name)
    plt.close()
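One way the repetition flagged by the reviewer could be factored out — a sketch only, with hypothetical helper names, and with the plotting callable injected so the loop does not depend on live matplotlib state:

```python
# Hypothetical names; one (label, color) pair per entry of n_components,
# in the order pca_results receives them.
CRITERIA_STYLES = [
    ("AIC", "tab:blue"),
    ("KIC", "tab:orange"),
    ("MDL", "tab:green"),
    ("90% varexp", "tab:red"),
    ("95% varexp", "tab:purple"),
]


def draw_component_vlines(vlines, n_components, ymin, ymax, labeled=True):
    """Draw one dashed vertical line per criterion.

    `vlines` is the plotting callable (e.g. plt.vlines), passed in so the
    loop can be exercised without creating a figure.
    """
    for (label, color), x in zip(CRITERIA_STYLES, n_components):
        kwargs = {"ymin": ymin, "ymax": ymax, "color": color, "linestyles": "dashed"}
        if labeled:
            kwargs["label"] = label
        vlines(x, **kwargs)
```

With a helper like this, each of the two figures in `pca_results` would reduce to axis setup, the curve plot, one `draw_component_vlines` call, the legend, and the save, at the cost of a small behavioral difference: the first figure currently labels only the varexp lines, while this sketch labels all five when `labeled=True`.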
@@ -91,3 +91,4 @@ figures/comp_061.png
 figures/comp_062.png
 figures/comp_063.png
 figures/comp_064.png
+figures/pca_criteria.png
Review comment: This looks like we're accessing private components of the object. Could we either link to some documentation about the returned object or elaborate on what we're accessing here?

Reply: The trailing underscore indicates that the attribute is estimated from the data (via fit), rather than that it's private. We adopted this convention from scikit-learn.

Reply: Ah, that makes sense. Sorry, I had a momentary flip in my brain, where I put the underscore on the wrong side. I do think it's worth a quick reference to the docs.
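A minimal illustration of the convention discussed above (a toy class, not the actual mapca API): a single trailing underscore marks an attribute set by `fit` from the data, while a leading underscore marks a genuinely private attribute.

```python
class MeanScaler:
    """Toy estimator following the scikit-learn naming convention."""

    def __init__(self, scale=1.0):
        self.scale = scale  # plain name: a constructor parameter
        self._cache = None  # leading underscore: private implementation detail

    def fit(self, data):
        # Trailing underscore: estimated from the data during fit,
        # not private -- callers are expected to read it after fitting.
        self.mean_ = sum(data) / len(data)
        return self


scaler = MeanScaler().fit([1.0, 2.0, 3.0])
print(scaler.mean_)  # 2.0
```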