BUG: Fix parsing of MANGA cube (try 2) #3119

Merged 2 commits on Jul 31, 2024
8 changes: 8 additions & 0 deletions — CHANGES.rst

@@ -71,6 +71,9 @@ Cubeviz

- Spectral Extraction: renamed ``collapse_to_spectrum(...)`` to ``extract(...)``. [#2859]

- Generic FITS parsing now goes through ``specutils`` loaders first, if possible.
If a ``specutils`` loader is used, uncertainty is converted to standard deviation type. [#3119]

- Custom Spectrum1D writer format ``jdaviz-cube`` is removed. Use ``wcs1d-fits`` from
``specutils`` instead. [#2094]

@@ -112,6 +115,11 @@ Cubeviz

- Mouse over coordinates now responds to the selected surface brightness unit. [#2931]

- Fixed MaNGA cube loading. Uncertainty type is also handled properly now. [#3119]

- Fixed spectral axis value display in Markers plugin. Previously, it failed to display
very small values, resulting in zeroes. [#3119]

Imviz
^^^^^

41 changes: 26 additions & 15 deletions — jdaviz/configs/cubeviz/plugins/parsers.py

@@ -5,7 +5,7 @@
import numpy as np
from astropy import units as u
from astropy.io import fits
-
+from astropy.nddata import StdDevUncertainty
from astropy.time import Time
from astropy.wcs import WCS
from specutils import Spectrum1D
@@ -68,11 +68,18 @@ def parse_data(app, file_obj, data_type=None, data_label=None,
# generic enough to work with other file types (e.g. ASDF). For now, this
# supports MaNGA and JWST data.
    if isinstance(file_obj, fits.hdu.hdulist.HDUList):
-        _parse_hdulist(
-            app, file_obj, file_name=data_label,
-            flux_viewer_reference_name=flux_viewer_reference_name,
-            uncert_viewer_reference_name=uncert_viewer_reference_name
-        )
+        try:
+            _parse_spectrum1d_3d(
+                app, Spectrum1D.read(file_obj), data_label=data_label,
+                flux_viewer_reference_name=flux_viewer_reference_name,
+                uncert_viewer_reference_name=uncert_viewer_reference_name
+            )
+        except Exception:  # nosec
Member: Can we do anything more specific than a general exception, which could end up catching generic typos, etc.?

Contributor (Author): @rosteen, any idea? I don't think there is a way to know whether an arbitrary input will succeed or fail in the specutils I/O registry until you actually try it.

Member (@kecnry, Jul 31, 2024): But is there even a specific exception we can catch instead of the catch-all?

Contributor (Author): If you mean the exception, I purposely made it general here to be backward compatible. If specutils cannot parse the file for whatever reason, parsing falls back to the old way (until we can get rid of it, which I think we should).

Contributor (Author): "But is there a cleaner solution?" Yes, there is: we could write a custom parser for each format that specutils cannot read. But I have neither that list nor the time.

Member: My concern is just that some random bug or typo could result in the except always being triggered, with no way of knowing or noticing it in the tests. Maybe for now, showing a snackbar when the except is entered, and then checking for that snackbar message in test coverage, would do?

Contributor (Author): I do not see how this is worse than what we already have. Currently, the file goes through our own HDU parser anyway. With this change, it would still do that, and would still crash, as it does now, if both specutils and the HDU parser cannot handle it. The only difference introduced by this PR is that it tries specutils first. A snackbar message before the fallback would just be confusing if parsing ends up succeeding later.

Member: Agreed from the user perspective; I'm more concerned about testing and ending up with stale code without noticing (or it just not being clear which path a file took). But as long as there is a plan to consolidate these later, I guess this is fine for now. Thanks!

Contributor (Author): I think @rosteen is going to end up refactoring this for his specutils 2.0 work?

+            _parse_hdulist(
+                app, file_obj, file_name=data_label,
+                flux_viewer_reference_name=flux_viewer_reference_name,
+                uncert_viewer_reference_name=uncert_viewer_reference_name
+            )
elif isinstance(file_obj, str):
if file_obj.lower().endswith('.gif'): # pragma: no cover
_parse_gif(app, file_obj, data_label,
@@ -129,11 +136,18 @@ def parse_data(app, file_obj, data_type=None, data_label=None,
flux_viewer_reference_name=flux_viewer_reference_name,
)
    else:
-        _parse_hdulist(
-            app, hdulist, file_name=data_label or file_name,
-            flux_viewer_reference_name=flux_viewer_reference_name,
-            uncert_viewer_reference_name=uncert_viewer_reference_name
-        )
+        try:
+            _parse_spectrum1d_3d(
+                app, Spectrum1D.read(hdulist), data_label=data_label or file_name,
+                flux_viewer_reference_name=flux_viewer_reference_name,
+                uncert_viewer_reference_name=uncert_viewer_reference_name
+            )
+        except Exception:  # nosec
Member: Same as above.

Contributor (Author): Replied at #3119 (comment).
+            _parse_hdulist(
+                app, hdulist, file_name=data_label or file_name,
+                flux_viewer_reference_name=flux_viewer_reference_name,
+                uncert_viewer_reference_name=uncert_viewer_reference_name
+            )
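The review thread above concerns the intentionally broad `except` around the specutils attempt. The control flow the PR introduces, try the specutils loader first and fall back to the legacy HDUList parser, can be sketched generically. The names here (`try_loaders`, `failing_loader`) are hypothetical stand-ins for the `Spectrum1D.read`-based path and `_parse_hdulist`:

```python
def try_loaders(file_obj, loaders):
    """Return the result of the first loader that succeeds.

    Generic sketch of the try-specutils-first-then-fall-back pattern:
    in the PR, `loaders` would effectively be the Spectrum1D.read-based
    parse followed by the legacy _parse_hdulist parse, in that order.
    """
    last_err = None
    for loader in loaders:
        try:
            return loader(file_obj)
        except Exception as err:  # intentionally broad, mirroring the PR
            last_err = err
    raise last_err  # both paths failed: crash, just as the current code does


def failing_loader(f):
    # Stand-in for a specutils registry failure on an unrecognized format.
    raise ValueError("registry could not parse this file")


result = try_loaders("manga_cube.fits",
                     [failing_loader, lambda f: f"parsed {f} manually"])
print(result)  # parsed manga_cube.fits manually
```

This makes the reviewer's concern concrete: any exception in the first loader, including a typo-level bug, silently routes files to the fallback, which is why the thread discusses how to notice that in tests.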

# If the data types are custom data objects, use explicit parsers. Note
# that this relies on the glue-astronomy machinery to turn the data object
@@ -412,10 +426,7 @@ def _parse_spectrum1d_3d(app, file_obj, data_label=None,
        if attr == "mask":
            flux = val << u.dimensionless_unscaled  # DQ flags have no unit
        elif attr == "uncertainty":
-            if hasattr(val, "array"):
-                flux = u.Quantity(val.array, file_obj.flux.unit)
-            else:
-                continue
+            flux = val.represent_as(StdDevUncertainty).quantity
        else:
            flux = val

12 changes: 6 additions & 6 deletions — jdaviz/core/template_mixin.py

@@ -4546,24 +4546,24 @@ def float_precision(column, item):
            return f"{item:0.3f}"
        elif column in ('xcenter', 'ycenter'):
            return f"{item:0.1f}"
-        elif column in ('sum', ):
+        elif column in ('sum', 'spectral_axis'):
            return f"{item:.3e}"
        else:
            return f"{item:0.5f}"

    if isinstance(item, SkyCoord):
        return item.to_string('hmsdms', precision=4)
-    if isinstance(item, u.Quantity) and not np.isnan(item):
+    elif isinstance(item, u.Quantity) and not np.isnan(item):
        return f"{float_precision(column, item.value)} {item.unit.to_string()}"
-    if hasattr(item, 'to_string'):
+    elif hasattr(item, 'to_string'):
        return item.to_string()
-    if isinstance(item, float) and np.isnan(item):
+    elif isinstance(item, float) and np.isnan(item):
        return ''
-    if isinstance(item, tuple) and np.all([np.isnan(i) for i in item]):
+    elif isinstance(item, tuple) and np.all([np.isnan(i) for i in item]):
        return ''
-    if isinstance(item, float):
+    elif isinstance(item, float):
        return float_precision(column, item)
    elif isinstance(item, (list, tuple)):
        return [float_precision(column, i) if isinstance(i, float) else i for i in item]
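The `'spectral_axis'` addition above is what fixes the zero display in the Markers plugin: fixed-point formatting truncates very small values, while the `:.3e` scientific format preserves them. A quick illustration (the sample wavelength is made up):

```python
value = 4.861e-07  # e.g. a spectral-axis value in meters; any tiny float will do

old_style = f"{value:0.5f}"  # the previous catch-all format
new_style = f"{value:.3e}"   # the format now used for 'spectral_axis'

print(old_style)  # 0.00000  -- reads as zero
print(new_style)  # 4.861e-07
```

So before this PR, any spectral-axis value smaller than about 5e-6 rendered as `0.00000` in the table.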