
Updated lronaccal to use time-dependent darkfiles for dark calibration #4520

Merged
merged 14 commits into from
Jul 14, 2022

Conversation

victoronline
Contributor

@victoronline victoronline commented Jun 8, 2021

Update to lronaccal to use time-dependent dark files for dark correction, and to use dedicated dark files for images with an exposure code of zero.

Description
The dark average produced depends on which options are selected.

If the custom dark file option is selected, the provided dark file will be used.

If the image has an exposure code of zero, the dark files nearest in time that also have an exposure code of zero will be used.

If UseNearestTimeDarkFile option is selected, the dark file with the nearest time to the image will be used for dark correction.

If UseNearestTimeDarkFilePair option is selected, the pair of dark files that the image time lies between will be used, as long as the difference between dark files doesn't exceed 45 days. If a suitable pair is not found, the latest dark file taken before the image will be used.
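The pair-selection rule above can be sketched as follows. This is a minimal illustration, not the actual lronaccal implementation; the helper name and signature are hypothetical, and times are assumed to be in days:

```cpp
#include <algorithm>
#include <utility>
#include <vector>

// Hypothetical sketch of the UseNearestTimeDarkFilePair rule: choose the pair
// of dark times that brackets the image time, provided the two darks are no
// more than maxGap (45) days apart; otherwise fall back to the latest dark
// acquired before the image. Returns indices into the sorted time list, with
// second == -1 signalling the single-file fallback.
std::pair<int, int> pickDarkPair(std::vector<double> darkTimes,
                                 double imgTime, double maxGap = 45.0) {
  std::sort(darkTimes.begin(), darkTimes.end());
  int before = -1;  // index of the latest dark taken at or before the image
  for (int i = 0; i < static_cast<int>(darkTimes.size()); ++i) {
    if (darkTimes[i] <= imgTime) before = i;
  }
  bool hasNext = before >= 0 &&
                 before + 1 < static_cast<int>(darkTimes.size());
  if (hasNext && darkTimes[before + 1] - darkTimes[before] <= maxGap) {
    return {before, before + 1};  // suitable bracketing pair found
  }
  return {before, -1};  // fall back to the latest dark before the image
}
```

For example, with darks at days 0, 30, and 100, an image at day 10 gets the (0, 30) pair, while an image at day 50 falls back to the day-30 dark alone because the bracketing gap (70 days) exceeds 45.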

The equation used:

pixel_dark_average = (avgDarkLine1_pixel * |darkFile1_time - time| + avgDarkLine2_pixel * |darkFile2_time - time|)
                     / (|darkFile1_time - time| + |darkFile2_time - time|)
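As a small, self-contained function, the weighting is a direct transcription of the equation as written above (illustrative only, not the actual lronaccal code):

```cpp
#include <cmath>

// Direct transcription of the weighted dark average equation: each dark
// line's average is weighted by its absolute time distance from the image,
// and the sum is normalised by the total of the two distances.
double weightedDarkAverage(double avgDarkLine1, double darkFile1Time,
                           double avgDarkLine2, double darkFile2Time,
                           double imgTime) {
  double w1 = std::fabs(darkFile1Time - imgTime);
  double w2 = std::fabs(darkFile2Time - imgTime);
  return (avgDarkLine1 * w1 + avgDarkLine2 * w2) / (w1 + w2);
}
```

For an image exactly midway between the two darks the weights are equal, so the result is the plain mean of the two dark line averages.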

How Has This Been Tested?

This has been tested by the science team, which compared photometric products generated using all dark file options and validated the results.

Types of changes

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Documentation change (update to the documentation; no code change)
  • Breaking change (fix or feature that would cause existing functionality to change) (probably)

Checklist:

  • I have read and agree to abide by the Code of Conduct
  • I have read the CONTRIBUTING document.
  • My change requires a change to the documentation.
  • I have updated the documentation accordingly.
  • I have added tests to cover my changes.
  • All new and existing tests passed.
  • I have added myself to the .zenodo.json document.
  • I have added any user impacting changes to the CHANGELOG.md document.

Licensing

This project is mostly composed of free and unencumbered software released into the public domain, and we are unlikely to accept contributions that are not also released into the public domain. Near the top of each file should appear these words:

This work is free and unencumbered software released into the public domain. In jurisdictions that recognize copyright laws, the author or authors of this software dedicate any and all copyright interest in the software to the public domain.

  • I dedicate any and all copyright interest in this software to the public domain. I make this dedication for the benefit of the public at large and to the detriment of my heirs and successors. I intend this dedication to be an overt act of relinquishment in perpetuity of all present and future rights to this software under copyright law.

…also allowed passthru of certain special pixel types instead of nulling them
@victoronline
Contributor Author

What do we need to do to get this moving forward? Can I work on the missing pieces, like gtest, or do I have to wait for the review to be made?


Contributor

@jessemapel jessemapel left a comment


@victoronline Sorry I just realized I've had this review just sitting for a month now.

isis/src/lro/apps/lronaccal/lronaccal.xml
<parameter name="DarkFileType">
<type>string</type>
<brief>
Custom Dark, Nearest Dark, Nearest Dark Pair?
Contributor


This description doesn't mean anything to me. Is there a short, one sentence, way to describe this?

Contributor Author


Updated with better documentation.

isis/src/lro/apps/lronaccal/lronaccal.xml
@jessemapel
Contributor

You can start working on testing before review. You'll need a re-review after you push up tests too.

Updated documentation to provide more clarity regarding dark file correction/calibration options available to user.
@jessemapel
Contributor

@victoronline ping me for a re-review when this is ready.

@victoronline
Contributor Author

@jessemapel This should be ready now. Let me know if you see anything else.

@victoronline
Contributor Author

@jessemapel, will this make it into the next sprint?

@victoronline
Contributor Author

Can we get an update on this?

@jlaura
Collaborator

jlaura commented Nov 13, 2021

@victoronline @jessemapel has not been in the office or able to look at this in the time available.

From my perspective, this is still an incomplete PR, as I am not seeing any tests that are ready for review. Perhaps I missed them in the change list?

From Aug 23:

You can start working on testing before review. You'll need a re-review after you push up tests too.

@victoronline
Contributor Author

I think Jesse was going to help with the tests, and we've never done anything with the GTest suite. But if this is required, I can start working on it. I didn't catch his note about tests.

@jlaura
Collaborator

jlaura commented Nov 18, 2021

@victoronline Sounds great! Docs on getting started writing tests are here: https://github.com/USGS-Astrogeology/ISIS3/wiki/Writing-ISIS3-Tests-Using-Gtest-and-Ctest

@jessemapel jessemapel self-assigned this Feb 11, 2022
@jessemapel
Contributor

For testing, this needs a test for each option in DarkFileType.

@jlaura
Collaborator

jlaura commented May 24, 2022

@victoronline What is the status on test writing for this?

@victoronline
Contributor Author

I should be able to submit it later today. The tests have taken a bit of time; I just have to reduce the cubes used for testing and compare cube stats. The tools we used to validate were fx and cubediff, and those aren't yet callable. Comparing stats will hopefully be OK for review.

@jlaura
Collaborator

jlaura commented May 24, 2022

@victoronline sounds good. The reviewers will weigh in on the testing method. Looking forward to seeing the code in!

@victoronline
Contributor Author

I have an input cube that I can't crop; LRONACCAL won't run on cropped cubes. I'll zip it and email it to you, Jay, since I'm not sure who will review this.

Collaborator

@jlaura jlaura left a comment


Requesting changes so that I can track the domain review on this. This request should not block the code re-review that needs to happen.

Collaborator

@Kelvinrr Kelvinrr left a comment


Just minor suggestions in line.

Comment on lines +33 to +47
void ResetGlobals();
void CopyCubeIntoVector(QString &fileString, vector<double> &data);
void ReadTextDataFile(QString &fileString, vector<double> &data);
void ReadTextDataFile(QString &fileString, vector<vector<double> > &data);
void Calibrate(Buffer &in, Buffer &out);
void RemoveMaskedOffset(Buffer &line);
void CorrectDark(Buffer &in);
void CorrectNonlinearity(Buffer &in);
void CorrectFlatfield(Buffer &in);
void RadiometricCalibration(Buffer &in);
void GetNearestDarkFile(QString fileString, QString &file);
void GetNearestDarkFilePair(QString &fileString, QString &file0, QString &file1);
void GetCalibrationDirectory(QString calibrationType, QString &calibrationDirectory);
void GetWeightedDarkAverages();
bool AllowedSpecialPixelType(double pixelValue);
Collaborator


Might be worth making these static; since they are added to libisis, they risk overlapping with something else in the ISIS namespace.

Comment on lines +77 to +86
double g_radianceLeft, g_radianceRight, g_iofLeft, g_iofRight, g_imgTime;
double g_exposure; // Exposure duration
double g_solarDistance; // average distance in [AU]

bool g_summed, g_masked, g_maskedLeftOnly, g_dark, g_nonlinear, g_flatfield, g_radiometric, g_iof, g_isLeftNac;
bool g_nearestDark, g_nearestDarkPair, g_customDark;
vector<int> g_maskedPixelsLeft, g_maskedPixelsRight;
vector<double> g_avgDarkLineCube0, g_avgDarkLineCube1, g_linearOffsetLine, g_flatfieldLine, g_darkTimes, g_weightedDarkTimeAvgs;
vector<vector<double> > g_linearityCoefficients;
Buffer *g_darkCube0, *g_darkCube1;
Collaborator


Same with these, and move them to be with the rest of the function definitions.

vector <QString> darkFiles;

if(g_dark) {
QString darkFileType = ui.GetString("DARKFILETYPE");
Contributor


I realize some of this code hasn't changed from the original version, but FYI: when a parameter has a list of options (i.e., a list of valid values, presented as radio buttons), GetString will give back the full value as specified in the XML, so the toUpper isn't necessary.

darkFileType = darkFileType.toUpper();
if (darkFileType == "CUSTOM") {
g_customDark = true;
ui.GetAsString("DARKFILE", darkFiles);
Contributor


Why use GetAsString for a param defined to be a file name? This doesn't allow the user to use environment or ISIS variables (e.g., $LRO) unless you do the conversion to a FileName and expand it, which is what GetFileName does for you.

}

if(g_nonlinear) {
offsetFile = ui.GetAsString("OFFSETFILE");
Contributor


This may have been easier if an "internal default" had been used along with an "if ui.WasEntered" check.

@victoronline
Contributor Author

Thank you for the suggestions; I will keep them in mind for future contributions. I hope these do not block the PR from merging. Jesse already reviewed the app code; this review was to be for the GTests and for the refactoring of the app to be callable.

@jlaura
Collaborator

jlaura commented Jun 14, 2022

@victoronline I am explicitly requesting that changes be made to address the review comments above. If you have any questions, please let me know.

@Kelvinrr
Collaborator

Kelvinrr commented Jun 14, 2022

The suggestion to make the globals static (and therefore local to the .cpp file) is part of the app conversion process. After the conversion, those symbols are added to libisis rather than being localized in their own application code, and can therefore cause problems down the line. Making them static easily solves the problem in the short term without having to re-engineer things to avoid globals.
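To illustrate the linkage point with a generic example (the variable values here are placeholders, though the names come from the PR's declarations): internal linkage keeps file-local helpers out of the shared library's exported symbol table, so identically named symbols in other translation units cannot collide at link time.

```cpp
// Two equivalent ways to give file-scope helpers internal linkage so they
// are not exported from libisis. The `static` keyword matches the reviewer's
// suggestion; an anonymous namespace is the more idiomatic C++ spelling.

static int g_resetCount = 0;                 // internal linkage via `static`
static void ResetGlobals() { g_resetCount = 0; }

namespace {                                   // internal linkage via anonymous namespace
  double g_exposure = 0.0;
  void SetExposure(double e) { g_exposure = e; }
}
```

Either form prevents a second translation unit that also defines, say, `ResetGlobals` from producing a duplicate-symbol clash in the library.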

@victoronline
Contributor Author

@jlaura, please keep in mind that we don't only focus on this software; mission teams are not solely working on ISIS. We are currently far beyond the point where we are out of development bandwidth for this PR. I can try to submit the "making globals static" change that Kelvin suggested later today, as he is correct that it is part of the app conversion process, which in itself is a requirement retroactively applied to this PR. But after that, we don't have any further resources available to throw at this currently.

The retroactive adding of requirements and the lack of clear, error-free documentation have made this process grossly inefficient. It is the reason I suggested a walk-through: to identify these "gotcha" late requirements and avoid further delays. I expect to receive suggestions in a code review, and I welcome them, as they are the purpose of a code review. However, adding changes to code that was already reviewed, and adding new requirements that didn't exist when the PR was submitted each time previously requested fixes are delivered, causes unnecessary delays. This is not best practice, nor the best use of development resources. This PR is now 1 year and 7 days old. Please advise. Thank you.

@scsides
Contributor

scsides commented Jun 14, 2022

Those suggestions were just for future reference. They will not keep the PR from being merged

@jlaura
Collaborator

jlaura commented Jun 14, 2022

The comments from @Kelvinrr need to be addressed.

I realize you are unhappy about the length of time this PR has been open, and I have not directly addressed this. On Nov. 18, 2021 I informed you, @victoronline, that you needed to write tests. 21 days ago, I pinged you about the status of the tests. The vast majority of the time this PR has been open has been spent waiting on you to submit tests. Neither my team nor I have control over the amount of time it took to get the tests. We have been consistently clear that tests are needed to merge.

During the time that this PR has sat dormant, we have become aware of USGS requirements (a domain review) that need to be met for all code that we release publicly. As I indicated above, the requirement that code released by USGS be of the highest quality is in line with existing best practices. The only difference between when this PR was originally submitted and today is that the submitter needs to provide some sort of documentation that the review has been met. Had tests been written promptly around Nov. 18, 2021, we would not have been aware of that requirement. We do not grandfather in code submissions.

To be blunt - We will not merge code that does not pass the reviews as we are taking long term maintenance of this code on, not you. Additionally, more time and energy is being spent pushing us to merge this without the needed changes than it requires for you to actually make the changes. What exactly is your goal here?

@victoronline
Contributor Author

@jlaura Thank you for your reply. When I initially submitted the issue and accompanying PR, having gtests written was not a requirement. The understanding at the time, as with the half-pixel fix, was that Jesse would do the gtests since the documentation was not fully in place. I mentioned this on Nov 17, 2021. Although the notes in this PR do not capture all the comments on this, at that time, in our TC meetings, it was mentioned that support would be provided to help mission teams with gtests, considering that documentation was lacking and the framework was new and vastly different from what was in place beforehand. Since then, the delay has included not only our failure to provide gtests, but also delays that were not caused by us. Again, it was mentioned that you all would work with mission teams during this transition. But simply informing users what the requirements now are and pointing to documentation does not, in my opinion, amount to working with folks.

To also be blunt, working with us to get this PR passed would not have taken much effort on your side, and the result would have been documented for future reference. Specifically addressing tests being written promptly: assistance was repeatedly requested by mission teams via TC meetings. It was said in reply, multiple times, that support would be provided, but as far as I'm aware it only came in the form of GTest documentation. I requested assistance in the form of a hands-on walkthrough via additional funding during the LROC sprint, and one of your developers very rudely and unprofessionally refused to help, saying he wasn't going to "hold our hands" and that it's not what "he signed up for". The remaining developer did work with us, and in doing so we both identified inaccuracies and mistakes in the documentation. It took the two of us, working together, a while to complete one GTest for LRONACPHO. I don't understand the expectation that we would promptly write these tests when that wasn't possible even with the assistance of one of your developers.

I'm not trying to cause any issues, but this is becoming increasingly frustrating. I freely volunteer my time to help in any way I can with this project. You've mentioned that you are interested in working with mission teams in the spirit of collaboration, to move this project to a place where not only core developers are responsible for contributing but the community as a whole undertakes that responsibility. I'm honestly trying to help with that process. I also understand that your team is tasked with supporting this project long-term. In my opinion, when making such large changes to a project, it is to your benefit to help your users along, not to demand they "jump on or jump off". This requires flexibility. I don't think the current approach really helps your cause of bringing folks on board to contribute to this project, and making an example of this PR is not in anyone's best interest.

Addressing this comment: "Additionally, more time and energy is being spent pushing us to merge this without the needed changes than it requires for you to actually make the changes." I wish that were the case. I've requested the write-up you've asked for from Emerson; we will have to wait until his time frees up, as we are all currently working not only on LROC but also on KPLO, which launches shortly. I also have to take additional time away from other work to make these changes and apply them to the other fork that we use internally, since we are not yet on the bleeding-edge version, and that of course has to be retested. I also expect that something will be required of us regarding the test cube that is too large for your tests. So no, I don't agree with this comment. I'm not asking you to merge without the changes; I was asking for assistance with those changes for the reasons I've mentioned.

I know that this discussion is not for this forum and I apologize for my part in causing it to go on this tangent. So I'll just speak about my goal, as you've requested.

My goal here is to get this PR merged for the LROC users that depend on it for work related to LROC images. We, unfortunately, don't have any more time available for this PR. I am asking if you all can help push it through. I understand if this cannot be accommodated at this time; please advise as to the next steps if that's the case. Would you prefer I keep commenting so the bot won't close this PR until time becomes available later this year or next, or should I close it and open a new one when we have more time available? Again, your help is appreciated, and I hope you all are enjoying the weather up there as it is now 200 degrees down here. I plan on doing some mountain biking up there soon. Thanks!

Collaborator

@Kelvinrr Kelvinrr left a comment


Found other issues with the tests. Looks like it applies to all 3 tests.

QString iCubeFile = "data/lronaccal/input/M1333276014R.cub";
QString oCubeFile = outputDir.path() + "/out.default.cub";
QString oCubeCropFile = outputDir.path() + "/out.default.crop.cub";
QString tCubeFile = "data/lronaccal/truth/M1333276014R.default.crop.cub";
Collaborator


What is tCubeFile doing here? It's not used for output, but it is used to check histograms against anyway? Even if it were output, it should be going into a temp directory.

Contributor Author


It is comparing against truth cubes, so no, they aren't output cubes. But again, we are using truth cubes as opposed to hard-coded values; there wasn't guidance as to which would be preferred. I thought it would be better to be able to switch out truth cubes as changes to the app occur, so the test code would not need to change.

geometrically transformed (i.e. scaled, rotated, sheared, or
reflected) or cropped
*/
QString iCubeFile = "data/lronaccal/input/M1333276014R.cub";
Collaborator


This file doesn't exist; only files from what looks like the old truth directory seem to be there?

Contributor Author


It is the large cube I sent to Jay. It doesn't exist in the repo because it was too large.

Comment on lines +69 to +72
EXPECT_NEAR(oCubeStats->Average(), tCubeStats->Average(), 0.001);
EXPECT_NEAR(oCubeStats->Sum(), tCubeStats->Sum(), 0.001);
EXPECT_NEAR(oCubeStats->ValidPixels(), tCubeStats->ValidPixels(), 0.001);
EXPECT_NEAR(oCubeStats->StandardDeviation(), tCubeStats->StandardDeviation(), 0.001);
Collaborator


I imagine the intention was to use the old truth cube as the baseline? Why not use expected values here and not have a truth cube at all?

Contributor Author

@victoronline victoronline Jun 14, 2022


This is because the values we actually tested and validated are based on an uncropped cube. Honestly, I thought it was best to have it this way, where the truth cube can be switched out if the app code changes, instead of leaving hard-coded values that would need to be updated.

@victoronline
Contributor Author

@Kelvinrr Addressed comments. It's the same way we did it in PR#4512. FYI

@jlaura
Collaborator

jlaura commented Jun 14, 2022

@victoronline This is way too much off-topic discussion for this issue. You are expressing fundamental concerns about the process that can be discussed either between LROC and the ASC contributors or between you and the ISIS TC. We have very different perspectives on the course of this PR and what is expected of folks, so follow-on discussion is absolutely warranted.

Thanks for continuing to work with the reviewers to ensure that this code is ready to merge once we have the needed domain review information.

@Kelvinrr
Collaborator

Kelvinrr commented Jun 15, 2022

@victoronline I specifically recall you reducing those images pretty significantly and the tests passing.

I think the best course of action, considering the sunk cost in this PR, is to just merge it; I can do a follow-up PR fixing the tests after either @victoronline or @jlaura sends me those test images. I'll either reduce them for the test or shove them into ISISDATA for now so the test doesn't just fail. I have my and @scsides's suggested changes locally already.

@jlaura
Collaborator

jlaura commented Jun 15, 2022

@Kelvinrr The test images are attached above.

@Kelvinrr
Collaborator

Kelvinrr commented Jun 15, 2022

@jlaura I got it onto ISISTESTDATA at $ISISTESTDATA/isis/src/lro/apps/lronaccal/M1333276014R.cub, so imho either @victoronline can simply update the path for his PR, or we can merge this and make a separate PR with the requested changes.

geometrically transformed (i.e. scaled, rotated, sheared, or
reflected) or cropped
*/
QString iCubeFile = "data/lronaccal/input/M1333276014R.cub";
Collaborator

@Kelvinrr Kelvinrr Jun 16, 2022


The bare minimum to get this merged and passing code review is to change this to $ISISTESTDATA/isis/src/lro/apps/lronaccal/M1333276014R.cub, @victoronline, for all three tests.

@Kelvinrr
Collaborator

@victoronline what's the status on this? Would you still be able to make the data path change soon?

geometrically transformed (i.e. scaled, rotated, sheared, or
reflected) or cropped
*/
QString iCubeFile = "data/lronaccal/input/M1333276014R.cub";
Collaborator


@victoronline any updates on this? Again, all we need is a ctrl+c ctrl+v of $ISISTESTDATA/isis/src/lro/apps/lronaccal/M1333276014R.cub over these iCubeFile paths for the tests to pass review.

@victoronline
Contributor Author

victoronline commented Jul 14, 2022

DOMAIN REVIEW:

The documentation I received on this is below. It's a more detailed explanation than exists in the description in the PR above. I don't know if this is what you are looking for because it doesn't go into detail as to how our scientists verified, analyzed, and validated their work. However, it does add more detail to the process used to arrive at the change made to the application. Please let me know if you need further information so I can request it from our science team. The actual equation is in the PR's description.

"Under the old version of LRONACCAL, we corrected all NAC left and right observations with a single pair of dark files derived early on during the LRO mission. We noticed a vertical striping in LROC images acquired later in the mission. We identified that these same stripes were also visible in NAC dark observations collected around the same time. We, therefore, developed a time-dependent dark correction to remove artifacts that have changed throughout the mission. To complete this step, we gathered all the dark observations acquired each month and averaged them into a single file representing the dark current response for that period.
When calibrating a NAC observation with the default settings, LRONACCAL will look for the nearest pair of dark files and compute a weighted average to derive the dark response at the time of image acquisition. We verified that this method works by looking at the statistical output of the dark response with the old correction and new implementation. We found that the new time-dependent dark correction successfully removes any vertical striping pattern from the image.
In addition, we also identified that images acquired with an exposure code of <10 had different dark current response characteristics. Since LROC only has a handful of observations with exposure codes 1-9, we only derived a correction for images with exposure codes of 0 and exposure codes >= 10. Therefore, images with exposure codes of 0 will use the exposure code 0 dark correction and the remaining images will use the standard correction derived from images with exposure codes >=10."
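The exposure-code rule in the last paragraph of the domain review amounts to a simple dispatch. This is a sketch with hypothetical names; the real application selects between dark file sets rather than returning strings:

```cpp
#include <string>

// Sketch of the exposure-code dispatch described above. Only two dark
// corrections exist: one derived from exposure-code-0 darks and one derived
// from darks with exposure codes >= 10. The rare codes 1-9 have no dedicated
// correction and fall through to the standard set.
std::string darkSetForExposureCode(int expCode) {
  if (expCode == 0) {
    return "zero-code";  // use the darks acquired with exposure code 0
  }
  return "standard";     // codes 1-9 and >= 10 use the standard darks
}
```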

@jlaura
Collaborator

jlaura commented Jul 14, 2022

@victoronline This looks good. Are you able to say whether this was manually checked on a few images, tens of images, or more, to confirm that the change is the current best-known implementation for getting the data that your science team wants to see?

@victoronline
Contributor Author

victoronline commented Jul 14, 2022

@jlaura I am not able to speak to this but can ask our science team for more information regarding the number of images used to arrive at this result. This is the way our science team is handling the calibration of the NAC to remove vertical striping; it is how we are processing images. If the response confirms that this is the best-known method, will there be additional proof required, or just a confirmation?

@victoronline
Contributor Author

@jlaura I may not be understanding your question. Are you asking whether the change is the best-known implementation, or are you asking how many images we used to arrive at this conclusion?

@jlaura
Collaborator

jlaura commented Jul 14, 2022

Just confirmation.

We verified that this method works by looking at the statistical output of the dark response with the old correction and new implementation. We found that the new time-dependent dark correction successfully removes any vertical striping pattern from the image.

I'm going to +1 this and remove my 'Changes requested' based on the above statement. I am curious whether this has been broadly applied to the dataset or just to a handful of images.

Thanks for the PR.

@jlaura
Collaborator

jlaura commented Jul 14, 2022

Also, this should be included in the 7.1 RC coming out in a few weeks and then officially released ~1 month after that.

@jlaura jlaura merged commit 9b10811 into DOI-USGS:dev Jul 14, 2022
@Kelvinrr Kelvinrr mentioned this pull request Jul 15, 2022
Kelvinrr pushed a commit to gsn9/ISIS3 that referenced this pull request Jul 18, 2022
Updated lronaccal to use time-dependent darkfiles for dark calibration (DOI-USGS#4520)

* updated to use time dependent darks including darks for exp zero and also allowed passthru of certain special pixel types instead of nulling them

* Updated lronaccal.xml documentation

Updated documentation to provide more clarity regarding dark file correction/calibration options available to user.

* Updated formatting of text

* Added better documentation

* fixed formatting

* removed extra tabs

* Add radiance units label in lrowaccal output cube

* refactored code to allow callable for compatibility with gtest

* added gtests for 3 file types and made app callable

* Added changes for lronaccal to CHANGELOG.md

Added changes made to lronaccal and gtests to CHANGELOG.md.

Co-authored-by: Cordell Michaud <10409047+michaudcordell@users.noreply.github.com>