Updated lronaccal to use time-dependent darkfiles for dark calibration #4520
Conversation
…also allowed passthru of certain special pixel types instead of nulling them
What do we need to do to get this moving forward? Can I work on the missing pieces, like gtest, or do I have to wait for the review?
@victoronline Sorry I just realized I've had this review just sitting for a month now.
<parameter name="DarkFileType">
  <type>string</type>
  <brief>
    Custom Dark, Nearest Dark, Nearest Dark Pair?
This description doesn't mean anything to me. Is there a short, one sentence, way to describe this?
Updated with better documentation.
You can start working on testing before review. You'll need a re-review after you push up tests too.
Updated documentation to provide more clarity regarding the dark file correction/calibration options available to the user.
@victoronline ping me for a re-review when this is ready.
@jessemapel This should be ready now. Let me know if you see anything else.
@jessemapel, will this make it into the next sprint?
Can we get an update on this?
@victoronline @jessemapel has not been in the office or able to look at this in the time available. From my perspective, this is still an incomplete PR, as I am not seeing any tests that are ready for review. Perhaps I missed them in the change list? From Aug 23:
I think Jesse was going to help with the tests, and we've never done anything with the GTest suite. But if this is required, I can start working on it. I didn't catch his note about tests.
@victoronline Sounds great! Docs on getting started writing tests are here: https://github.com/USGS-Astrogeology/ISIS3/wiki/Writing-ISIS3-Tests-Using-Gtest-and-Ctest
For testing, this needs a test for each option in DarkFileType. |
Add radiance units label in lrowaccal output cube
@victoronline What is the status on test writing for this?
I should be able to submit it later today. The tests have taken a bit of time. I just have to reduce the cubes used for testing and compare cube stats. The tools we used to validate were fx and cubediff, and those aren't yet callable. Comparing stats will hopefully be OK for review.
@victoronline sounds good. The reviewers will weigh in on the testing method. Looking forward to seeing the code in!
I have an input cube that I can't crop. I'll zip it and email it to you, Jay, since I'm not sure who will review this. LRONACCAL won't run on cropped cubes.
Requesting changes so that I can track the domain review on this. This request should not block the code re-review that needs to happen.
Just minor suggestions in line.
void ResetGlobals();
void CopyCubeIntoVector(QString &fileString, vector<double> &data);
void ReadTextDataFile(QString &fileString, vector<double> &data);
void ReadTextDataFile(QString &fileString, vector<vector<double> > &data);
void Calibrate(Buffer &in, Buffer &out);
void RemoveMaskedOffset(Buffer &line);
void CorrectDark(Buffer &in);
void CorrectNonlinearity(Buffer &in);
void CorrectFlatfield(Buffer &in);
void RadiometricCalibration(Buffer &in);
void GetNearestDarkFile(QString fileString, QString &file);
void GetNearestDarkFilePair(QString &fileString, QString &file0, QString &file1);
void GetCalibrationDirectory(QString calibrationType, QString &calibrationDirectory);
void GetWeightedDarkAverages();
bool AllowedSpecialPixelType(double pixelValue);
Might be worth making these static; since they are added to libisis, they risk overlapping with something else in the ISIS namespace.
double g_radianceLeft, g_radianceRight, g_iofLeft, g_iofRight, g_imgTime;
double g_exposure;      // Exposure duration
double g_solarDistance; // average distance in [AU]

bool g_summed, g_masked, g_maskedLeftOnly, g_dark, g_nonlinear, g_flatfield, g_radiometric, g_iof, g_isLeftNac;
bool g_nearestDark, g_nearestDarkPair, g_customDark;
vector<int> g_maskedPixelsLeft, g_maskedPixelsRight;
vector<double> g_avgDarkLineCube0, g_avgDarkLineCube1, g_linearOffsetLine, g_flatfieldLine, g_darkTimes, g_weightedDarkTimeAvgs;
vector<vector<double> > g_linearityCoefficients;
Buffer *g_darkCube0, *g_darkCube1;
Same with these; also move them with the rest of the function definitions.
vector<QString> darkFiles;

if (g_dark) {
  QString darkFileType = ui.GetString("DARKFILETYPE");
I realize some of this code hasn't changed from the original version, but FYI: when a parameter has a list of valid values (i.e., radio buttons), GetString gives back the full value as specified in the XML, so the toUpper isn't necessary.
darkFileType = darkFileType.toUpper();
if (darkFileType == "CUSTOM") {
  g_customDark = true;
  ui.GetAsString("DARKFILE", darkFiles);
Why use GetAsString for a parameter defined to be a file name? This doesn't allow the user to use environment or ISIS variables (e.g., $LRO) unless you convert to a FileName and expand it, which is what GetFileName does for you.
}

if (g_nonlinear) {
  offsetFile = ui.GetAsString("OFFSETFILE");
This may have been easier if an "internal default" was used along with an "if ui.WasEntered" check.
Thank you for the suggestions. I will keep those in mind in any future contributions. I hope these do not block the PR from merging. Jesse already reviewed the app code. The review was to be for the GTests and for the refactoring of the app to be callable.
@victoronline I am explicitly requesting that changes be made to address the review comments above. If you have any questions, please let me know.
The suggestion to make the globals static (therefore, local to the .cpp file) is part of the app conversion process. After the conversion, those symbols are added to libisis rather than being localized to their own application code, and therefore can cause problems down the line. Making them static easily solves the problem in the short term without having to re-engineer things to not use globals.
@jlaura, please keep in mind that we don't only focus on this software. Mission teams are not solely working on ISIS. We are currently far beyond the point where we are out of development bandwidth for this PR. I can try to submit the "making globals static" change that Kelvin suggested later today, as he is correct that it is part of the app conversion process, which in itself is a requirement retroactively applied to this PR. But after that, we don't have any further resources to throw at this currently.

The retroactive adding of requirements and the lack of clear, error-free documentation have made this process grossly inefficient. It is the reason I suggested a walk-through: to identify these "gotcha" late requirements and avoid further delays. I expect to have suggestions in a code review. I welcome them, as they are the purpose of a code review. However, adding changes to code that was already reviewed, and adding new requirements that didn't exist at the time the PR was submitted, causes unnecessary delays. This is not "best practice" nor the best use of development resources. This PR is now 1 year and 7 days old. Please advise. Thank you.
Those suggestions were just for future reference. They will not keep the PR from being merged.
The comments from @Kelvinrr need to be addressed. I realize you are unhappy about the length of time that this PR has been open, and I have not directly addressed this. On Nov. 18, 2021 I informed you, @victoronline, that you needed to write tests. 21 days ago, I pinged you, @victoronline, about the status of the tests. The vast majority of the time that this PR has been open has been spent waiting on you to submit tests. Neither my team nor I have control over the amount of time that it took to get the tests. We have been consistently clear that tests are needed to merge.

During the time that this PR has sat dormant, we have become aware of USGS requirements (a domain review) that need to be met on all code that we release publicly. As I indicated above, the requirement that the code released by USGS be of the highest quality is in line with existing best practices. The only difference between when this PR was originally submitted and today is that the submitter needs to provide some sort of documentation that the review has been met. Had tests been written promptly around Nov. 18, 2021 we would not have been aware of that requirement. We do not grandfather in code submissions.

To be blunt: we will not merge code that does not pass the reviews, as we are taking on long-term maintenance of this code, not you. Additionally, more time and energy is being spent pushing us to merge this without the needed changes than it would take for you to actually make the changes. What exactly is your goal here?
@jlaura Thank you for your reply. When I initially submitted the issue and accompanying PR, having gtests written was not a requirement. The understanding at the time, as with the half-pixel fix, was that Jesse would do the gtests since the documentation was not fully up. I mentioned this on Nov 17, 2021. Although notes in this PR do not capture all the comments regarding this, at that time, in our TC meetings, it was mentioned that support would be provided to help mission teams with gtests, considering documentation was lacking and the process was new and vastly different from what was in place beforehand. Since then, the delay has included not only us not providing gtests, but also delays that were not caused by us.

Again, it was mentioned that you all would work with mission teams during this changing process. But just informing users what the requirements now are and pointing to documentation does not, in my opinion, mean working with folks. To also be blunt, working with us to get this PR passed would not have taken you all much effort and would have been documented for future reference. Specifically addressing tests being promptly written: assistance was repeatedly requested by mission teams via TC meetings. It was mentioned in reply, multiple times, that support would be provided, but as far as I'm aware it only came in the form of GTest documentation. I requested assistance in the form of a hands-on walkthrough via additional funding during the LROC sprint, and one of your developers very rudely and unprofessionally refused to help, as he wasn't going to "hold our hands" and it's not what "he signed up for". The remaining developer did work with us, and in doing so we both identified inaccuracies and mistakes in the documentation. It took us both, working together, a while to complete one GTest for LRONACPHO. I don't understand the concept of expecting us to promptly write these tests when that wasn't possible even with the assistance of one of your developers.
I'm not trying to cause any issues, but this is becoming increasingly frustrating. I freely volunteer my time to help in any way I can with this project. You've mentioned that you are interested in working with mission teams in the spirit of collaboration, to help move this project to a place where not only core developers are responsible for contributing, but the community as a whole undertakes that responsibility. I'm honestly trying to help with this process. I also understand that your team is tasked with supporting this project long-term. In my opinion, when making such large changes to a project, it is to your benefit to help your users along, not to just demand they "jump on or jump off". This requires flexibility. I don't think the current approach really helps your cause of bringing folks on board to help with the development effort of this project, and making an example of this PR is not in anyone's best interest.

Addressing this comment: "Additionally, more time and energy is being spent pushing us to merge this without the needed changes than it requires for you to actually make the changes." I wish that were the case. I've requested the write-up you've asked for from Emerson. We will have to wait until his time frees up, as we all are currently working not only on LROC work but also KPLO, which launches shortly. I also have to take additional time away from other work to make these changes and apply them to the other fork that we use internally, since we are not yet on the bleeding-edge version. This of course has to be retested. I also expect that something will be required of us regarding the test cube that is too large for your tests. So, no, I don't agree with this comment. I'm not asking you to push without the changes; I was asking for assistance with those changes, for the reasons I've mentioned. I know that this discussion is not for this forum, and I apologize for my part in causing it to go on this tangent.
So I'll just speak about my goal, as you've requested. My goal here is to get this PR merged for the LROC users that depend on it for work related to LROC images. We, unfortunately, don't have any more time available for this PR. I am asking if you all can help push it through. I understand if this cannot be accommodated at this time. Please advise as to the next steps if that's the case. Would you prefer I keep commenting so the bot won't close this PR until time becomes available later this year or next, or should I close it and open a new one if we have more time available? Again, your help is appreciated, and I hope you all are enjoying the weather up there, as it is now 200 degrees down here. I plan on doing some mountain biking up there soon. Thanks!
Found other issues with the tests. Looks like it applies to all 3 tests.
QString iCubeFile = "data/lronaccal/input/M1333276014R.cub";
QString oCubeFile = outputDir.path() + "/out.default.cub";
QString oCubeCropFile = outputDir.path() + "/out.default.crop.cub";
QString tCubeFile = "data/lronaccal/truth/M1333276014R.default.crop.cub";
What is tCubeFile doing here? It's not used for output, but it is used to check histograms against anyway? Even if it were output, it should go into a temp directory.
It is comparing against truth cubes, so no, they aren't output cubes. But again, this uses truth cubes as opposed to hard-coded values; there wasn't guidance as to what would be preferred. I thought it would be better to swap out truth cubes as changes to the app occur, so the test code would not need to change.
   geometrically transformed (i.e. scaled, rotated, sheared, or
   reflected) or cropped
*/
QString iCubeFile = "data/lronaccal/input/M1333276014R.cub";
This file doesn't exist; only files from what looks like the old truth directory seem to be there.
It is the large cube I sent to Jay. It doesn't exist because it was too large.
EXPECT_NEAR(oCubeStats->Average(), tCubeStats->Average(), 0.001);
EXPECT_NEAR(oCubeStats->Sum(), tCubeStats->Sum(), 0.001);
EXPECT_NEAR(oCubeStats->ValidPixels(), tCubeStats->ValidPixels(), 0.001);
EXPECT_NEAR(oCubeStats->StandardDeviation(), tCubeStats->StandardDeviation(), 0.001);
I imagine the intention was to use the old truth cube as the baseline? Why not use expected values here and not have a truth cube at all?
This is because the values we actually tested and validated are based on a non-cropped cube. Honestly, I thought it was best to have it this way, where the truth cube can be swapped out if the app code changes, instead of leaving hard-coded values that would need to be updated.
@Kelvinrr Addressed comments. It's the same way we did it in PR#4512. FYI
@victoronline This is way too much off-topic discussion for this issue. You are expressing fundamental concerns about the process that can be discussed either between LROC and the ASC contributors or between you and the ISIS TC. We have very different perspectives on the course of this PR and what is expected of folks, so follow-on discussion is absolutely warranted. Thanks for continuing to work with the reviewers to ensure that this code is ready to merge once we have the needed domain review information.
@victoronline I specifically recall you reduced those images pretty significantly and the tests passed. I think the best course of action, considering the sunk cost in this PR, is to just merge it. I can do a follow-up PR fixing the tests after either @victoronline or @jlaura sends me those test images. I'll either reduce them for the test or shove them into ISISDATA for now so the test doesn't just fail. I have my and @scsides's suggested changes locally already.
@jlaura I got it on ISISTESTDATA at |
   geometrically transformed (i.e. scaled, rotated, sheared, or
   reflected) or cropped
*/
QString iCubeFile = "data/lronaccal/input/M1333276014R.cub";
The bare minimum to get this merged and passing code review is to change this to $ISISTESTDATA/isis/src/lro/apps/lronaccal/M1333276014R.cub
@victoronline for all three tests.
@victoronline what's the status on this? Would you still be able to make the data path change soon?
   geometrically transformed (i.e. scaled, rotated, sheared, or
   reflected) or cropped
*/
QString iCubeFile = "data/lronaccal/input/M1333276014R.cub";
@victoronline any updates on this? Again, all we need is a copy-paste of $ISISTESTDATA/isis/src/lro/apps/lronaccal/M1333276014R.cub
over these iCubeFile paths for the tests to pass review.
Force-pushed: compare 59cea20 to 4af3b9d.
DOMAIN REVIEW: The documentation I received on this is below. It's a more detailed explanation than exists in the description in the PR above. I don't know if this is what you are looking for, because it doesn't go into detail as to how our scientists verified, analyzed, and validated their work. However, it does add more detail to the process used to arrive at the change made to the application. Please let me know if you need further information so I can request it from our science team. The actual equation is in the PR's description.

"Under the old version of LRONACCAL, we corrected all NAC left and right observations with a single pair of dark files derived early on during the LRO mission. We noticed a vertical striping in LROC images acquired later in the mission. We identified that these same stripes were also visible in NAC dark observations collected around the same time. We, therefore, developed a time-dependent dark correction to remove artifacts that have changed throughout the mission. To complete this step, we gathered all the dark observations acquired each month and averaged them into a single file representing the dark current response for that period.
@victoronline This looks good. Are you able to say whether this was manually checked on a few images, tens of images, or more, to confirm that the change is the current best-known implementation for producing the data your science team wants to see?
@jlaura I am not able to speak to this, but I can ask our science team for more information regarding the number of images used to arrive at this result. This is the way our science team is handling the calibration of the NAC to address vertical striping. What I can say is that it is how we are processing images. If the response confirms that this is the best-known method, will there be additional proof required or just a confirmation?
@jlaura I may not be understanding your question. Are you asking if the change is the best-known implementation, or are you asking how many images we used to arrive at this conclusion?
Just confirmation.
I'm going to +1 this and remove my 'Changes requested' based on the above statement. I am curious if this has been broadly applied to the dataset or just to a handful of images. Thanks for the PR.
Also, this should be included in the 7.1 RC coming out in a few weeks and then officially released ~1 month after that.
DOI-USGS#4520)

* updated to use time-dependent darks, including darks for exp zero, and also allowed passthru of certain special pixel types instead of nulling them
* Updated lronaccal.xml documentation: updated documentation to provide more clarity regarding dark file correction/calibration options available to user
* Updated formatting of text
* Added better documentation
* fixed formatting
* removed extra tabs
* Add radiance units label in lrowaccal output cube
* refactored code to allow callable for compatibility with gtest
* added gtests for 3 file types and made app callable
* Added changes for lronaccal to CHANGELOG.md: added changes made to lronaccal and gtests to CHANGELOG.md

Co-authored-by: Cordell Michaud <10409047+michaudcordell@users.noreply.github.com>
Updates lronaccal to use time-dependent dark files for dark correction, and to use specific dark files for images with an exposure code of zero.
Description
The dark average produced is dependent on which options are selected.
If the custom dark file option is selected, the provided dark file will be used.
If the image has an exposure code of zero, the nearest (time) dark files with exposure code of zero will be used.
If UseNearestTimeDarkFile option is selected, the dark file with the nearest time to the image will be used for dark correction.
If UseNearestTimeDarkFilePair option is selected, the pair of dark files that the image time lies between will be used, as long as the difference between dark files doesn't exceed 45 days. If a suitable pair is not found, the latest dark file taken before the image will be used.
The equation used:
pixel_dark_average = ( avgDarkLine1_pixel * |darkFile1_time - time|
                     + avgDarkLine2_pixel * |darkFile2_time - time| )
                     / ( |darkFile1_time - time| + |darkFile2_time - time| )
How Has This Been Tested?
This has been tested by the science team, who compared photometric products using all dark file options and validated the results.
Licensing
This project is mostly composed of free and unencumbered software released into the public domain, and we are unlikely to accept contributions that are not also released into the public domain. Somewhere near the top of each file should be these words: