cmor.write segfault related to timesteps #440
Comments
@doutriaux1 I am currently trying to add a check for whether the number of times passed exceeds the length of the time axis. Lines 4392 to 4412 in 7c7e48a
I am not sure how well it handles cases where data is being written to a zfactor. I tried putting this check below the call to Lines 2441 to 2442 in 7c7e48a
It currently passes all of the tests in the test suite. I also added a test that checks for the error when more times are passed to cmor_write than the length of the time axis. Are there any other tests I could throw at it?
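A minimal sketch of the kind of bounds check described above, written in plain Python rather than CMOR's actual C internals (the function name `check_ntimes` and its signature are hypothetical): reject a write whose cumulative number of time steps would overrun the declared length of the time axis, with a clear error instead of a segfault.

```python
def check_ntimes(times_written, ntimes_passed, axis_length):
    """Refuse a write that would overrun the time axis, with a clear error."""
    if times_written + ntimes_passed > axis_length:
        raise ValueError(
            f"cmor_write: {ntimes_passed} time step(s) passed, but only "
            f"{axis_length - times_written} remain on a time axis of "
            f"length {axis_length}"
        )

# A time axis of length 12 with 10 steps already written accepts 2 more...
check_ntimes(10, 2, 12)

# ...but a third step would be rejected with a descriptive message.
try:
    check_ntimes(10, 3, 12)
except ValueError as err:
    print("rejected:", err)
```

The point is only that the failure mode becomes a catchable error carrying the relevant counts, rather than undefined behaviour in the C layer.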
@chloemackallah I would like to revisit this issue by figuring out how to replicate the error you were experiencing. Could you submit some sample code that causes this error?
I wonder if this was already solved in #485, at least for Python code.
@chloemackallah Do you think this issue can be closed now?
I will close this for now.
In recent testing of CMOR with the ACCESS model, we found an issue when attempting to use cmor.write with an incorrect value assigned to the argument 'ntimes_passed'.
In this scenario, when the value of 'ntimes_passed' was inconsistent with the actual number of timesteps contained in the data passed to cmor.write, the result was a segmentation fault. This made it difficult to track down the source of the error.
We would suggest that an instructive error message be output when this occurs, rather than a segmentation fault. Additionally, it may be possible to check that the number of time values is consistent and initialise the dimension length accordingly.
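One way the suggested consistency check could look on the caller's side, as a hedged sketch (this is not CMOR's API; `validate_ntimes_passed` is a hypothetical helper): compare 'ntimes_passed' against the leading (time) dimension of the data array before the data ever reaches the C layer.

```python
import numpy as np

def validate_ntimes_passed(data, ntimes_passed):
    """Check ntimes_passed against the data's leading (time) dimension,
    raising a descriptive error instead of letting mismatched counts
    reach the C layer, where they can cause a segmentation fault."""
    actual = data.shape[0]
    if ntimes_passed != actual:
        raise ValueError(
            f"ntimes_passed={ntimes_passed} does not match the data's "
            f"leading time dimension ({actual})"
        )

# 12 monthly time steps on a hypothetical lat/lon grid.
data = np.zeros((12, 90, 144))
validate_ntimes_passed(data, 12)      # consistent: no error

try:
    validate_ntimes_passed(data, 24)  # inconsistent: clear error, no segfault
except ValueError as err:
    print(err)
```

A check like this could equally live inside the library, but even as a wrapper it turns a hard crash into an actionable message naming both counts.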