Using JSON to import comsol results #4064
Conversation
If this works then I am happy with it. It looks like the lychee failure is unrelated.

Lychee is fixed here: #4065
Codecov Report: all modified and coverable lines are covered by tests ✅

Additional details and impacted files:

```
@@ Coverage Diff @@
##           develop    #4064   +/-   ##
========================================
  Coverage    99.58%   99.58%
========================================
  Files          260      260
  Lines        21358    21358
========================================
  Hits         21270    21270
  Misses          88       88
```

☔ View full report in Codecov by Sentry.
For reference, I am attaching the code I used to convert the pickle data with multi-dimensional arrays to JSON. JSON doesn't allow NumPy arrays directly, so each field is converted with `.tolist()` first (and `solution_time` is nulled out):

```python
import json
import pickle

files = [
    "comsol_01C.json",
    "comsol_05C.json",
    "comsol_1C.json",
    "comsol_1plus1D_3C.json",
    "comsol_2C.json",
    "comsol_3C.json",
]

for file in files:
    # Load the existing pickle with the same base name as the target JSON file
    with open(file.split(".")[0] + ".pickle", "rb") as infile:
        pickleobj = pickle.load(infile)
    for key in pickleobj:
        if key == "solution_time":
            pickleobj[key] = None
        else:
            # Convert each NumPy array to (nested) lists for JSON serialization
            pickleobj[key] = pickleobj[key].tolist()
    with open(file, "w", encoding="utf-8") as outfile:
        json.dump(pickleobj, outfile, ensure_ascii=False, indent=4)
```
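The PR description notes that data loaded back from JSON comes in as nested lists and needs `np.array()` before it can be plotted. A minimal round-trip sketch of that idea (the `c_e` key mirrors one of the field names mentioned in this PR; the dict contents and shapes here are made up for illustration):

```python
import json

import numpy as np

# Hypothetical stand-in for one COMSOL results dict: a 2D field plus the
# non-serializable entry that the conversion script nulls out
result = {"c_e": np.arange(12.0).reshape(3, 4), "solution_time": None}

# Serialize: NumPy arrays are not JSON-serializable, so convert to lists first
serializable = {
    k: v.tolist() if isinstance(v, np.ndarray) else v for k, v in result.items()
}
text = json.dumps(serializable, indent=4)

# Deserialize: json.loads() yields nested lists; np.array() restores the
# multi-dimensional array with its original shape
loaded = json.loads(text)
c_e = np.array(loaded["c_e"])
print(c_e.shape)  # (3, 4)
```

The same `np.array(...)` step is what the example notebooks apply to the imported data before plotting.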
Thanks, @santacodes! The Windows failure is unrelated; I just triggered a re-run for it. I haven't verified all of the JSON files, but I did check one or two of them on my machine (one can't see the diffs on GitHub – they are quite big!¹), and after the conversion to arrays, the elements seem to match one-for-one. This is better than the previous iteration, where I felt there were missing values due to the presence of `...` in the strings, which is great.

For the example notebook tests, it looks like there is one more file that you need to fix :) Otherwise, the changes look quite good to me.
Footnotes

1. GitHub displays the number of lines changed as `+878,463 −36`. I am assuming that this is because 2D data was converted to 1D and values from keys such as `c_n_surf`, `c_e`, `c_p_surf`, `phi_n`, `phi_p` and so on are being displayed on a single line with multiple `[]`-style JSON arrays. ↩
Looks good to me other than Agriya's changes. It would be good to keep the COMSOL results in a separate location, e.g. using pooch as discussed on Slack. That can be a separate PR.

The only trouble that I feel this will cause is that it would require an active internet connection for those who wish to use such data for their experiments; but then, downloading and installing PyBaMM requires an internet connection anyway... so I agree that this can be done overall. @santacodes, would you be willing to take this further? I just created a new repository for this at https://github.com/pybamm-team/pybamm-data, and named it as such rather than giving it a COMSOL-specific name because we can move other CSV files for drive cycles, Enertech cells, etc. there as well. I'll copy the files over once this PR gets merged.
In the absence of a network connection, we will have to skip the relevant tests that use these data files.
Sure, I am willing to work on it; I'll get started once this PR is merged.

Just an aside, but I am not able to use nox for testing without a connection anyway. I think most issues would be solved if the files were stored elsewhere and downloaded only when they don't already exist.
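For what it's worth, `pooch.retrieve()` implements exactly this download-only-if-missing pattern (fetch to a local cache when the file is absent or its hash doesn't match). The caching idea itself fits in a few lines; the sketch below is purely illustrative — `fetch_if_missing` and the fake downloader are hypothetical names, not part of PyBaMM or pooch:

```python
import tempfile
from pathlib import Path


def fetch_if_missing(path, download):
    """Call download(path) only when the file is not already cached locally."""
    path = Path(path)
    if not path.exists():
        path.parent.mkdir(parents=True, exist_ok=True)
        download(path)  # in a real implementation, an HTTP GET goes here
    return path


# Demo with a counting fake downloader instead of a real network call
calls = []


def fake_download(path):
    calls.append(path)
    path.write_text("comsol data placeholder")


cache = Path(tempfile.mkdtemp())
fetch_if_missing(cache / "comsol_1C.json", fake_download)
fetch_if_missing(cache / "comsol_1C.json", fake_download)
print(len(calls))  # 1: the second call hit the cache
```

With pooch, hash verification comes for free, which also guards against partially downloaded or stale files.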
Thanks, @santacodes! All of the notebook outputs look identical. Yes, leaving out the changes to `MANIFEST.in` is alright, because these files will be moved to a separate repository soon anyway.
Should we add this commit hash to `.git-blame-ignore-revs`?
I don't think we should, because these are relevant code changes and they will not spoil the git blame (the revisions are not scattered across the codebase but are rather in a specific directory).

I edited the PR description so that it does not close the linked issue; we can track the second half there. I'll edit the issue description.
* using json to import comsol results
* migrated comsol results to json
* code review suggestions
* fixed failing pouch_cell
* refined gitignore

Co-authored-by: Eric G. Kratz <kratman@users.noreply.github.com>
Co-authored-by: Arjun Verma <arjunverma.oc@gmail.com>
Co-authored-by: Agriya Khetarpal <74401230+agriyakhetarpal@users.noreply.github.com>
Description
Data imported from JSON defaults to the `list` datatype, so a bit of pre-processing is done before it can be used for plotting: `np.array()` converts each multi-dimensional list back into a multi-dimensional NumPy array.

Addresses a part of #4026
Type of change
Please add a line in the relevant section of CHANGELOG.md to document the change (include PR #) - note reverse order of PR #s. If necessary, also add to the list of breaking changes.
Key checklist:
- `$ pre-commit run` (or `$ nox -s pre-commit`) – see CONTRIBUTING.md for how to set this up to run automatically when committing locally, in just two lines of code
- `$ python run-tests.py --all` (or `$ nox -s tests`)
- `$ python run-tests.py --doctest` (or `$ nox -s doctests`)

You can run integration tests, unit tests, and doctests together at once, using `$ python run-tests.py --quick` (or `$ nox -s quick`).

Further checks: