
Update PL_unsym tensor in trapping_sr3.py #259

Merged 1 commit into dynamicslab:master on Oct 24, 2022

Conversation

SCLiao47 (Contributor)
Correct the indexing of the projection tensor PL_unsym. Details can be found in issue #258.

Commit: Correct the indexing of projection tensor PL_unsym.
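For context, the sketch below shows the kind of Kronecker-delta construction a linear-part projection tensor like PL_unsym typically uses in the trapping SINDy formulation, and how a wrong index pairing silently selects the wrong library columns. The tensor shape, the assumption that the r linear terms occupy the first r library columns, and the helper name build_PL_unsym are illustrative assumptions; this is not the actual diff applied in this PR (see issue #258 for the precise indexing error that was fixed).

```python
import numpy as np

def build_PL_unsym(r, N):
    """Illustrative sketch (not the pysindy diff): build an unsymmetrized
    projection tensor that picks the linear coefficients out of an (N, r)
    SINDy coefficient matrix, i.e. PL_unsym[i, j, k, kk] = delta_{ik} * delta_{j,kk}.
    Pairing (i, j) with the wrong (k, kk) indices is the class of bug this
    PR corrects."""
    PL_unsym = np.zeros((r, r, r, N))
    for i in range(r):
        for j in range(r):
            # Assumes the r linear library terms are columns 0..r-1 (kk = j).
            PL_unsym[i, j, i, j] = 1.0
    return PL_unsym

# Usage: contracting the tensor with the coefficients recovers the linear part.
r = 3
N = r + r * (r + 1) // 2          # assumed library size: r linear + r(r+1)/2 quadratic terms
Xi = np.random.rand(N, r)          # stand-in SINDy coefficient matrix
PL_unsym = build_PL_unsym(r, N)
L = np.tensordot(PL_unsym, Xi, axes=([3, 2], [0, 1]))  # shape (r, r), L[i, j] = Xi[j, i]
```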
codecov-commenter commented Oct 24, 2022

Codecov Report

Base: 92.69% // Head: 92.69% // No change to project coverage 👍

Coverage data is based on head (1d183b9) compared to base (bc131d9).
Patch coverage: 100.00% of modified lines in pull request are covered.

Additional details and impacted files
@@           Coverage Diff           @@
##           master     #259   +/-   ##
=======================================
  Coverage   92.69%   92.69%           
=======================================
  Files          34       34           
  Lines        3475     3475           
=======================================
  Hits         3221     3221           
  Misses        254      254           
Impacted Files | Coverage Δ
pysindy/optimizers/trapping_sr3.py | 90.25% <100.00%> (ø)



@akaptano merged commit 1e1b9bd into dynamicslab:master on Oct 24, 2022
jpcurbelo pushed a commit ("Update PL_unsym tensor in trapping_sr3.py") referencing this pull request to jpcurbelo/pysindy_fork on Apr 30, 2024