
Define prepare_alg dispatch for OrdinaryDiffEq AD algorithms #1470

Merged: ChrisRackauckas merged 17 commits into master from prepare_alg on Aug 14, 2021

Conversation

ChrisRackauckas (Member)

ChrisRackauckas force-pushed the prepare_alg branch 2 times, most recently from 94ff654 to 8a0a2b5, on August 12, 2021 at 00:48
ChrisRackauckas merged commit d6cc30c into master on Aug 14, 2021
ChrisRackauckas deleted the prepare_alg branch on August 14, 2021 at 12:13
@chriselrod (Contributor)

This may be the primary cause of the inference regression in Pumas.

@ChrisRackauckas (Member, Author)

This PR causes the biggest decrease in compile times 😅

@chriselrod (Contributor) commented Nov 4, 2021

Well, commenting out the function-barrier prepare_alg method reduced inference time for the first solve on a single subject in Pumas from 16.8 seconds down to 9.7 seconds.
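
As background for the "function-barrier prepare_alg method" being discussed, below is a minimal, hypothetical sketch of that pattern; `MyRosenbrock`, `my_prepare_alg`, and `my_solve` are invented names for illustration, not the actual OrdinaryDiffEq internals. The algorithm starts with a default chunk size, the prepare step replaces it with a concrete one chosen from the state vector, and the inner solve is then compiled against that new concrete type, which is the inference work being measured here.

```julia
# Hypothetical sketch of the function-barrier pattern under discussion; the
# names MyRosenbrock, my_prepare_alg, and my_solve are made up and only
# illustrate the idea, not the actual OrdinaryDiffEq source.
using ForwardDiff

struct MyRosenbrock{CS} end                  # CS encodes the AD chunk size in the type
MyRosenbrock(; chunk_size = 0) = MyRosenbrock{chunk_size}()

# "Prepare" step: swap the default chunk_size = 0 for one sized to the state vector.
my_prepare_alg(::MyRosenbrock{0}, u0) = MyRosenbrock{ForwardDiff.pickchunksize(length(u0))}()
my_prepare_alg(alg::MyRosenbrock, u0) = alg  # user already fixed a chunk size

# Function barrier: the inner solve is compiled for the prepared algorithm type,
# which the caller cannot infer statically when the chunk size is picked at runtime.
my_solve(alg::MyRosenbrock, u0) = _my_solve(my_prepare_alg(alg, u0), u0)
_my_solve(::MyRosenbrock{CS}, u0) where {CS} = CS   # stand-in for the real solver

my_solve(MyRosenbrock(), ones(4))                # picks a concrete chunk size, e.g. 4
my_solve(MyRosenbrock(chunk_size = 4), ones(4))  # chunk size already concrete
```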

@ChrisRackauckas (Member, Author)

But having this is what allows everything to fully precompile, which is what takes precompilation from 22 seconds down to 3 🌊
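
The numbers quoted in this thread are first-call costs. As a rough way to observe the same effect, using only the public `solve` API on a toy problem, time the first solve in a fresh Julia session and then repeat it:

```julia
using OrdinaryDiffEq

f(u, p, t) = -u
prob = ODEProblem(f, ones(3), (0.0, 1.0))

@time solve(prob, Rosenbrock23());   # first call: dominated by compilation and inference
@time solve(prob, Rosenbrock23());   # second call: essentially pure runtime
```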

@chriselrod (Contributor) commented Nov 4, 2021

Yeah, I think a better fix is to just avoid the type instability in Pumas (by passing the chunk size in there), which requires #1521.
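
A sketch of that workaround, assuming the `chunk_size` keyword accepted by the Rosenbrock/implicit solver constructors around the time of this PR (newer releases may instead route this through the `autodiff` option): fixing the chunk size up front avoids runtime chunk-size selection in the caller, so the concrete algorithm type stays inferable.

```julia
using OrdinaryDiffEq

f(u, p, t) = -p .* u
prob = ODEProblem(f, ones(3), (0.0, 1.0), 2.0)

# Default: the chunk size is chosen internally from length(u0) at solve time.
sol_auto = solve(prob, Rosenbrock23())

# Workaround: pass a concrete chunk size that matches the problem size, so the
# fully specified algorithm type is known to the caller.
sol_fixed = solve(prob, Rosenbrock23(chunk_size = 3))
```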
