Feature description
#481 adds the Jaxified IDAKLU solver as an experimental implementation, with auto-differentiation applied to the cost/likelihood functions. This issue aims to expand that functionality with Jax-based inference methods such as:
Motivation
Jax offers a compiled interface for parameter optimisation, with lowering to both GPU and TPU. This can enable performance improvements for PyBOP's methods, and it removes the need to define gradients of the cost/likelihood functions by hand.
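As a rough illustration of the second point, a scalar cost written against Jax can have its gradient derived and jit-compiled automatically. This is a minimal sketch only; `model_output` and the synthetic data are hypothetical stand-ins, not PyBOP API:

```python
import jax
import jax.numpy as jnp

# Hypothetical stand-in for a forward-model evaluation (not PyBOP API).
def model_output(params, t):
    return params[0] * jnp.exp(-params[1] * t)

def sum_squared_error(params, t, data):
    # Scalar cost; Jax derives its gradient automatically.
    residuals = model_output(params, t) - data
    return jnp.sum(residuals**2)

# Synthetic data, for illustration only.
t = jnp.linspace(0.0, 1.0, 50)
data = 2.0 * jnp.exp(-0.5 * t)

# Compiled value-and-gradient of the cost: no hand-written derivative.
cost_and_grad = jax.jit(jax.value_and_grad(sum_squared_error))
value, grad = cost_and_grad(jnp.array([1.0, 1.0]), t, data)
```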
Possible implementation
Design outlines and discussion need to take place to ensure this functionality integrates cleanly with PyBOP's existing design.
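Purely as a starting point for that discussion, one possible shape is a parameter fit driven end-to-end by Jax, here a hand-rolled gradient-descent loop on a Gaussian negative log-likelihood (unit noise, up to a constant). Every name below is a hypothetical placeholder, not a proposed PyBOP interface:

```python
import jax
import jax.numpy as jnp

def negative_log_likelihood(params, t, data):
    # Gaussian NLL around a hypothetical forward model (unit noise).
    prediction = params[0] * jnp.exp(-params[1] * t)
    return 0.5 * jnp.sum((prediction - data) ** 2)

@jax.jit
def step(params, t, data, lr=1e-2):
    # One gradient-descent step; the gradient comes from Jax autodiff.
    grads = jax.grad(negative_log_likelihood)(params, t, data)
    return params - lr * grads

# Synthetic data; the fit should move params towards (2.0, 0.5).
t = jnp.linspace(0.0, 1.0, 50)
data = 2.0 * jnp.exp(-0.5 * t)
params = jnp.array([1.0, 1.0])
for _ in range(500):
    params = step(params, t, data)
```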
Additional context
No response