ImplicitDifferentiation.jl is a package for automatic differentiation of functions defined implicitly, i.e., forward mappings $x \mapsto y(x)$ whose output is defined by conditions $c(x, y(x)) = 0$.
Implicit differentiation is useful for differentiating through two types of functions:
- Those for which automatic differentiation fails. Reasons vary depending on your backend, but the most common include calls to external solvers, mutating operations, and type restrictions.
- Those for which automatic differentiation is very slow. Common examples are iterative procedures like fixed-point equations or optimization algorithms (see the sketch after this list).
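As a concrete illustration, here is a minimal sketch in the spirit of the package's getting-started example. The exact `ImplicitFunction` signatures have evolved across versions, so treat this as illustrative and check the documentation: we wrap an elementwise square root, characterized by the conditions $c(x, y) = y^2 - x = 0$, and differentiate through it with ForwardDiff.jl.

```julia
using ForwardDiff
using ImplicitDifferentiation

# Forward mapping x ↦ y(x): any black-box computation of the output.
# Here sqrt.(x) stands in for a solver that plain AD could not trace.
forward(x) = sqrt.(x)

# Conditions c(x, y) that the output satisfies: y .^ 2 .- x == 0.
conditions(x, y) = y .^ 2 .- x

# Wrapping both yields a callable with custom differentiation rules
# derived from the implicit function theorem.
implicit = ImplicitFunction(forward, conditions)

x = [4.0, 9.0]
implicit(x)                        # ≈ [2.0, 3.0]
ForwardDiff.jacobian(implicit, x)  # Jacobian without differentiating through forward
```

The square root is only a toy stand-in: the same wiring applies when `forward` calls an external or iterative solver, since differentiation relies solely on the conditions.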
If you just need a quick overview, check out our JuliaCon 2022 talk. If you want a deeper dive into the theory, you can refer to the paper *Efficient and modular implicit differentiation* by Blondel et al. (2022).
To install the stable version, open a Julia REPL and run:
using Pkg; Pkg.add("ImplicitDifferentiation")
For the latest version, run this instead:
```julia
using Pkg; Pkg.add(url="https://github.com/JuliaDecisionFocusedLearning/ImplicitDifferentiation.jl")
```
Please read the documentation, especially the examples and FAQ.
Related projects in Julia:
- SciML ecosystem, especially LinearSolve.jl, NonlinearSolve.jl and Optimization.jl
- jump-dev/DiffOpt.jl: differentiation of convex optimization problems
- axelparmentier/InferOpt.jl: approximate differentiation of combinatorial optimization problems
- JuliaNonconvex/NonconvexUtils.jl: contains the original implementation from which this package drew inspiration
In Python:
- google/jaxopt: hardware accelerated, batchable and differentiable optimizers in JAX