This repository contains two packages: DifferentiationInterface and DifferentiationInterfaceTest.
An interface to various automatic differentiation (AD) backends in Julia.
This package provides a unified syntax to differentiate functions, including:
- First- and second-order operators (gradients, Jacobians, Hessians and more; see the sketch after this list)
- In-place and out-of-place differentiation
- Preparation mechanism (e.g. to pre-allocate a cache or record a tape)
- Built-in sparsity handling
- Thorough validation on standard inputs and outputs (numbers, vectors, matrices)
- Testing and benchmarking utilities, available to users via DifferentiationInterfaceTest
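For instance, second-order operators follow the same calling convention as first-order ones. The sketch below assumes the `hessian` operator and the `SecondOrder` backend combinator exported by the package; exact signatures may differ between releases:

```julia
using DifferentiationInterface
import ForwardDiff, Zygote

f(x) = sum(abs2, x)
x = [1.0, 2.0]

# Second-order operator with a single backend
hessian(f, AutoForwardDiff(), x)  # expected: [2.0 0.0; 0.0 2.0]

# Combine two backends for forward-over-reverse second order
hessian(f, SecondOrder(AutoForwardDiff(), AutoZygote()), x)
```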
We support the following backends defined by ADTypes.jl:
- ChainRulesCore.jl
- Diffractor.jl (currently broken)
- Enzyme.jl
- FastDifferentiation.jl
- FiniteDiff.jl
- FiniteDifferences.jl
- ForwardDiff.jl
- Mooncake.jl
- PolyesterForwardDiff.jl
- ReverseDiff.jl
- Symbolics.jl
- Tracker.jl
- Zygote.jl
Note that in some cases, going through DifferentiationInterface.jl might be slower than a direct call to the backend's API. This is especially true for Enzyme.jl, whose handling of activities and multiple arguments unlocks additional performance. We are working on this challenge and welcome any suggestions or contributions. Meanwhile, if differentiation fails or takes too long, consider using Enzyme.jl directly.
To install the stable version of the package, run the following code in a Julia REPL:
```julia
using Pkg
Pkg.add("DifferentiationInterface")
```
To install the development version, run this instead:
```julia
using Pkg
Pkg.add(
    url="https://github.com/JuliaDiff/DifferentiationInterface.jl",
    subdir="DifferentiationInterface",
)
```
Here is a simple example:

```julia
using DifferentiationInterface
import ForwardDiff, Enzyme, Zygote  # AD backends you want to use

f(x) = sum(abs2, x)
x = [1.0, 2.0]

value_and_gradient(f, AutoForwardDiff(), x)  # returns (5.0, [2.0, 4.0]) with ForwardDiff.jl
value_and_gradient(f, AutoEnzyme(), x)       # returns (5.0, [2.0, 4.0]) with Enzyme.jl
value_and_gradient(f, AutoZygote(), x)       # returns (5.0, [2.0, 4.0]) with Zygote.jl
```
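The built-in sparsity handling mentioned above works by wrapping any dense backend in ADTypes' `AutoSparse`. Here is a hedged sketch following the documented sparsity workflow; it assumes the companion packages SparseConnectivityTracer.jl and SparseMatrixColorings.jl, and keyword names may vary between versions:

```julia
using DifferentiationInterface
using SparseConnectivityTracer: TracerSparsityDetector
using SparseMatrixColorings: GreedyColoringAlgorithm
import ForwardDiff

# Wrap a dense backend so sparsity is detected and exploited automatically
sparse_backend = AutoSparse(
    AutoForwardDiff();
    sparsity_detector=TracerSparsityDetector(),
    coloring_algorithm=GreedyColoringAlgorithm(),
)

g(x) = diff(x) .^ 2  # example function with a banded (sparse) Jacobian
jacobian(g, sparse_backend, rand(5))  # expected: a sparse 4×5 Jacobian with 8 nonzeros
```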
To improve your performance by up to several orders of magnitude compared to this example, take a look at the tutorial and its section on operator preparation.
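As a hedged sketch of what preparation looks like (the argument order of prepared operators has changed between releases, so check the tutorial for your installed version):

```julia
using DifferentiationInterface
import ForwardDiff

f(x) = sum(abs2, x)
backend = AutoForwardDiff()
x = [1.0, 2.0]

prep = prepare_gradient(f, backend, x)  # pre-allocate caches / record a tape once
grad = similar(x)
for _ in 1:1000
    gradient!(f, grad, prep, backend, x)  # reuse the preparation on every call
end
```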
Please cite both DifferentiationInterface.jl and its inspiration AbstractDifferentiation.jl, using the provided CITATION.bib file.