Discourse link: [ANN] DifferentiationInterface - gradients for everyone.
[ANN] DifferentiationInterface - gradients for everyone
Julia's composability has rather interesting consequences for its automatic differentiation ecosystem. Whereas Python programmers first choose a backend (like PyTorch or JAX) and then write code that is specifically tailored to it, Julians first write their code and then make it differentiable for one or more of the many available backends (like ForwardDiff.jl or Zygote.jl). Forward and reverse mode, numeric and symbolic, Julia has it all. But it's not always obvious which option is best suited for a given application.
We hold these truths to be self-evident that all autodiff backends are created equal, and that Julia users should be able to quickly try them out without diving into a dozen different APIs to figure out correct syntax and performance tricks. @adrhill and I are thus proud to present DifferentiationInterface.jl, an efficient and flexible interface to every differentiation method you could possibly dream of.
Basics
With DifferentiationInterface, an elementary gradient computation looks like this:
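A minimal sketch of what this looks like, assuming ForwardDiff.jl as the backend (the `AutoForwardDiff` backend object comes from ADTypes.jl and is re-exported by DifferentiationInterface):

```julia
using DifferentiationInterface
import ForwardDiff  # the chosen backend package must be loaded

f(x) = sum(abs2, x)  # maps a vector to a number

backend = AutoForwardDiff()
gradient(f, backend, [1.0, 2.0, 3.0])  # → [2.0, 4.0, 6.0]
```

Swapping backends is just a matter of replacing `AutoForwardDiff()` with, say, `AutoZygote()`.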
Backend objects are defined by ADTypes.jl and we support a dozen of them, from golden oldies like ForwardDiff.jl and Zygote.jl to new fan favorites like Enzyme.jl... and even experimental ones like FastDifferentiation.jl and Tapir.jl!
We provide operators for first-order and second-order differentiation: `derivative`, `gradient`, `jacobian`, `second_derivative` and `hessian`, as well as the lower-level `pushforward`, `pullback` and `hvp`. These operators support only numbers or arrays as inputs and outputs. They use nearly optimal backend-specific implementations whenever possible, and reasonable fallbacks the rest of the time.

Check out the tutorial for a simple example, and the overview for a detailed walkthrough.
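To illustrate a few of these operators side by side (a sketch, again assuming ForwardDiff.jl as the backend):

```julia
using DifferentiationInterface
import ForwardDiff

backend = AutoForwardDiff()

# scalar input, scalar output
derivative(sin, backend, 0.0)         # → 1.0 (i.e. cos(0))
second_derivative(sin, backend, 0.0)  # → 0.0 (i.e. -sin(0))

# array input, scalar output
f(x) = sum(abs2, x)
hessian(f, backend, [1.0, 2.0])       # → [2.0 0.0; 0.0 2.0]
```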
Performance
The Julia community loves optimizing for maximum performance, and would never give up speed for convenience. That is why DifferentiationInterface was designed from the ground up with type stability and memory efficiency in mind. Users can very easily differentiate both out-of-place functions `f(x) = y` and in-place (mutating) functions `f!(y, x)`.
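For example, a gradient can be written into a pre-allocated array, and an operator can be "prepared" once so that backend-specific caches are reused across calls. This is a sketch following the current documentation; the exact preparation API has evolved across releases, so check the docs for your version:

```julia
using DifferentiationInterface
import ForwardDiff

f(x) = sum(abs2, x)
backend = AutoForwardDiff()
x = [1.0, 2.0, 3.0]
grad = similar(x)  # pre-allocated storage for the result

# in-place variant: fills `grad` instead of allocating a new array
gradient!(f, grad, backend, x)

# preparation: set up backend-specific caches once, reuse many times
prep = prepare_gradient(f, backend, zero(x))
gradient!(f, grad, prep, backend, x)  # grad now holds [2.0, 4.0, 6.0]
```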
Advanced
Among the fancier features offered by DifferentiationInterface, let us mention:
`DifferentiateWith(f, backend)`, which defines a chain rule to make Zygote.jl use another backend (like Enzyme.jl) for part of the reverse pass.

Testing and benchmarking
The companion package DifferentiationInterfaceTest.jl allows you to quickly compare autodiff backends on pre-defined or custom scenarios. It includes utilities for correctness checking, type stability testing and benchmarking.
We think this can be of great use for package developers to test which backends support their functions, and how fast differentiation can be. It has also enabled us to help diagnose and fix bugs in several autodiff packages.
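A hypothetical usage sketch (keyword names and scenario handling may differ between DifferentiationInterfaceTest versions, so treat this as an outline rather than exact syntax):

```julia
using DifferentiationInterface, DifferentiationInterfaceTest
import ForwardDiff

# run a backend through the package's built-in test scenarios,
# checking the results against a reference implementation
test_differentiation(AutoForwardDiff(); correctness=true)
```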
Once again, the tutorial is a great place to start.
What about AbstractDifferentiation?
AbstractDifferentiation.jl was our main inspiration for this work, and we learned a lot of lessons from its design. To alleviate some of its current limitations (notably around mutation and caching), we imposed certain restrictions on our code. DifferentiationInterface accepts only one input `x` and one output `y`, and guarantees support for number or array types but nothing beyond that. Given the existence of ComponentArrays.jl, these rules seem fairly mild, and they made our life considerably easier for robust implementation and testing.

Still, some use cases may require differentiation with respect to multiple inputs or non-array types. This level of generality has always been the goal of AbstractDifferentiation, and so it will remain.
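As a sketch of why the single-input rule is mild in practice, ComponentArrays.jl can bundle what would otherwise be several inputs into one differentiable vector (assuming ForwardDiff.jl as the backend):

```julia
using DifferentiationInterface
using ComponentArrays
import ForwardDiff

# one ComponentVector stands in for two named "inputs"
x = ComponentVector(a=1.0, b=[2.0, 3.0])
f(x) = x.a^2 + sum(abs2, x.b)

g = gradient(f, AutoForwardDiff(), x)
g.a  # → 2.0
g.b  # → [4.0, 6.0]
```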
After discussing the matter with @mohamed82008 to avoid XKCD #927, the way forward looks like this:
Roadmap
Now that the foundations are solid, here are some things we would like to work on next:
Interested?
DifferentiationInterface is not yet completely stable, but it follows semantic versioning, and the latest v0.3 release is very well tested.
We encourage you to try it out in your own projects, and open an issue if you find yourself struggling.
Oh, and of course, a star on the GitHub repo would be very much appreciated ;)