AutoDiffOperators.jl

This package provides multiplicative operators that act via automatic differentiation (AD), as well as additional AD-related functionality.

For Jacobians, this package provides the function with_jacobian. It can return implicit Jacobian operators (e.g. a LinearMap/FunctionMap or similar) or explicit Jacobian operators (i.e. a DenseMatrix). The lower-level functions with_jvp, jvp_func, and with_vjp_func are provided as well.
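
A minimal sketch of how with_jacobian might be used (assuming ForwardDiff and LinearMaps are installed; the call signature with_jacobian(f, x, OP, ad) returning (f(x), J) and the supported operator types shown here are assumptions to check against the API reference):

```julia
# Hedged sketch, not verbatim API documentation: the signature
# with_jacobian(f, x, OP, ad) returning (f(x), J), and OP = Matrix (explicit)
# vs. OP = LinearMap (implicit), are assumptions to verify in the API docs.
using AutoDiffOperators, LinearMaps
import ForwardDiff

f(x) = [sum(abs2, x), prod(x)]   # example function R^3 -> R^2
x = [1.0, 2.0, 3.0]
ad = ADSelector(ForwardDiff)

y, J = with_jacobian(f, x, Matrix, ad)     # explicit 2×3 Jacobian matrix
y, J = with_jacobian(f, x, LinearMap, ad)  # implicit multiplicative operator

z = [1.0, -1.0, 0.5]
J * z            # Jacobian-vector product
J' * [1.0, 0.0]  # vector-Jacobian product via the adjoint, if supported by the backend
```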

With respect to gradients, different Julia algorithm packages require function and gradient calculation to be passed in different fashions. AutoDiffOperators provides the gradient-function generators valgrad_func, gradient_func, and gradient!_func to cover several popular options, as well as the direct gradient functions with_gradient and with_gradient!! (see the sketch below).
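
A hedged sketch of the gradient API (the exact signatures shown, including the argument order of with_gradient!!, are assumptions to check against the API reference; ForwardDiff is assumed to be installed):

```julia
# Hedged sketch: assumes valgrad_func(f, ad) returns a combined
# value-and-gradient closure and with_gradient(f, x, ad) returns (f(x), ∇f(x)).
using AutoDiffOperators
import ForwardDiff

f(x) = sum(abs2, x) / 2
x = [1.0, 2.0, 3.0]
ad = ADSelector(ForwardDiff)

f_∇f = valgrad_func(f, ad)
y, δx = f_∇f(x)                  # value and gradient in one call

y, δx = with_gradient(f, x, ad)  # direct value-and-gradient calculation

δx_buf = similar(x)
y, δx = with_gradient!!(f, δx_buf, x, ad)  # may reuse δx_buf; argument order assumed
```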

AD backends are specified via subtypes of ADSelector, which includes ADTypes.AbstractADType. In addition to using subtypes of AbstractADType directly, you can use ADSelector(SomeADModule) (e.g. ADSelector(ForwardDiff)) to select a backend with default options. Separate backends for forward- and reverse-mode AD can be specified via ADSelector(fwd_adtype, rev_adtype); a usage sketch for such a combined selector follows the examples below.

Examples of valid ADSelector choices:

```julia
ADTypes.AutoForwardDiff()
ADSelector(ForwardDiff)
ADSelector(ADTypes.AutoForwardDiff(), ADTypes.AutoMooncake())
ADSelector(ADSelector(ForwardDiff), ADSelector(Mooncake))
ADSelector(ForwardDiff, Mooncake)
```
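
As a hedged illustration, the lower-level functions with_jvp, jvp_func, and with_vjp_func can be combined with a mixed forward/reverse selector like those listed above; the exact signatures, and which backend serves which operation, are assumptions to check against the API reference (ForwardDiff and Mooncake assumed installed):

```julia
# Hedged sketch: a combined selector with separate forward- and reverse-mode
# backends. The assumption here is that Jacobian-vector products lean on the
# forward backend and vector-Jacobian products on the reverse backend.
using AutoDiffOperators
import ForwardDiff, Mooncake

f(x) = [sum(abs2, x), sum(sin, x)]
x = randn(4)
ad = ADSelector(ForwardDiff, Mooncake)

z = randn(4)
y, Jz = with_jvp(f, x, z, ad)     # y = f(x), Jz = J * z
Jz_f = jvp_func(f, x, ad)         # reusable closure z -> J * z
Jz_f(z)

y, vjp = with_vjp_func(f, x, ad)  # vjp(w) computes J' * w (i.e. w' * J)
vjp([1.0, 0.0])
```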

AutoDiffOperators re-exports AbstractADType and NoAutoDiff from ADTypes, so algorithm packages that use AutoDiffOperators do not necessarily need an explicit dependency on ADTypes themselves.

AutoDiffOperators uses DifferentiationInterface internally to interact with the various Julia AD backend packages, adding some specializations and optimizations for type stability and performance. Which backend(s) perform best for a given use case depends on the target function and the argument size, as well as on the operation (J*z vs. z'*J vs. gradient calculation) and the compute device (CPU vs. GPU). Please see the documentation of the individual AD backend packages for details.