AutoDiffOperators.jl

This package provides multiplicative operators that act via automatic differentiation (AD), as well as additional AD-related functionality.

AD-backends are specified via subtypes of ADSelector, which includes ADTypes.AbstractADType. Separate backends for forward- and reverse-mode AD can be specified if desired.
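
For illustration, here is a minimal sketch of constructing such selectors (assuming ForwardDiff is installed; both forms also appear in the backend list further below):

```julia
using AutoDiffOperators
import ADTypes, ForwardDiff

# An ADTypes backend object is itself a valid AD-selector ...
ad = ADTypes.AutoForwardDiff()

# ... and a selector can equivalently be constructed from the AD package's module:
ad = ADSelector(ForwardDiff)
```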

The main functions are with_gradient and with_jacobian. The central lower-level functions are with_jvp and with_vjp_func. Jacobian operators can be implicit (e.g. a LinearMap/FunctionMap or similar) or explicit (i.e. a Matrix).
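
A minimal sketch of how these functions can be used, assuming the call signatures with_gradient(f, x, ad), with_jacobian(f, x, OP, ad) and with_jvp(f, x, z, ad) (see the API reference for the exact forms and supported operator types):

```julia
using AutoDiffOperators
import ADTypes, ForwardDiff

ad = ADTypes.AutoForwardDiff()
x = randn(4)

# Value and gradient of a scalar-valued function:
g(x) = sum(abs2, x)
y, grad = with_gradient(g, x, ad)

# Value and Jacobian of a vector-valued function; requesting Matrix yields
# an explicit Jacobian, other operator types yield implicit ones:
f(x) = [sum(abs2, x), prod(x)]
y, J = with_jacobian(f, x, Matrix, ad)

# Jacobian-vector product J * z without materializing J:
z = randn(4)
y, J_z = with_jvp(f, x, z, ad)
```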

Different Julia packages require function and gradient calculation to be passed in different ways. AutoDiffOperators provides valgrad_func, gradient_func and gradient!_func to cover several popular options.
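
For example (a rough sketch, assuming valgrad_func(f, ad) returns a single function that computes both the value and the gradient of f at a point; the exact contract is documented in the API reference):

```julia
using AutoDiffOperators
import ADTypes, ForwardDiff

g(x) = sum(abs2, x)

# Hypothetical usage: one closure returning (value, gradient), as many
# optimizers expect:
g_and_∇g = valgrad_func(g, ADTypes.AutoForwardDiff())
y, grad = g_and_∇g(randn(3))
```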

AutoDiffOperators natively supports the following automatic differentiation packages as backends (see the usage sketch after this list):

  • FiniteDifferences, selected via ADTypes.AutoFiniteDifferences() or ADSelector(FiniteDifferences).

  • ForwardDiff, selected via ADTypes.AutoForwardDiff() or ADSelector(ForwardDiff).

  • Zygote, selected via ADTypes.AutoZygote() or ADSelector(Zygote).

  • Enzyme, selected via ADTypes.AutoEnzyme() or ADSelector(Enzyme).
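
The selectors listed above can be used interchangeably; a small sketch (here with ForwardDiff and Zygote, assuming both are installed):

```julia
using AutoDiffOperators
import ADTypes, ForwardDiff, Zygote

g(x) = sum(abs2, x)
x = randn(3)

# The same call accepts any supported backend selector:
y, grad = with_gradient(g, x, ADTypes.AutoForwardDiff())
y, grad = with_gradient(g, x, ADSelector(Zygote))
```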

Alternatively, DifferentiationInterface can be used to interface with various AD-backends, by using DiffIfAD(backend::ADTypes.AbstractADType) as the AD-selector.
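
A small sketch of this (assuming ForwardDiff is installed and driven through DifferentiationInterface):

```julia
using AutoDiffOperators
import ADTypes, ForwardDiff
import DifferentiationInterface  # may be required if DiffIfAD is provided via a package extension

g(x) = sum(abs2, x)

# Route AD through DifferentiationInterface by wrapping an ADTypes backend:
ad = DiffIfAD(ADTypes.AutoForwardDiff())
y, grad = with_gradient(g, randn(3), ad)
```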

AutoDiffOperators may not support all options and functionalities of these AD packages. Also, most of them have some limitations regarding which code constructs in the target function and which function argument types they support. Which backend(s) will perform best depends on the target function and the argument size, as well as on the application (J*z vs. z'*J and gradient calculation) and the compute device (CPU vs. GPU). Please see the documentation of the individual AD packages linked above for more details on their capabilities.