API

Documentation

AutoDiffOperators.FwdRevADSelector (Type)
AutoDiffOperators.FwdRevADSelector{Fwd<:ADSelector,Rev<:ADSelector} <: ADSelector

Represents an automatic differentiation backend that forwards forward-mode and reverse-mode AD to two separate selectors fwd::ADSelector and rev::ADSelector.

User code should not instantiate AutoDiffOperators.FwdRevADSelector directly, but use ADSelector(fwd, rev) or ADSelector(fwd = fwd, rev = rev) instead.
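
For example, one might combine ForwardDiff for forward-mode with Zygote for reverse-mode AD (a minimal sketch, assuming both packages are installed):

import ForwardDiff, Zygote
using AutoDiffOperators

ad = ADSelector(ADSelector(ForwardDiff), ADSelector(Zygote))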

AutoDiffOperators.ADSelector (Type)
const ADSelector = Union{
    ADTypes.AbstractADType,
    WrappedADSelector
}

Instances specify an automatic differentiation backend.

Either a subtype of ADTypes.AbstractADType (https://github.com/SciML/ADTypes.jl), or an AD-selector wrapper like AutoDiffOperators.FwdRevADSelector.

AutoDiffOperators currently provides its own implementations for the following AD-selectors: AutoForwardDiff(), AutoFiniteDifferences(), AutoZygote() and AutoEnzyme().

ADSelector (specifically ADTypes.AbstractADType) instances for these backends can be constructed directly from modules and module names (using the AutoDiffOperators default backend parameters):

import ForwardDiff
ADSelector(ForwardDiff)
ADSelector(:ForwardDiff)
ADSelector(Val(:ForwardDiff))
convert(ADSelector, ForwardDiff)
convert(ADSelector, :ForwardDiff)
convert(ADSelector, Val(:ForwardDiff))

Some operations that specifically require forward-mode or reverse-mode AD will only accept a subset of these backends, though.

Alternatively, DifferentiationInterface can be used to interface with various AD backends by using DiffIfAD(backend::ADTypes.AbstractADType) as the AD-selector.
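
A minimal sketch of this route (assuming Zygote is installed; AutoZygote comes from ADTypes, and DiffIfAD may need to be qualified as AutoDiffOperators.DiffIfAD if it is not exported):

using AutoDiffOperators
using ADTypes: AutoZygote
import Zygote

ad = DiffIfAD(AutoZygote())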

Implementation

The following functions must be specialized for subtypes of ADSelector: with_jvp, with_vjp_func.

AutoDiffOperators.supports_structargs should be specialized if applicable.

A default implementation is provided for with_gradient, but specialized implementations may often be more performant.

Selector types that delegate forward-mode and/or reverse-mode AD to other selector types or AD backends should also specialize forward_ad_selector and reverse_ad_selector, as sketched below.
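
As an illustration, a wrapper type that delegates the two modes to separate selectors could specialize these functions as follows (a sketch only; MyFwdRevSelector is hypothetical, and WrappedADSelector is assumed to be an abstract type):

struct MyFwdRevSelector{F<:ADSelector,R<:ADSelector} <: AutoDiffOperators.WrappedADSelector
    fwd::F
    rev::R
end

AutoDiffOperators.forward_ad_selector(ad::MyFwdRevSelector) = ad.fwd
AutoDiffOperators.reverse_ad_selector(ad::MyFwdRevSelector) = ad.rev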

AutoDiffOperators.mulfunc_operator (Function)
AutoDiffOperators.mulfunc_operator(
    ::Type{OP},
    ::Type{T}, sz::Dims{2}, ovp, vop,
    ::Val{sym}, ::Val{herm}, ::Val{posdef}
) where {OP, T<:Real, sym, herm, posdef}

Generates a linear operator object of type OP that supports multiplication with (adjoint) vectors, based on a multiplication function ovp and an adjoint multiplication function vop.

An operator op = mulfunc_operator(OP, T, sz, ovp, vop, Val(sym), Val(herm), Val(posdef)) must show the following behavior:

op isa OP
eltype(op) == T
size(op) == sz
op * x_r == ovp(x_r)
x_l' * op == vop(x_l)
issymmetric(op) == sym
ishermitian(op) == herm
isposdef(op) == posdef

where x_l and x_r are vectors of size sz[1] and sz[2] respectively.
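
For example, the multiplication functions of a concrete symmetric positive-definite matrix could be wrapped into a LinearMaps operator like this (a hedged sketch; A, ovp and vop are purely illustrative):

using AutoDiffOperators, LinearMaps
using LinearAlgebra

A = [2.0 1.0; 1.0 3.0]    # symmetric and positive definite
ovp = x -> A * x          # implements op * x_r
vop = x -> A' * x         # implements x_l' * op (up to adjoint)

op = AutoDiffOperators.mulfunc_operator(
    LinearMap, Float64, (2, 2), ovp, vop,
    Val(true), Val(true), Val(true)
)

op * [1.0, 0.0] ≈ A[:, 1]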

AutoDiffOperators.similar_onehot (Function)
AutoDiffOperators.similar_onehot(A::AbstractArray, ::Type{T}, n::Integer, i::Integer)

Return an array similar to A, but with n elements of type T, all set to zero but the i-th element set to one.
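
For instance (a short sketch of the expected behavior):

A = rand(3)
v = AutoDiffOperators.similar_onehot(A, Float64, 4, 2)
v == [0.0, 1.0, 0.0, 0.0]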

AutoDiffOperators.supports_structargs (Function)
AutoDiffOperators.supports_structargs(ad::ADSelector)::Bool

Returns true if ad supports structured function arguments or false if ad only supports vectors of real numbers.

Since ad may use different backends for forward- and reverse-mode AD, use supports_structargs(forward_ad_selector(ad)) and supports_structargs(reverse_ad_selector(ad)) to check if ad supports structured arguments for the desired operation.
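
A short sketch of such a check (qualified names are used in case these functions are not exported):

import ForwardDiff
using AutoDiffOperators

ad = ADSelector(ForwardDiff)
AutoDiffOperators.supports_structargs(AutoDiffOperators.forward_ad_selector(ad))
AutoDiffOperators.supports_structargs(AutoDiffOperators.reverse_ad_selector(ad))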

AutoDiffOperators.with_gradient!! (Function)
with_gradient!!(f, δx, x, ad::ADSelector)

Returns a tuple (f(x), ∇f(x)) with the gradient ∇f(x) of f at x.

δx may or may not be reused/overwritten and returned as ∇f(x).

The default implementation falls back to with_gradient(f, x, ad); subtypes of ADSelector may specialize with_gradient!! to provide more efficient implementations.
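
For example (a hedged sketch; f and x are purely illustrative):

import ForwardDiff
using AutoDiffOperators

f(x) = sum(abs2, x)
x = rand(3)
δx = similar(x)    # scratch space, may or may not be overwritten

y, ∇fx = with_gradient!!(f, δx, x, ADSelector(ForwardDiff))
y == f(x)
∇fx ≈ 2 .* x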

AutoDiffOperators.with_jacobian (Function)
with_jacobian(f, x, OP, ad::ADSelector)

Returns a tuple (f(x), J) with a multiplicative Jacobian operator J of type OP.

Example:

using AutoDiffOperators, LinearMaps
import ForwardDiff

f(x) = x .^ 2                            # example function
x, z_r, z_l = rand(3), rand(3), rand(3)
ad = ADSelector(ForwardDiff)

y, J = with_jacobian(f, x, LinearMap, ad)
y == f(x)
_, J_explicit = with_jacobian(f, x, Matrix, ad)
J * z_r ≈ J_explicit * z_r
z_l' * J ≈ z_l' * J_explicit

OP may be LinearMaps.LinearMap (resp. LinearMaps.FunctionMap) or Matrix. Other operator types can be supported by specializing AutoDiffOperators.mulfunc_operator for the operator type.

The default implementation of with_jacobian uses jvp_func and with_vjp_func to implement (adjoint) multiplication of J with (adjoint) vectors.
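
Building on the example above, these building blocks can also be used directly (a hedged sketch, assuming jvp_func(f, x, ad) returns a function z -> J * z and with_vjp_func(f, x, ad) returns f(x) together with a vector-Jacobian-product function):

jvp = jvp_func(f, x, ad)
jvp(z_r) ≈ J * z_r

y2, vjp = with_vjp_func(f, x, ad)
vjp(z_l) ≈ J' * z_l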
