API

Modules

Types and constants

Functions and macros

Documentation

AutoDiffOperators.FwdRevADSelector — Type
AutoDiffOperators.FwdRevADSelector{Fwd<:ADSelector,Rev<:ADSelector} <: ADSelector

Represents an automatic differentiation backend that delegates forward-mode and reverse-mode AD to two separate selectors fwd::ADSelector and rev::ADSelector.

User code should not instantiate AutoDiffOperators.FwdRevADSelector directly, but use ADSelector(fwd, rev) or ADSelector(fwd = fwd, rev = rev) instead.
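A minimal usage sketch (AutoForwardDiff and AutoZygote are purely illustrative choices; any forward-mode and reverse-mode AbstractADType pair works):

```julia
using AutoDiffOperators, ADTypes

# Combine a forward-mode and a reverse-mode backend into a single selector:
ad = ADSelector(ADTypes.AutoForwardDiff(), ADTypes.AutoZygote())

forward_adtype(ad)  # the AutoForwardDiff backend
reverse_adtype(ad)  # the AutoZygote backend
```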

source
AutoDiffOperators.ADSelector — Type
const ADSelector = Union{
    AbstractADType,
    WrappedADSelector
}

Instances specify an automatic differentiation backend.

Either a subtype of ADTypes.AbstractADType, or an AD-selector wrapper like AutoDiffOperators.FwdRevADSelector.

In addition to using instances of AbstractADType directly (e.g. ADTypes.AutoForwardDiff()), ADSelector (specifically AbstractADType) instances for AD backends can be constructed directly from the backend modules (using default backend parameters):

import ForwardDiff

ADTypes.AutoForwardDiff()
ADSelector(ForwardDiff)
convert(ADSelector, ForwardDiff)

all construct an identical AutoForwardDiff object.

Separate AD backends for forward- and reverse-mode AD can be specified via ADSelector(fwd_adtype, rev_adtype), e.g.

import ForwardDiff, Mooncake

ADSelector(ADTypes.AutoForwardDiff(), ADTypes.AutoMooncake())
ADSelector(ADSelector(ForwardDiff), ADSelector(Mooncake))
ADSelector(ForwardDiff, Mooncake)

Implementation

ADSelector instances can also be constructed from module names, though this should be avoided in end-user code:

ADSelector(:ForwardDiff)
ADSelector(Val(:ForwardDiff))
convert(ADSelector, :ForwardDiff)
convert(ADSelector, Val(:ForwardDiff))

End-users should pass module objects instead of module names, which guarantees that the respective AD backend package is part of their environment/dependencies.

source
AutoDiffOperators.forward_adtype — Function
forward_adtype(ad::ADSelector)::ADTypes.AbstractADType

Returns the forward-mode AD backend selector for ad.

Returns ad itself by default if ad supports forward-mode automatic differentiation, or an instance of ADTypes.NoAutoDiff if it does not.

May be specialized for some AD selector types, see FwdRevADSelector, for example.
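For example (a sketch; AutoForwardDiff is a forward-mode backend, so it is returned unchanged):

```julia
using AutoDiffOperators, ADTypes

# A forward-mode backend is its own forward selector:
forward_adtype(ADTypes.AutoForwardDiff())  # an AutoForwardDiff instance
```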

source
AutoDiffOperators.mulfunc_operator — Function
AutoDiffOperators.mulfunc_operator(
    ::Type{OP},
    ::Type{T}, sz::Dims{2}, ovp, vop,
    ::Val{sym}, ::Val{herm}, ::Val{posdef}
) where {OP, T<:Real, sym, herm, posdef}

Generates a linear operator object of type OP that supports multiplication with (adjoint) vectors, based on a multiplication function ovp and an adjoint multiplication function vop.

An operator op = mulfunc_operator(OP, T, sz, ovp, vop, Val(sym), Val(herm), Val(posdef)) must show the following behavior:

op isa OP
eltype(op) == T
size(op) == sz
op * x_r == ovp(x_r)
x_l' * op == vop(x_l)
issymmetric(op) == sym
ishermitian(op) == herm
isposdef(op) == posdef

where x_l and x_r are vectors of size sz[1] and sz[2] respectively.
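As an illustration, a hand-written (non-AD-derived) operator can be built the same way. This sketch assumes LinearMaps is available and that mulfunc_operator is specialized for LinearMaps.LinearMap, as used by with_jacobian; the operator chosen here is simply 2*I on a 2-dimensional space:

```julia
using AutoDiffOperators, LinearMaps

# Multiplication functions for the operator 2*I (symmetric, hermitian, posdef):
ovp(x) = 2 .* x   # op * x
vop(y) = 2 .* y   # adjoint multiplication

op = AutoDiffOperators.mulfunc_operator(
    LinearMap, Float64, (2, 2), ovp, vop,
    Val(true), Val(true), Val(true)
)

op * [1.0, 2.0]  # == [2.0, 4.0]
```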

source
AutoDiffOperators.only_gradient!! — Function
only_gradient!!(f, δx, x::AbstractVector{<:Number}, ad::ADSelector)

Returns the gradient ∇f(x) of f at x.

δx may or may not be reused/overwritten and returned as ∇f(x).
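A sketch using ForwardDiff (any capable backend works; the `!!` convention means δx serves only as possibly-reused scratch space):

```julia
using AutoDiffOperators
import ForwardDiff

f(x) = sum(abs2, x)           # ∇f(x) = 2x
x = [1.0, 2.0, 3.0]
δx = similar(x)               # scratch space, possibly overwritten

grad = only_gradient!!(f, δx, x, ADSelector(ForwardDiff))
# grad ≈ [2.0, 4.0, 6.0]
```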

source
AutoDiffOperators.reverse_adtype — Function
reverse_adtype(ad::ADSelector)::ADTypes.AbstractADType

Returns the reverse-mode AD backend selector for ad.

Returns ad itself by default if ad supports reverse-mode automatic differentiation, or an instance of ADTypes.NoAutoDiff if it does not.

May be specialized for some AD selector types, see FwdRevADSelector, for example.

source
AutoDiffOperators.similar_onehot — Function
AutoDiffOperators.similar_onehot(A::AbstractArray, ::Type{T}, n::Integer, i::Integer)

Returns an array similar to A, but with n elements of type T, all set to zero except the i-th element, which is set to one.
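For example (a sketch; note that n need not equal the length of A):

```julia
using AutoDiffOperators

A = zeros(3)
v = AutoDiffOperators.similar_onehot(A, Float64, 4, 2)
# v == [0.0, 1.0, 0.0, 0.0]
```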

source
AutoDiffOperators.with_gradient! — Function
with_gradient!(f, δx, x::AbstractVector{<:Number}, ad::ADSelector)

Fills δx with the gradient ∇f(x) of f at x and returns the tuple (f(x), δx).
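A sketch using ForwardDiff as the backend (any capable backend works):

```julia
using AutoDiffOperators
import ForwardDiff

f(x) = sum(abs2, x)           # f(x) = |x|², ∇f(x) = 2x
x = [1.0, 2.0]
δx = similar(x)

y, grad = with_gradient!(f, δx, x, ADSelector(ForwardDiff))
# y == 5.0; grad is δx, now filled with [2.0, 4.0]
```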

source
AutoDiffOperators.with_gradient!! — Function
with_gradient!!(f, δx, x::AbstractVector{<:Number}, ad::ADSelector)

Returns a tuple (f(x), ∇f(x)) with the gradient ∇f(x) of f at x.

δx may or may not be reused/overwritten and returned as ∇f(x).

source
AutoDiffOperators.with_jacobian — Function
with_jacobian(f, x::AbstractVector{<:Number}, OP, ad::ADSelector)

Returns a tuple (f(x), J) with a multiplicative Jacobian operator J of type OP.

Example:

using AutoDiffOperators, LinearMaps
y, J = with_jacobian(f, x, LinearMap, ad)
y == f(x)
_, J_explicit = with_jacobian(f, x, DenseMatrix, ad)
J * z_r ≈ J_explicit * z_r
z_l' * J ≈ z_l' * J_explicit

OP may be LinearMaps.LinearMap (resp. LinearMaps.FunctionMap) or Matrix. Other operator types can be supported by specializing mulfunc_operator for the operator type.

The default implementation of with_jacobian uses jvp_func and with_vjp_func to implement (adjoint) multiplication of J with (adjoint) vectors.

source
AutoDiffOperators.with_jvp — Function
with_jvp(f, x::AbstractVector{<:Number}, z::AbstractVector{<:Number}, ad::ADSelector)

Returns a tuple (f(x), J * z).
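A sketch using ForwardDiff (any backend supporting forward-mode AD works); the Jacobian-vector product is computed without materializing J:

```julia
using AutoDiffOperators
import ForwardDiff

f(x) = x .^ 2                 # Jacobian at x is Diagonal(2 .* x)
x = [1.0, 2.0]
z = [1.0, 1.0]

y, Jz = with_jvp(f, x, z, ADSelector(ForwardDiff))
# y == [1.0, 4.0], Jz ≈ [2.0, 4.0]
```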

source