API
Modules
Types and constants
AutoDiffOperators.ADSelector
AutoDiffOperators.DiffIfAD
AutoDiffOperators.FwdRevADSelector
AutoDiffOperators.WrappedADSelector
Functions and macros
AutoDiffOperators.forward_ad_selector
AutoDiffOperators.gradient!_func
AutoDiffOperators.gradient_func
AutoDiffOperators.jvp_func
AutoDiffOperators.mulfunc_operator
AutoDiffOperators.only_gradient
AutoDiffOperators.reverse_ad_selector
AutoDiffOperators.similar_onehot
AutoDiffOperators.supports_structargs
AutoDiffOperators.valgrad_func
AutoDiffOperators.with_floatlike_contents
AutoDiffOperators.with_gradient
AutoDiffOperators.with_gradient!!
AutoDiffOperators.with_jacobian
AutoDiffOperators.with_jvp
AutoDiffOperators.with_vjp_func
Documentation
AutoDiffOperators.AutoDiffOperators
— ModuleAutoDiffOperators
Provides Julia operators that act via automatic differentiation.
AutoDiffOperators.DiffIfAD
— TypeDiffIfAD{Fwd<:ADSelector,Rev<:ADSelector} <: ADSelector
Uses DifferentiationInterface to interface with an AD-backend.
Constructor: DiffIfAD(backend::ADTypes.AbstractADType)
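For illustration, a minimal sketch (not part of the package documentation) assuming ForwardDiff is installed and used as the underlying backend:
import ForwardDiff
using AutoDiffOperators, ADTypes
ad = DiffIfAD(AutoForwardDiff())
f(x) = sum(abs2, x)  # example objective, chosen for illustration
y, δx = with_gradient(f, [1.0, 2.0, 3.0], ad)
y ≈ 14.0
δx ≈ [2.0, 4.0, 6.0]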
AutoDiffOperators.FwdRevADSelector
— TypeAutoDiffOperators.FwdRevADSelector{Fwd<:ADSelector,Rev<:ADSelector} <: ADSelector
Represents an automatic differentiation backend that forwards forward-mode and reverse-mode AD to two separate selectors fwd::ADSelector and rev::ADSelector.
User code should not instantiate AutoDiffOperators.FwdRevADSelector
directly, but use ADSelector(fwd, rev)
or ADSelector(fwd = fwd, rev = rev)
instead.
AutoDiffOperators.WrappedADSelector
— Typeabstract type AutoDiffOperators.WrappedADSelector
Supertype for AD selectors that wrap other AD selectors.
AutoDiffOperators.ADSelector
— Typeconst ADSelector = Union{
ADTypes.AbstractADType,
WrappedADSelector
}
Instances specify an automatic differentiation backend.
Either a subtype of ADTypes.AbstractADType (see https://github.com/SciML/ADTypes.jl), or an AD-selector wrapper like AutoDiffOperators.FwdRevADSelector.
AutoDiffOperators currently provides its own implementations for the following AD-selectors: AutoForwardDiff(), AutoFiniteDifferences(), AutoZygote() and AutoEnzyme().
ADSelector (specifically ADTypes.AbstractADType) instances for these backends can be constructed directly from modules and module names (using AutoDiffOperators' default backend parameters):
import ForwardDiff
ADSelector(ForwardDiff)
ADSelector(:ForwardDiff)
ADSelector(Val(:ForwardDiff))
convert(ADSelector, ForwardDiff)
convert(ADSelector, :ForwardDiff)
convert(ADSelector, Val(:ForwardDiff))
Note, however, that operations which specifically require forward-mode or reverse-mode AD will only accept a subset of these backends.
Alternatively, DifferentiationInterface can be used to interface with various AD-backends, by using DiffIfAD(backend::ADTypes.AbstractADType) as the AD-selector.
Implementation
The following functions must be specialized for subtypes of ADSelector
: with_jvp
, with_vjp_func
.
AutoDiffOperators.supports_structargs
should be specialized if applicable.
A default implementation is provided for with_gradient
, but specialized implementations may often be more performant.
Selector types that delegate forward- and/or reverse-mode AD to other selector types or AD-backends should also specialize forward_ad_selector and reverse_ad_selector.
AutoDiffOperators.forward_ad_selector
— Functionforward_ad_selector(ad::ADSelector)::ADSelector
Returns the forward-mode AD backend selector for ad.
Returns ad
itself by default. Also see FwdRevADSelector
.
AutoDiffOperators.gradient!_func
— Functiongradient!_func(f, ad::ADSelector)
Returns a tuple (f, ∇f!)
with the functions f(x)
and ∇f!(δx, x)
.
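For illustration, a minimal sketch (names and values are illustrative) assuming ForwardDiff as the backend; ∇f! presumably fills δx with the gradient of f at x:
import ForwardDiff
using AutoDiffOperators
f(x) = sum(abs2, x)
fc, ∇f! = gradient!_func(f, ADSelector(ForwardDiff))
δx = zeros(3)
∇f!(δx, [1.0, 2.0, 3.0])
δx ≈ [2.0, 4.0, 6.0]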
AutoDiffOperators.gradient_func
— Functiongradient_func(f, ad::ADSelector)
Returns a tuple (f, ∇f)
with the functions f(x)
and ∇f(x)
.
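A corresponding out-of-place sketch (illustrative, ForwardDiff assumed):
import ForwardDiff
using AutoDiffOperators
f(x) = sum(abs2, x)
fc, ∇f = gradient_func(f, ADSelector(ForwardDiff))
∇f([1.0, 2.0, 3.0]) ≈ [2.0, 4.0, 6.0]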
AutoDiffOperators.jvp_func
— Methodjvp_func(f, x, ad::ADSelector)
Returns a function jvp
with jvp(z) == J * z
.
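For illustration (a sketch assuming ForwardDiff; here J is the Jacobian of f at x):
import ForwardDiff
using AutoDiffOperators
f(x) = x .^ 2
x = [1.0, 2.0, 3.0]
jvp = jvp_func(f, x, ADSelector(ForwardDiff))
jvp([1.0, 0.0, 0.0]) ≈ [2.0, 0.0, 0.0]  # first column of J, which is diagonal with entries 2 .* x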
AutoDiffOperators.mulfunc_operator
— FunctionAutoDiffOperators.mulfunc_operator(
::Type{OP},
::Type{T}, sz::Dims{2}, ovp, vop,
::Val{sym}, ::Val{herm}, ::Val{posdef}
) where {OP, T<:Real, sym, herm, posdef}
Generates a linear operator object of type OP that supports multiplication with (adjoint) vectors, based on a multiplication function ovp and an adjoint multiplication function vop.
An operator op = mulfunc_operator(OP, T, sz, ovp, vop, Val(sym), Val(herm), Val(posdef)) must show the following behavior:
op isa OP
eltype(op) == T
size(op) == sz
op * x_r == ovp(x_r)
x_l' * op == vop(x_l)
issymmetric(op) == sym
ishermitian(op) == herm
isposdef(op) == posdef
where x_l
and x_r
are vectors of size sz[1]
and sz[2]
respectively.
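A hedged sketch of this contract, assuming the LinearMaps-based support implied by with_jacobian (here the operator simply scales vectors by 2, so it is symmetric, Hermitian and positive definite):
using AutoDiffOperators, LinearMaps
ovp(z) = 2 .* z  # operator-vector product
vop(z) = 2 .* z  # adjoint (vector-operator) product
op = AutoDiffOperators.mulfunc_operator(
    LinearMap, Float64, (3, 3), ovp, vop,
    Val(true), Val(true), Val(true)
)
op * [1.0, 0.0, 0.0] ≈ [2.0, 0.0, 0.0]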
AutoDiffOperators.only_gradient
— Functiononly_gradient(f, x, ad::ADSelector)
Returns the gradient ∇f(x) of f
at x
.
See also with_gradient(f, x, ad)
.
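For illustration (a sketch, ForwardDiff assumed):
import ForwardDiff
using AutoDiffOperators
f(x) = sum(abs2, x)
only_gradient(f, [1.0, 2.0, 3.0], ADSelector(ForwardDiff)) ≈ [2.0, 4.0, 6.0]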
AutoDiffOperators.reverse_ad_selector
— Functionreverse_ad_selector(ad::ADSelector)::ADSelector
Returns the reverse-mode AD backend selector for ad.
Returns ad
itself by default. Also see FwdRevADSelector
.
AutoDiffOperators.similar_onehot
— FunctionAutoDiffOperators.similar_onehot(A::AbstractArray, ::Type{T}, n::Integer, i::Integer)
Returns an array similar to A, but with n elements of type T, all set to zero except for the i-th element, which is set to one.
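For illustration (a sketch based on the description above):
using AutoDiffOperators
AutoDiffOperators.similar_onehot([1.0, 2.0, 3.0], Float64, 4, 2) == [0.0, 1.0, 0.0, 0.0]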
AutoDiffOperators.supports_structargs
— FunctionAutoDiffOperators.supports_structargs(ad::ADSelector)::Bool
Returns true
if ad
supports structured function arguments or false
if ad
only supports vectors of real numbers.
Since ad
may use different backends for forward- and reverse-mode AD, use supports_structargs(forward_ad_selector(ad))
and supports_structargs(reverse_ad_selector(ad))
to check if ad
supports structured arguments for the desired operation.
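For illustration (a sketch, assuming ForwardDiff and Zygote are installed; the returned values depend on the backends):
import ForwardDiff, Zygote
using AutoDiffOperators
ad = ADSelector(fwd = ADSelector(ForwardDiff), rev = ADSelector(Zygote))
AutoDiffOperators.supports_structargs(forward_ad_selector(ad))
AutoDiffOperators.supports_structargs(reverse_ad_selector(ad))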
AutoDiffOperators.valgrad_func
— Functionvalgrad_func(f, ad::ADSelector)
Returns a function f_∇f
that calculates the value and gradient of f
at given points, so that f_∇f(x)
is equivalent to with_gradient(f, x, ad)
.
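For illustration (a sketch, ForwardDiff assumed):
import ForwardDiff
using AutoDiffOperators
f(x) = sum(abs2, x)
f_∇f = valgrad_func(f, ADSelector(ForwardDiff))
f_∇f([1.0, 2.0, 3.0])  # ≈ (14.0, [2.0, 4.0, 6.0])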
AutoDiffOperators.with_floatlike_contents
— FunctionAutoDiffOperators.with_floatlike_contents(A::AbstractArray)
If the elements of A
are integer-like, convert them using float
, otherwise return A
unchanged.
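For illustration (a sketch based on the description above):
using AutoDiffOperators
AutoDiffOperators.with_floatlike_contents([1, 2, 3]) == [1.0, 2.0, 3.0]
AutoDiffOperators.with_floatlike_contents([1.5, 2.5])  # returned unchanged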
AutoDiffOperators.with_gradient
— Functionwith_gradient(f, x, ad::ADSelector)
Returns a tuple (f(x), ∇f(x)) with the gradient ∇f(x) of f
at x
.
See also with_gradient!!(f, δx, x, ad)
for the "maybe-in-place" variant of this function.
AutoDiffOperators.with_gradient!!
— Functionwith_gradient!!(f, δx, x, ad::ADSelector)
Returns a tuple (f(x), ∇f(x)) with the gradient ∇f(x) of f at x.
δx may or may not be reused/overwritten and returned as ∇f(x).
The default implementation falls back to with_gradient(f, x, ad), but subtypes of ADSelector may specialize with_gradient!! to provide more efficient implementations.
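For illustration (a sketch, ForwardDiff assumed); note that the gradient must be taken from the returned value, not from δx:
import ForwardDiff
using AutoDiffOperators
f(x) = sum(abs2, x)
δx = zeros(3)
y, ∇fx = with_gradient!!(f, δx, [1.0, 2.0, 3.0], ADSelector(ForwardDiff))
∇fx ≈ [2.0, 4.0, 6.0]  # δx may or may not have been overwritten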
AutoDiffOperators.with_jacobian
— Functionwith_jacobian(f, x, OP, ad::ADSelector)
Returns a tuple (f(x), J)
with a multiplicative Jacobian operator J
of type OP
.
Example:
using AutoDiffOperators, LinearMaps
y, J = with_jacobian(f, x, LinearMap, ad)
y == f(x)
_, J_explicit = with_jacobian(f, x, Matrix, ad)
J * z_r ≈ J_explicit * z_r
z_l' * J ≈ z_l' * J_explicit
OP
may be LinearMaps.LinearMap
(resp. LinearMaps.FunctionMap
) or Matrix
. Other operator types can be supported by specializing AutoDiffOperators.mulfunc_operator
for the operator type.
The default implementation of with_jacobian
uses jvp_func
and with_vjp_func
to implement (adjoint) multiplication of J
with (adjoint) vectors.
AutoDiffOperators.with_jvp
— Functionwith_jvp(f, x, z, ad::ADSelector)
Returns a tuple (f(x), J * z)
.
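For illustration (a sketch, ForwardDiff assumed; J is the Jacobian of f at x):
import ForwardDiff
using AutoDiffOperators
f(x) = x .^ 2
y, Jz = with_jvp(f, [1.0, 2.0, 3.0], [1.0, 1.0, 1.0], ADSelector(ForwardDiff))
Jz ≈ [2.0, 4.0, 6.0]  # J is diagonal with entries 2 .* x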
AutoDiffOperators.with_vjp_func
— Functionwith_vjp_func(f, x, ad::ADSelector)
Returns a tuple (f(x), vjp)
with the function vjp(z) ≈ J' * z
.
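For illustration (a sketch, assuming Zygote as a reverse-mode-capable backend; J is the Jacobian of f at x):
import Zygote
using AutoDiffOperators
f(x) = x .^ 2
y, vjp = with_vjp_func(f, [1.0, 2.0, 3.0], ADSelector(Zygote))
vjp([1.0, 1.0, 1.0]) ≈ [2.0, 4.0, 6.0]  # J' * z, with J diagonal with entries 2 .* x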