API reference · ImplicitDifferentiation.jl

API reference

Index

Docstrings

ImplicitDifferentiation.AbstractLinearSolverType
AbstractLinearSolver

All linear solvers used within an ImplicitFunction must satisfy this interface.

It can be useful to roll your own solver if you need finer-grained control over convergence, speed, or behavior in case of singularity. Check out the source code of IterativeLinearSolver and DirectLinearSolver for implementation examples.

Required methods

  • presolve(linear_solver, A, y): Returns a matrix-like object A such that solving several linear systems with different right-hand sides b is cheap (a typical example is an LU factorization).
  • solve(linear_solver, A, b): Returns a vector x satisfying Ax = b. If the linear system has not been solved to satisfaction, every element of x should be a NaN of the appropriate floating point type.
source
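
For orientation, here is a minimal sketch of a custom solver satisfying this interface. The name MyQRSolver and the QR strategy are purely hypothetical; only the presolve/solve contract comes from the docstring above.

using LinearAlgebra
using ImplicitDifferentiation
import ImplicitDifferentiation: presolve, solve

struct MyQRSolver <: ImplicitDifferentiation.AbstractLinearSolver end

# Factorize A once (assuming it can be materialized as a dense matrix)
# so that repeated solves with different b are cheap
presolve(::MyQRSolver, A, y) = qr(Matrix(A))

# Return x with Ax = b, or a vector of NaNs if the result is not finite
function solve(::MyQRSolver, A, b)
    x = A \ b
    return all(isfinite, x) ? x : fill(convert(eltype(x), NaN), size(x))
end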
ImplicitDifferentiation.DirectLinearSolverType
DirectLinearSolver

An implementation of AbstractLinearSolver using the built-in backslash operator.

Fields

  • verbose::Bool: Whether to print a warning when the solver fails (defaults to true)
source
ImplicitDifferentiation.ImplicitFunctionType
ImplicitFunction{F,C,L,B}

Wrapper for an implicit function defined by a forward mapping y and a set of conditions c.

An ImplicitFunction object behaves like a function, and every call is differentiable with respect to the first argument x. When a derivative is queried, the Jacobian of y is computed using the implicit function theorem:

∂/∂y c(x, y(x)) * ∂/∂x y(x) = -∂/∂x c(x, y(x))

This requires solving a linear system A * J = -B where A = ∂c/∂y, B = ∂c/∂x and J = ∂y/∂x.

Fields

  • forward::F: a callable, does not need to be compatible with automatic differentiation
  • conditions::C: a callable, must be compatible with automatic differentiation
  • linear_solver::L: a subtype of AbstractLinearSolver, defines how the linear system will be solved
  • conditions_backend::B: either nothing or a subtype of AbstractDifferentiation.AbstractBackend, defines how the conditions will be differentiated within the implicit function theorem

There are two possible signatures for forward and conditions, which must be consistent with one another:

  1. Standard: forward(x, args...; kwargs...) = y and conditions(x, y, args...; kwargs...) = c
  2. Byproduct: forward(x, args...; kwargs...) = (y, z) and conditions(x, y, z, args...; kwargs...) = c

In both cases, x, y and c must be arrays, with size(y) = size(c). In the second case, the byproduct z can be an arbitrary object generated by forward. The positional arguments args... and keyword arguments kwargs... must be the same for both forward and conditions.

Warning

The byproduct z and the other positional arguments args... beyond x are considered constant for differentiation purposes.

source
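
For concreteness, here is a minimal sketch of both signatures, using the componentwise square root from the introduction as the forward mapping; the byproduct z below is just a placeholder.

using ImplicitDifferentiation

# 1. Standard signature
forward(x) = sqrt.(x)
conditions(x, y) = y .^ 2 .- x              # zero exactly when y = sqrt.(x)
implicit = ImplicitFunction(forward, conditions)

# 2. Byproduct signature: z comes out of forward and is treated as constant
forward_byproduct(x) = (sqrt.(x), nothing)
conditions_byproduct(x, y, z) = y .^ 2 .- x
implicit_byproduct = ImplicitFunction(forward_byproduct, conditions_byproduct)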
ImplicitDifferentiation.ImplicitFunctionMethod
(implicit::ImplicitFunction)(x::AbstractArray, args...; kwargs...)

Return implicit.forward(x, args...; kwargs...), which can be either an array y or a tuple (y, z).

This call is differentiable.

source
ImplicitDifferentiation.ImplicitFunctionMethod
ImplicitFunction(
)

Construct an ImplicitFunction with default parameters.

source
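
As a usage sketch, reusing the forward and conditions pair from the ImplicitFunction docstring above, with the keyword demonstrated in the introduction example:

implicit = ImplicitFunction(forward, conditions)   # default parameters
implicit_direct = ImplicitFunction(forward, conditions; linear_solver=DirectLinearSolver())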
ChainRulesCore.rruleFunction
rrule(rc, implicit, x, args...; kwargs...)

Custom reverse rule for an ImplicitFunction, to ensure compatibility with reverse mode autodiff.

On Julia ≥ 1.9, this rule is provided by a package extension and only becomes available once ChainRulesCore.jl is loaded; on earlier Julia versions it is always available.

We compute the vector-Jacobian product Jᵀv by solving Aᵀu = v and setting Jᵀv = -Bᵀu. Positional and keyword arguments are passed to both implicit.forward and implicit.conditions.

source
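
As a usage sketch (assuming Zygote.jl, which is ChainRules-compatible), reverse mode then goes through this rule automatically:

using Zygote

x = rand(2)
J = Zygote.jacobian(implicit, x)[1]  # each pullback solves Aᵀu = v and returns -Bᵀu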
Introduction · ImplicitDifferentiation.jl

Introduction

We explain the basics of our package on a simple function that is not amenable to naive automatic differentiation.

using ForwardDiff
using ImplicitDifferentiation
using LinearAlgebra
using Random
implicit_higher_order = ImplicitFunction(
    forward, conditions; linear_solver=DirectLinearSolver()
)
ImplicitFunction(forward, conditions, DirectLinearSolver(true), nothing)

Then the Jacobian itself is differentiable.

h = rand(2)
J_Z(t) = Zygote.jacobian(implicit_higher_order, x .+ t .* h)[1]
ForwardDiff.derivative(J_Z, 0) ≈ Diagonal((-0.25 .* h) ./ (x .^ 1.5))
true

This page was generated using Literate.jl.

Basic use cases · ImplicitDifferentiation.jl

Basic use cases

We show how to differentiate through very common routines:

  • an unconstrained optimization problem
  • a nonlinear system of equations
  • a fixed point iteration
using ForwardDiff
using ImplicitDifferentiation
using LinearAlgebra
using NLsolve
try
    Zygote.jacobian(_x -> forward_fixedpoint(_x; iterations=10), x)[1]
catch e
    e
end
ErrorException("Mutating arrays is not supported -- called copyto!(Vector{Float64}, ...)\nThis error occurs when you ask Zygote to differentiate operations that change\nthe elements of arrays in place (e.g. setting values with x .= ...)\n\nPossible fixes:\n- avoid mutating operations (preferred)\n- or read the documentation and solutions for this error\n  https://fluxml.ai/Zygote.jl/latest/limitations\n")

This page was generated using Literate.jl.

Advanced use cases · ImplicitDifferentiation.jl

Advanced use cases

We dive into more advanced applications of implicit differentiation:

  • constrained optimization problems
using ForwardDiff
using ImplicitDifferentiation
using LinearAlgebra
using Optim
end
conditions_cstr_optim (generic function with 1 method)

We now have all the ingredients to construct our implicit function.

implicit_cstr_optim = ImplicitFunction(forward_cstr_optim, conditions_cstr_optim)
ImplicitFunction(forward_cstr_optim, conditions_cstr_optim, IterativeLinearSolver(true), nothing)
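
The trailing IterativeLinearSolver(true) above is the default; here is a sketch of opting into the direct solver instead, with the same keyword as in the introduction:

implicit_cstr_optim_direct = ImplicitFunction(
    forward_cstr_optim, conditions_cstr_optim; linear_solver=DirectLinearSolver()
)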

And indeed, it behaves as it should when we call it:

x = rand(2) .+ [0, 1]
2-element Vector{Float64}:
  0.22442135286865494
  1.3267275094228514

The second component of $x$ is $> 1$, so its square root will be thresholded to one, and the corresponding derivative will be $0$.

implicit_cstr_optim(x) .^ 2
2-element Vector{Float64}:
 0.22442135289492518
 0.999999999578276
J_thres = Diagonal([0.5 / sqrt(x[1]), 0])
2×2 LinearAlgebra.Diagonal{Float64, Vector{Float64}}:
  1.05545   ⋅ 
   ⋅       0.0

Forward mode autodiff

ForwardDiff.jacobian(implicit_cstr_optim, x)
2×2 Matrix{Float64}:
  1.05545  0.0
try
    Zygote.jacobian(forward_cstr_optim, x)[1]
catch e
    e
end
Zygote.CompileError(Tuple{typeof(Optim.optimize), NLSolversBase.OnceDifferentiable{Float64, Vector{Float64}, Vector{Float64}}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Optim.Fminbox{Optim.GradientDescent{LineSearches.InitialPrevious{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Optim.var"#13#15"}, Float64, Optim.var"#49#51"}, Optim.Options{Float64, Nothing}}, ErrorException("try/catch is not supported.\nRefer to the Zygote documentation for fixes.\nhttps://fluxml.ai/Zygote.jl/latest/limitations\n"))

This page was generated using Literate.jl.

Tricks · ImplicitDifferentiation.jl

Tricks

We demonstrate several features that may come in handy for some users.

using ComponentArrays
using ForwardDiff
using ImplicitDifferentiation
using Krylov
  0.0      0.0       0.69831
Zygote.jacobian(first ∘ implicit_byproduct, x)[1]
3×3 Matrix{Float64}:
   1.17334  -0.0       -0.0
  -0.0       0.759508  -0.0
  -0.0      -0.0        0.69831

This page was generated using Literate.jl.

FAQ · ImplicitDifferentiation.jl

Frequently Asked Questions

Supported autodiff backends

To differentiate an ImplicitFunction, the following backends are supported.

Backend                  | Forward mode | Reverse mode
ForwardDiff.jl           | yes          | -
ChainRules.jl-compatible | yes          | soon
Enzyme.jl                | someday      | someday

By default, the conditions are differentiated with the same backend as the ImplicitFunction that contains them. However, this can be switched to any backend compatible with AbstractDifferentiation.jl (i.e. a subtype of AD.AbstractBackend). You can specify it with the conditions_backend keyword argument when constructing an ImplicitFunction.

Warning

At the moment, conditions_backend can only be nothing or AD.ForwardDiffBackend(). We are investigating why the other backends fail.

Input and output types

Arrays

Functions that eat or spit out arbitrary arrays are supported, as long as the forward mapping and conditions return arrays of the same size.

If the output is a small array (say, less than 100 elements), consider using StaticArrays.jl for increased performance.

Scalars

Functions that eat or spit out a single number are not supported. The forward mapping and conditions need arrays: instead of returning val you should return [val] (a 1-element Vector). Or better yet, wrap it in a static vector: SVector(val).
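
For instance, a sketch of a scalar square root wrapped in 1-element static vectors (both input and output are arrays):

using StaticArrays

forward_scalar(x) = SVector(sqrt(x[1]))
conditions_scalar(x, y) = SVector(y[1]^2 - x[1])
implicit_scalar = ImplicitFunction(forward_scalar, conditions_scalar)
implicit_scalar(SVector(4.0))  # ≈ [2.0]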

Sparse arrays

Danger

Sparse arrays are not supported and might give incorrect values or NaNs!

Number of inputs and outputs

Most of the documentation is written for the simple case where the forward mapping is x -> y, i.e. one input and one output. What can you do to handle multiple inputs or outputs? Well, it depends on whether you want their derivatives or not.

                 | Derivatives needed                | Derivatives not needed
Multiple inputs  | Make x a ComponentVector          | Supply args and kwargs to forward
Multiple outputs | Make y and c two ComponentVectors | Let forward return a byproduct

We now detail each of these options.

Multiple inputs or outputs | Derivatives needed

Say your forward mapping takes multiple input arrays and returns multiple output arrays, such that you want derivatives for all of them.

The trick is to leverage ComponentArrays.jl to wrap all the inputs inside a single ComponentVector, and do the same for all the outputs. See the examples for a demonstration, and the sketch after the warning below.

Warning

You may run into issues trying to differentiate through the ComponentVector constructor. For instance, Zygote.jl will throw ERROR: Mutating arrays is not supported. Check out this issue for a dirty workaround involving custom chain rules for the constructor.
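
Here is a self-contained sketch of the pattern on a toy problem, with two inputs a and b packed into one ComponentVector (the problem itself is illustrative):

using ComponentArrays
using ImplicitDifferentiation

# Find y such that a .* y .+ b = 0, i.e. y = -b ./ a
forward(x) = -x.b ./ x.a
conditions(x, y) = x.a .* y .+ x.b
implicit = ImplicitFunction(forward, conditions)

x = ComponentVector(; a=[2.0, 4.0], b=[1.0, 1.0])
implicit(x)  # ≈ [-0.5, -0.25]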

Multiple inputs | Derivatives not needed

If your forward mapping (or conditions) takes multiple inputs but you don't care about derivatives, then you can add further positional and keyword arguments beyond x. It is important to make sure that the forward mapping and conditions accept the same set of arguments, even if each of these functions only uses a subset of them.

forward(x, arg1, arg2; kwarg1, kwarg2) = y
conditions(x, y, arg1, arg2; kwarg1, kwarg2) = c

All of the positional and keyword arguments apart from x will get zero tangents during differentiation of the implicit function.

Multiple outputs | Derivatives not needed

The last and most tricky situation is when your forward mapping returns multiple outputs, but you only care about some of their derivatives. Then, you need to group the objects you don't want to differentiate into a "byproduct" z, returned alongside the actual output y. This way, derivatives of z will not be computed: the byproduct is considered constant during differentiation.

The signatures of your functions will need to be slightly different from the previous cases:

forward(x, arg1, arg2; kwarg1, kwarg2) = (y, z)
conditions(x, y, z, arg1, arg2; kwarg1, kwarg2) = c

See the examples for a demonstration.

This is mainly useful when the solution procedure creates objects such as Jacobians, which we want to reuse when computing or differentiating the conditions. In that case, you may want to write the conditions differentiation rules yourself. A more advanced application is given by DifferentiableFrankWolfe.jl.

Modeling tips

Writing conditions

We recommend that the conditions themselves do not involve calls to autodiff, even when they describe a gradient. Otherwise, you will need to make sure that nested autodiff works well in your case. For instance, if you're differentiating your implicit function (and hence your conditions) in reverse mode with Zygote.jl, you may want to use ForwardDiff.jl to compute gradients inside the conditions.
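
If the conditions do describe a gradient and you accept nested autodiff, here is a sketch of the arrangement suggested above, with the inner gradient in forward mode (the objective f is illustrative):

using ForwardDiff

f(x, y) = sum(abs2, y .- x)   # illustrative smooth objective
# First-order optimality: the gradient of f with respect to y vanishes
conditions_optim(x, y) = ForwardDiff.gradient(_y -> f(x, _y), y)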

Dealing with constraints

To express constrained optimization problems as implicit functions, you might need differentiable projections or proximal operators to write the optimality conditions. See Efficient and modular implicit differentiation for precise formulations.

In case these operators are too complicated to code yourself, here are a few places you can look:

An alternative is differentiating through the KKT conditions, which is exactly what DiffOpt.jl does for JuMP models.

Home · ImplicitDifferentiation.jl

ImplicitDifferentiation.jl


ImplicitDifferentiation.jl is a package for automatic differentiation of functions defined implicitly, i.e., forward mappings

\[x \in \mathbb{R}^n \longmapsto y(x) \in \mathbb{R}^m\]

whose output is defined by conditions

\[c(x,y(x)) = 0 \in \mathbb{R}^m\]

Background

Implicit differentiation is useful to differentiate through two types of functions:

  • Those for which automatic differentiation fails. Reasons can vary depending on your backend, but the most common include calls to external solvers, mutating operations or type restrictions.
  • Those for which automatic differentiation is very slow. A common example is iterative procedures like fixed point equations or optimization algorithms.

If you just need a quick overview, check out our JuliaCon 2022 talk. If you want a deeper dive into the theory, you can refer to the paper Efficient and modular implicit differentiation by Blondel et al. (2022).

Getting started

To install the stable version, open a Julia REPL and run:

julia> using Pkg; Pkg.add("ImplicitDifferentiation")

For the latest version, run this instead:

julia> using Pkg; Pkg.add(url="https://github.com/gdalle/ImplicitDifferentiation.jl")

Please read the documentation, especially the examples and FAQ.

In Julia:

In Python:

  • google/jaxopt: hardware accelerated, batchable and differentiable optimizers in JAX