A few weeks ago, we released num-dual 0.12.0. num-dual provides data types and helper functions for forward-mode automatic differentiation (AD) in Rust. Unlike reverse-mode AD (backpropagation), forward-mode AD doesn't require a computational graph and can therefore be significantly faster when the number of input variables is moderate. It's also easy to extend to higher-order derivatives.
The crate offers a simple interface for:
- First derivatives (scalar, gradients, Jacobians)
- Second derivatives (scalar, partial, Hessians, partial Hessians)
- Third derivatives (scalar)
However, the underlying data structures are fully recursive, so you can calculate derivatives up to any order.
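A minimal scalar sketch of what that looks like (the `first_derivative` helper and its exact signature are assumed here; check the crate documentation for the precise API):

```rust
use num_dual::*;

// Write the function generically over the dual number type, so the same
// code can be evaluated with f64, Dual64, Dual2_64, Dual3_64, ...
fn f<D: DualNum<f64> + Copy>(x: D) -> D {
    x.powi(3) + x.recip()
}

fn main() {
    // Seed the variable by hand: real part 2.0, derivative (ε) part 1.
    let x = Dual64::from(2.0).derivative();
    let r = f(x);
    println!("f(2) = {}, f'(2) = {}", r.re, r.eps); // 8.5 and 11.75

    // Or let a helper function (assumed API) do the dual-number bookkeeping.
    let (value, derivative) = first_derivative(f::<Dual64>, 2.0);
    println!("{value} {derivative}");
}
```

Because `f` is generic over `DualNum<f64>`, the very same function can be evaluated with `Dual2_64` or `Dual3_64` to obtain second or third derivatives.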
Vector-valued derivatives are calculated based on data structures from nalgebra. If statically sized vectors can be used for a given problem, no allocations are required, which makes the computation extremely efficient.
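As a sketch (assuming the `gradient` helper with roughly this signature), a gradient over a statically sized nalgebra vector stays entirely on the stack:

```rust
use nalgebra::SVector;
use num_dual::*;

// Scalar function of two variables, generic over the dual number type.
fn f<D: DualNum<f64> + Copy>(x: D, y: D) -> D {
    x.powi(3) * y.powi(2)
}

fn main() {
    // Statically sized input vector → statically sized dual parts,
    // so the whole computation proceeds without heap allocations.
    let v = SVector::from([5.0, 4.0]);
    let (value, grad) = gradient(|v| f(v[0], v[1]), v);
    println!("f = {value}");  // 2000
    println!("∇f = {grad}");  // [1200, 1000]
}
```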
New in v0.12.0: Implicit automatic differentiation!
Implicit differentiation computes derivatives where y is defined implicitly by an equation f(x, y) = 0. Automatic implicit differentiation generalizes this concept to obtain the full derivative information for y (with respect to any input variables).
Note that num-dual will not actually solve the nonlinear equation f(x, y) = 0 for you. That step still requires a nonlinear equation solver or optimizer (e.g., argmin). The automatic implicit differentiation then provides the derivatives of y for a given "real" part of y (i.e., a solution that carries no derivative information).
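To make the mechanism concrete, here is a small hand-rolled sketch of the underlying idea, using only plain dual numbers rather than the new implicit-AD API: by the implicit function theorem, dy/dx = -(∂f/∂x)/(∂f/∂y), and both partial derivatives can be evaluated at the already-solved real part y0.

```rust
use num_dual::*;

// Residual whose root defines y implicitly: f(x, y) = y² - x = 0, i.e. y = √x.
fn f<D: DualNum<f64> + Copy>(x: D, y: D) -> D {
    y.powi(2) - x
}

fn main() {
    let x0 = 4.0;
    // Pretend a nonlinear solver already produced the "real" part of y.
    let y0 = 2.0; // f(4, 2) = 0

    // Partial derivatives of the residual via two dual-number evaluations.
    let f_x = f(Dual64::from(x0).derivative(), Dual64::from(y0)).eps;
    let f_y = f(Dual64::from(x0), Dual64::from(y0).derivative()).eps;

    // Implicit function theorem: dy/dx = -f_x / f_y.
    let dy_dx = -f_x / f_y;
    println!("dy/dx = {dy_dx}"); // 0.25, matching d/dx √x at x = 4
}
```

The new functionality in num-dual automates exactly this bookkeeping and generalizes it to the full derivative information (gradients, higher orders) described above.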
Of course, that makes automatic differentiation and nonlinear solving/optimization a perfect match. I demonstrate this in the ipopt-ad crate, which turns the powerful NLP (constrained optimization) solver IPOPT into a black-box optimizer: it only requires a function that, given the optimization variables, returns the values of the objective and constraints, without any repercussions for the robustness or speed of convergence of the solver.
I tried an integration with argmin, but could not overcome its extreme genericity, which seemed to interface only with ndarray data structures and not with nalgebra. Any guidance here is welcome!
Aside from that, we are interested in any feedback or contributions!