Automatic Differentiation: Inverse Accumulation Mode
We show that, under certain circumstances, it is possible to automatically compute Jacobian-inverse-vector and Jacobian-inverse-transpose-vector products about as efficiently as Jacobian-vector and Jacobian-transpose-vector products. The key insight is to notice that the Jacobian corresponding to the use of one basis function is of a form whose sparsity is invariant to inversion. The main restriction of the method is a constraint on the number of active variables, which suggests a variety of techniques and generalizations to allow the constraint to be enforced or relaxed. This technique has the potential to allow the efficient direct calculation of Newton steps as well as other numeric calculations of interest.
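The sparsity observation at the heart of the abstract can be illustrated numerically. The sketch below is ours, not the paper's code: the Jacobian of a single elementary step that overwrites one variable, v_k ← φ(v), is the identity matrix with row k replaced by the gradient of φ. Inverting such a matrix changes only row k again (provided ∂φ/∂v_k ≠ 0), so the sparsity pattern is invariant to inversion.

```python
import numpy as np

# Jacobian of one elementary step v_k <- phi(v): identity with row k
# replaced by the gradient of phi at the linearization point.
n, k = 5, 2
J = np.eye(n)
J[k, :] = [0.0, 3.0, 2.0, 0.0, -1.0]   # gradient of phi; J[k, k] = 2 != 0

J_inv = np.linalg.inv(J)

# Entries that differ from the identity: only row k, in both J and J_inv.
pattern = lambda A: np.abs(A - np.eye(n)) > 1e-12
assert (pattern(J_inv) == pattern(J)).all()
```

Because only one row differs from the identity, applying J⁻¹ to a vector costs the same order of work as applying J itself, which is why inverse products come at essentially forward-mode cost.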
💡 Research Summary
The paper introduces a novel variant of automatic differentiation (AD) called Inverse Accumulation Mode, which enables the efficient computation of Jacobian‑inverse‑vector (J⁻¹·v) and Jacobian‑inverse‑transpose‑vector (J⁻ᵀ·v) products with essentially the same cost as the standard forward (J·v) and reverse (Jᵀ·v) modes. The authors begin by reviewing the classic forward and reverse AD formulations, which compute tangent and cotangent propagations through a sequence of elementary operations represented as a data‑flow graph. They then pose the problem of solving for the starred vectors in the equations J·v* = u and Jᵀ·v* = u, whose solutions v* = J⁻¹·u and v* = J⁻ᵀ·u are precisely the inverse products above.
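The propagation just described can be sketched concretely. In this illustration (our names and structure, not the paper's), a program is a list of elementary steps, each overwriting one state component, so the full Jacobian factors as J = J_T ⋯ J₁. Then J⁻¹·u = J₁⁻¹ ⋯ J_T⁻¹·u, and each elementary inverse is applied in constant extra work because each J_t differs from the identity in a single row.

```python
import numpy as np

# Each step is (k, g): the step overwrites v_k with phi(v), where g is the
# gradient of phi at the linearization point and g[k] != 0 (invertibility).

def jvp(steps, v):
    """Forward accumulation: v <- J_t v for each step in order."""
    v = v.copy()
    for k, g in steps:
        v[k] = g @ v
    return v

def inv_jvp(steps, u):
    """Inverse accumulation: u <- J_t^{-1} u for each step in reverse order.
    Only component k changes, mirroring the one-row sparsity of J_t."""
    u = u.copy()
    for k, g in reversed(steps):
        u[k] = (u[k] - g @ u + g[k] * u[k]) / g[k]
    return u

# Round trip: inverse accumulation undoes forward accumulation.
steps = [(0, np.array([2.0, 1.0, 0.0])), (2, np.array([1.0, 0.0, 3.0]))]
v = np.array([1.0, -2.0, 0.5])
assert np.allclose(inv_jvp(steps, jvp(steps, v)), v)
```

The transposed variant (J⁻ᵀ·v) follows the same pattern with the per-step updates transposed; the sketch assumes the single-assignment, one-active-output structure that the abstract's constraint on active variables refers to.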