ad-1.3.0.1

Automatic Differentiation

Forward-, reverse-, and mixed-mode automatic differentiation combinators with a common API.

Type-level "branding" is used both to prevent the end user from confusing infinitesimals from different computations and to limit unsafe access to the implementation details of each Mode.
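
For illustration, each combinator takes a rank-2 polymorphic function, so every invocation gets its own brand. The sketch below assumes the 1.x-style API (exact signatures and the name of the lifting operation vary between releases):

    import Numeric.AD

    -- diff has a rank-2 type along the lines of (1.x style)
    --   diff :: Num a => (forall s. Mode s => AD s a -> AD s a) -> a -> a
    -- so each call quantifies a fresh brand 's'.
    ok :: Double
    ok = diff sin 0                                     -- cos 0 = 1.0

    -- The classic "perturbation confusion" term is rejected by the type checker,
    -- because the inner x would need the outer brand:
    --
    --   bad = diff (\x -> x * diff (\y -> x + y) 1) 1  -- does not type-check
    --
    -- To nest derivatives, the outer variable must be lifted into the inner
    -- computation explicitly (via the Mode class's lifting operation).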

Each mode has a separate module full of combinators.

  • Numeric.AD.Mode.Forward provides basic forward-mode AD. It is good for computing simple derivatives.

  • Numeric.AD.Mode.Reverse uses benign side-effects to compute reverse-mode AD. It is good for computing gradients in one pass.

  • Numeric.AD.Mode.Sparse computes a sparse forward-mode AD tower. It is good for higher derivatives or large numbers of outputs.

  • Numeric.AD.Mode.Tower computes a dense forward-mode AD tower. It is good for higher derivatives of single-input functions.

  • Numeric.AD.Mode.Mixed computes using whichever mode, or combination of modes, is best suited to each individual combinator. This mode is the default and is re-exported by Numeric.AD; a brief usage sketch follows this list.
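
As a minimal sketch of the per-module combinators (the module names are as listed above; exact type signatures vary by release):

    import qualified Numeric.AD.Mode.Forward as Forward
    import qualified Numeric.AD.Mode.Reverse as Reverse

    -- Forward mode: cheap for a single derivative of a one-input function.
    slope :: Double
    slope = Forward.diff (\x -> x ^ 2 + 3 * x) 2              -- 2*2 + 3 = 7.0

    -- Reverse mode: one pass yields the whole gradient of a many-input function.
    gradient :: [Double]
    gradient = Reverse.grad (\[x, y] -> x * y + x) [1, 2]     -- [y + 1, x] = [3.0, 1.0]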

While not every mode can provide all operations, the following basic operations are supported, modified as appropriate by the suffixes below (a short example follows the list):

  • grad computes the gradient (partial derivatives) of a function at a point.

  • jacobian computes the Jacobian matrix of a function at a point.

  • diff computes the derivative of a function at a point.

  • du computes a directional derivative of a function at a point.

  • hessian computes the Hessian matrix (matrix of second partial derivatives) of a function at a point.
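
For example, using the default Mixed mode re-exported by Numeric.AD (a sketch; the commented results follow from the derivatives of the small functions shown):

    import Numeric.AD

    -- derivative of a one-input function at a point
    d :: Double
    d = diff (\x -> x * x) 3                             -- 6.0

    -- partial derivatives of a many-input function
    g :: [Double]
    g = grad (\[x, y] -> x * y + x) [1, 2]               -- [3.0, 1.0]

    -- one row of partial derivatives per output
    j :: [[Double]]
    j = jacobian (\[x, y] -> [x * y, x + y]) [2, 3]      -- [[3.0, 2.0], [1.0, 1.0]]

    -- matrix of second partial derivatives
    h :: [[Double]]
    h = hessian (\[x, y] -> x * x * y) [1, 2]            -- [[4.0, 2.0], [2.0, 0.0]]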

The following suffixes alter the meaning of the base functions above (a combined example follows the list):

  • ' -- also return the answer

  • With lets the user supply a function to blend the input with the output

  • F is a version of the base function lifted to return a Traversable (or Functor) result

  • s means the function returns all higher derivatives in a list or f-branching Stream

  • T means the result is transposed with respect to the traditional formulation.

  • 0 means that the resulting derivative list is padded with 0s at the end.
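
A few of these suffixes in combination (a sketch; grad', gradWith, and diffs are the decorated forms of the base functions above, with exact signatures varying by release):

    import Numeric.AD

    -- ' : the primal answer comes back alongside the gradient
    valueAndGrad :: (Double, [Double])
    valueAndGrad = grad' (\[x, y] -> x * y + x) [1, 2]    -- (3.0, [3.0, 1.0])

    -- With : blend each input with its partial derivative, here by pairing them
    paired :: [(Double, Double)]
    paired = gradWith (,) (\[x, y] -> x * y + x) [1, 2]   -- [(1.0, 3.0), (2.0, 1.0)]

    -- s : all higher derivatives f, f', f'', ... as a list
    tower :: [Double]
    tower = take 4 (diffs (\x -> x * x * x) 2)            -- [8.0, 12.0, 12.0, 6.0]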

Changes since 1.3

  • Dependency bump to be compatible with GHC 7.4.1

Changes since 1.2

  • Compiles with Template Haskell 2.6; changed the interface to accommodate Eq and Show no longer being superclasses of Num in the new GHC.

Changes since 1.1.3

  • Made primal calculations strict where possible.

Changes since 1.1.0

  • Introduced a much faster topological sort into the reverse-mode AD implementation by Anthony Cowley. This fixes a space leak and a stack overflow problem on very large (>2000 variable) problem sets.

  • Made bound calculations in reverse mode more strict.

Changes since 1.0.0

  • Changed the way Show was derived to comply with changes in instance resolution in GHC >= 7.0 && <= 7.1

Changes since 0.45.0

  • Converted Stream to use the external comonad package

Changes since 0.44.5

  • Added Halley's method

Changes since 0.40.0

  • Fixed a bug in (/) for (Mode s, Fractional a) => AD s a

  • Improved documentation

  • Regularized naming conventions

  • Exposed Id, probe, and lower methods via Numeric.AD.Types

  • Removed monadic combinators

  • Retuned the Mixed-mode jacobian calculations to require only a Functor-based result.

  • Added unsafe variadic vgrad, vgrad', and vgrads combinators