Module multidual

Multi-component dual numbers for multivariable automatic differentiation.

A multi-component dual number tracks a value and multiple partial derivatives simultaneously, enabling computation of gradients in a single forward pass.

§Mathematical Background

For a function f: ℝⁿ → ℝ, MultiDual<T, N> represents the value f(x) together with its gradient ∇f = [∂f/∂x₁, ∂f/∂x₂, …, ∂f/∂xₙ], carried through every operation simultaneously.
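One plausible representation is a value paired with an array of N partials. The struct layout and the constant/variable constructor names below are assumptions for illustration, not necessarily this module's exact API, and the sketch is f64-only where the real type is generic over T:

```rust
/// Sketch: a primal value together with its N partial derivatives.
#[derive(Clone, Copy, Debug)]
struct MultiDual<const N: usize> {
    value: f64,      // f(x)
    grad: [f64; N],  // [∂f/∂x₁, …, ∂f/∂xₙ]
}

impl<const N: usize> MultiDual<N> {
    /// A constant carries a zero gradient.
    fn constant(value: f64) -> Self {
        Self { value, grad: [0.0; N] }
    }

    /// The i-th input variable, seeded with a one-hot derivative:
    /// ∂xᵢ/∂xᵢ = 1, all other partials 0.
    fn variable(value: f64, i: usize) -> Self {
        let mut grad = [0.0; N];
        grad[i] = 1.0;
        Self { value, grad }
    }
}

fn main() {
    // x is the first of two variables: value 3.0, gradient seed [1, 0].
    let x = MultiDual::<2>::variable(3.0, 0);
    assert_eq!(x.value, 3.0);
    assert_eq!(x.grad, [1.0, 0.0]);
    assert_eq!(MultiDual::<2>::constant(5.0).grad, [0.0, 0.0]);
    println!("ok");
}
```

The one-hot seeding is what makes the i-th slot of the gradient array track ∂f/∂xᵢ as operations propagate derivatives forward.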

Arithmetic operations extend naturally from single-variable dual numbers:

  • (a + ∇a) + (b + ∇b) = (a+b) + (∇a+∇b)
  • -(a + ∇a) = -a + (-∇a)
  • (a + ∇a) - (b + ∇b) = (a-b) + (∇a-∇b)
  • (a + ∇a) * (b + ∇b) = ab + (b∇a + a∇b)
  • 1/(b + ∇b) = (1/b) + (-∇b/b²)

Each operation updates all N derivative components at once, computing the full gradient in a single pass through the computation.
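The rules above can be sketched as Rust operator impls. This is a minimal f64-only sketch with an assumed field layout (a value plus a grad array); the module's actual generic implementation may differ:

```rust
use std::ops::{Add, Div, Mul, Neg, Sub};

#[derive(Clone, Copy, Debug)]
struct MultiDual<const N: usize> {
    value: f64,
    grad: [f64; N],
}

impl<const N: usize> Add for MultiDual<N> {
    type Output = Self;
    // (a + ∇a) + (b + ∇b) = (a+b) + (∇a+∇b)
    fn add(mut self, rhs: Self) -> Self {
        self.value += rhs.value;
        for i in 0..N {
            self.grad[i] += rhs.grad[i];
        }
        self
    }
}

impl<const N: usize> Neg for MultiDual<N> {
    type Output = Self;
    // -(a + ∇a) = -a + (-∇a)
    fn neg(mut self) -> Self {
        self.value = -self.value;
        for g in &mut self.grad {
            *g = -*g;
        }
        self
    }
}

impl<const N: usize> Sub for MultiDual<N> {
    type Output = Self;
    // (a + ∇a) - (b + ∇b) = (a-b) + (∇a-∇b)
    fn sub(self, rhs: Self) -> Self {
        self + (-rhs)
    }
}

impl<const N: usize> Mul for MultiDual<N> {
    type Output = Self;
    // (a + ∇a) * (b + ∇b) = ab + (b∇a + a∇b)   [product rule]
    fn mul(self, rhs: Self) -> Self {
        let mut grad = [0.0; N];
        for i in 0..N {
            grad[i] = rhs.value * self.grad[i] + self.value * rhs.grad[i];
        }
        Self { value: self.value * rhs.value, grad }
    }
}

impl<const N: usize> Div for MultiDual<N> {
    type Output = Self;
    // a/b = a * (1/b), with 1/(b + ∇b) = (1/b) + (-∇b/b²)
    fn div(self, rhs: Self) -> Self {
        let mut recip = rhs;
        recip.value = 1.0 / rhs.value;
        for i in 0..N {
            recip.grad[i] = -rhs.grad[i] / (rhs.value * rhs.value);
        }
        self * recip
    }
}

fn main() {
    // Two seeded variables: x = 3 in the ∂/∂x slot, y = 4 in the ∂/∂y slot.
    let x = MultiDual::<2> { value: 3.0, grad: [1.0, 0.0] };
    let y = MultiDual::<2> { value: 4.0, grad: [0.0, 1.0] };

    let p = x * y;
    assert_eq!(p.value, 12.0);
    assert_eq!(p.grad, [4.0, 3.0]); // ∇(xy) = [y, x]

    let q = x / y;
    assert_eq!(q.value, 0.75);
    assert_eq!(q.grad, [0.25, -0.1875]); // ∇(x/y) = [1/y, -x/y²]
    println!("ok");
}
```

Note how `mul` writes all N components of the product's gradient in one loop: that per-operation loop is the "updates all N derivative components at once" behavior described above.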

§Example

use autodiff::{MultiDual, gradient};

// Compute ∇f for f(x, y) = x² + 2xy + y² at (3, 4)
let f = |vars: [MultiDual<f64, 2>; 2]| {
    let [x, y] = vars;
    let two = MultiDual::constant(2.0);
    x * x + two * x * y + y * y
};

let point = [3.0, 4.0];
let (value, grad) = gradient(f, point);

assert_eq!(value, 49.0);   // f(3, 4) = 9 + 24 + 16 = 49
assert_eq!(grad[0], 14.0); // ∂f/∂x = 2x + 2y = 14
assert_eq!(grad[1], 14.0); // ∂f/∂y = 2x + 2y = 14
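A gradient function along these lines could seed each input with a one-hot derivative and evaluate f once. Everything below (the f64-only MultiDual, its Add/Mul impls, and the gradient signature) is a hedged sketch rather than this module's actual code:

```rust
#[derive(Clone, Copy)]
struct MultiDual<const N: usize> {
    value: f64,
    grad: [f64; N],
}

impl<const N: usize> MultiDual<N> {
    fn constant(value: f64) -> Self {
        Self { value, grad: [0.0; N] }
    }
}

impl<const N: usize> std::ops::Add for MultiDual<N> {
    type Output = Self;
    fn add(mut self, rhs: Self) -> Self {
        self.value += rhs.value;
        for i in 0..N {
            self.grad[i] += rhs.grad[i];
        }
        self
    }
}

impl<const N: usize> std::ops::Mul for MultiDual<N> {
    type Output = Self;
    fn mul(self, rhs: Self) -> Self {
        let mut grad = [0.0; N];
        for i in 0..N {
            grad[i] = rhs.value * self.grad[i] + self.value * rhs.grad[i];
        }
        Self { value: self.value * rhs.value, grad }
    }
}

/// Seed each input with a one-hot derivative, then evaluate f once.
fn gradient<const N: usize, F>(f: F, point: [f64; N]) -> (f64, [f64; N])
where
    F: Fn([MultiDual<N>; N]) -> MultiDual<N>,
{
    let mut vars = [MultiDual::constant(0.0); N];
    for i in 0..N {
        vars[i].value = point[i];
        vars[i].grad[i] = 1.0; // ∂xᵢ/∂xᵢ = 1
    }
    let out = f(vars); // single forward pass
    (out.value, out.grad)
}

fn main() {
    // Same function as the example above: f(x, y) = x² + 2xy + y² at (3, 4).
    let f = |vars: [MultiDual<2>; 2]| {
        let [x, y] = vars;
        let two = MultiDual::constant(2.0);
        x * x + two * x * y + y * y
    };
    let (value, grad) = gradient(f, [3.0, 4.0]);
    assert_eq!(value, 49.0);
    assert_eq!(grad, [14.0, 14.0]);
    println!("ok");
}
```

The key design point is that the i-th seed makes the i-th gradient slot track ∂f/∂xᵢ; all N slots are then filled by the one call to f.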

§Efficiency

Computing the gradient takes a single forward pass with MultiDual<T, N>, compared to n separate passes when each partial derivative is computed with a one-variable Dual<T>. The function body, and with it the shared primal (value) computation, runs once instead of n times; for n = 10 inputs, that is one evaluation of f in place of ten.

§Use Cases

  • Gradient-based optimization (gradient descent, Newton’s method)
  • Neural network backpropagation alternatives
  • Sensitivity analysis with multiple parameters
  • Scientific computing with multivariable functions
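For the first use case, a basic gradient-descent loop looks like the sketch below. The hand-derived gradient closure stands in for a gradient computed with MultiDual; the test function f(x, y) = (x − 1)² + (y + 2)², the step size, and the step count are illustrative choices:

```rust
fn main() {
    // f(x, y) = (x - 1)² + (y + 2)², minimized at (1, -2).
    // In practice this closure would be replaced by a forward-mode
    // AD gradient; it is written out by hand to keep the sketch short.
    let grad = |p: [f64; 2]| [2.0 * (p[0] - 1.0), 2.0 * (p[1] + 2.0)];

    let mut p = [0.0, 0.0];
    let rate = 0.1;
    for _ in 0..200 {
        // Step downhill: p ← p - rate · ∇f(p)
        let g = grad(p);
        p[0] -= rate * g[0];
        p[1] -= rate * g[1];
    }

    assert!((p[0] - 1.0).abs() < 1e-6);
    assert!((p[1] + 2.0).abs() < 1e-6);
    println!("minimum near ({:.3}, {:.3})", p[0], p[1]);
}
```

Because the whole gradient is available from one forward pass, each descent step costs a single evaluation of f regardless of the number of parameters.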

Structs§

MultiDual
A multi-component dual number representing a value and N partial derivatives.

Functions§

gradient
Compute the gradient of a scalar multivariable function in a single forward pass.