The numerical package computes approximate derivatives using the central finite difference method. Its primary use is gradient checking: comparing numerically approximated gradients against the analytical gradients produced by autograd's backward pass to verify that a custom function or layer is correctly implemented.
Func type
Func is the function signature accepted by Diff. Any function that maps one or more variables to a single output variable satisfies this type.
All standard library functions (F.Sin, F.Mul, F.MatMul, F.MeanSquaredError, etc.) satisfy Func.
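Judging from that description, the type presumably has a variadic-input, single-output shape. The following self-contained sketch shows the idea; `Variable` here is a stand-in struct invented for illustration, not the library's actual variable type:

```go
package main

import "fmt"

// Variable is a stand-in for the library's variable type (assumption for illustration).
type Variable struct{ Data []float64 }

// Func mirrors the assumed shape of the signature Diff accepts:
// one or more input variables mapped to a single output variable.
type Func func(x ...*Variable) *Variable

// mul satisfies Func: elementwise product of two input variables.
func mul(x ...*Variable) *Variable {
	out := make([]float64, len(x[0].Data))
	for i := range out {
		out[i] = x[0].Data[i] * x[1].Data[i]
	}
	return &Variable{Data: out}
}

func main() {
	var f Func = mul
	fmt.Println(f(&Variable{Data: []float64{2, 3}}, &Variable{Data: []float64{4, 5}}).Data) // [8 15]
}
```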
Diff
Diff numerically differentiates f with respect to each element of x using the central difference formula:

f'(x) ≈ (f(x + h) − f(x − h)) / (2h)
- f: The function to differentiate.
- x: The point at which to evaluate the derivative. Each variable in the slice is shifted by h independently.
- h: Step size for the finite difference approximation. Defaults to 1e-4 if omitted. Smaller values reduce truncation error but increase floating-point rounding error.
- Returns: A variable whose data holds the numerically computed derivative values. Gradients are not tracked on the returned variable.
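Taken together, the parameters describe a computation that can be sketched without the library at all. This is a minimal stand-in using plain float64 slices; the actual Diff implementation may differ in detail:

```go
package main

import "fmt"

// centralDiff approximates the partial derivatives of f at x by shifting
// each element by h independently: (f(x+h) - f(x-h)) / (2h).
func centralDiff(f func([]float64) float64, x []float64, h float64) []float64 {
	grad := make([]float64, len(x))
	for i := range x {
		orig := x[i]
		x[i] = orig + h
		fp := f(x) // f with element i shifted up
		x[i] = orig - h
		fm := f(x) // f with element i shifted down
		x[i] = orig
		grad[i] = (fp - fm) / (2 * h)
	}
	return grad
}

func main() {
	// f(x) = x0^2 + x1^2, so the exact partials are 2*x0 and 2*x1.
	sumsq := func(x []float64) float64 { return x[0]*x[0] + x[1]*x[1] }
	fmt.Println(centralDiff(sumsq, []float64{1, 2}, 1e-4)) // approximately [2 4]
}
```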
Diff evaluates f twice (at x + h and x - h) and does not call Backward. It produces a raw numerical approximation, not a computation graph node.

Use case: gradient checking
Gradient checking compares the numerical derivative from Diff against the analytical gradient from Backward. A large discrepancy indicates a bug in the backward function of a custom operation.
Typical workflow:
- Compute analytical gradients via y.Backward().
- Compute numerical gradients via numerical.Diff(f, x).
- Compare element-wise; values should agree to roughly 1e-4 or better.
Examples
Numerical gradient of sin
Gradient check for a custom function
Using a custom step size
See also
- function package — built-in differentiable functions
- Concepts: autograd — how analytical gradients are computed
- Guides: higher-order gradients — advanced gradient topics