Variable wraps a *tensor.Tensor[float64] and adds the metadata needed for automatic differentiation: a gradient field, a pointer to the function that created it, and a generation counter used to order the backward pass.
## The Variable struct
| Field | Type | Description |
|---|---|---|
| Name | string | Optional label used in the String() output. |
| Data | *tensor.Tensor[float64] | The underlying numerical data. |
| Grad | *Variable | Gradient with respect to some scalar loss. Set by Backward(). |
| Creator | *Function | The function that produced this variable. nil for leaf variables. |
| Generation | int | Depth in the computation graph. Used to order backward traversal. |
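In code, the struct described by this table looks roughly like the following. This is a sketch reconstructed from the fields above, not the verbatim source:

```go
// Sketch of the Variable struct, reconstructed from the field table.
type Variable struct {
	Name       string                  // optional label, shown by String()
	Data       *tensor.Tensor[float64] // underlying numerical data
	Grad       *Variable               // gradient, populated by Backward()
	Creator    *Function               // producing function; nil for leaves
	Generation int                     // graph depth for backward ordering
}
```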
## Creating variables
### From literal values
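A minimal sketch, assuming the package is imported from github.com/itsubaki/autograd/variable and that variable.New accepts variadic float64 values; verify the exact signature in the API reference:

```go
package main

import (
	"fmt"

	"github.com/itsubaki/autograd/variable"
)

func main() {
	// A leaf variable built directly from literal values.
	x := variable.New(1, 2, 3)
	fmt.Println(x)
}
```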
### From an existing tensor
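Because Data is an exported field, an existing tensor can be wrapped by constructing the struct directly. The tensor import path and the tensor.New constructor below are assumptions about the tensor package's API; check the API reference for the exact form:

```go
package main

import (
	"fmt"

	"github.com/itsubaki/autograd/tensor"
	"github.com/itsubaki/autograd/variable"
)

func main() {
	// Hypothetical constructor; the exact signature may differ.
	t := tensor.New([]int{2, 2}, []float64{1, 2, 3, 4})

	// Wrap the tensor in a leaf variable via the exported Data field.
	x := &variable.Variable{Data: t}
	fmt.Println(x)
}
```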
### Filled variables
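A sketch of the filled constructors, assuming names along the lines of Zero, ZeroLike, and OneLike; the names and shape arguments are assumptions to verify against the API reference:

```go
package main

import (
	"fmt"

	"github.com/itsubaki/autograd/variable"
)

func main() {
	x := variable.Zero(2, 3)  // 2x3 variable filled with zeros
	y := variable.OneLike(x)  // ones, same shape as x
	z := variable.ZeroLike(x) // zeros, same shape as x

	fmt.Println(x, y, z)
}
```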
### Random variables
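A sketch using Rand and Randn; the (rows, cols) shape arguments are an assumption, and the seeded source follows the note below:

```go
package main

import (
	"fmt"
	randv2 "math/rand/v2"

	"github.com/itsubaki/autograd/variable"
)

func main() {
	// Uniform values in [0, 1).
	x := variable.Rand(2, 3)

	// Standard normal values, seeded for reproducible results.
	s := randv2.NewPCG(1, 1)
	y := variable.Randn(2, 3, s)

	fmt.Println(x, y)
}
```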
Rand and Randn accept an optional randv2.Source as the second argument for reproducible results.

## Inspecting shape and data
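A sketch of reading the raw data back out; the Shape accessor on the tensor is an assumption, since only the Data field itself is documented above:

```go
package main

import (
	"fmt"

	"github.com/itsubaki/autograd/variable"
)

func main() {
	x := variable.New(1, 2, 3)

	// The raw tensor is reachable through the exported Data field.
	// Shape is an assumed accessor; the exact name may differ.
	fmt.Println(x.Data.Shape)
	fmt.Println(x.Data)
}
```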
## Reshaping
Reshape modifies the variable in place and returns it, so you can chain it with construction calls.
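A sketch of chaining Reshape with construction; the (rows, cols) argument form is an assumption:

```go
package main

import (
	"fmt"

	"github.com/itsubaki/autograd/variable"
)

func main() {
	// Reshape mutates x and returns it, so it chains with New.
	x := variable.New(1, 2, 3, 4, 5, 6).Reshape(2, 3)
	fmt.Println(x)
}
```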
## Naming variables
Setting Name changes how a variable is printed, which is useful for debugging.
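Name is an exported field, so it can be set directly; a minimal sketch:

```go
package main

import (
	"fmt"

	"github.com/itsubaki/autograd/variable"
)

func main() {
	x := variable.New(1, 2, 3)
	x.Name = "x"

	// String() includes the name, so the label shows up when printing.
	fmt.Println(x)
}
```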
## The Grad field
Grad starts as nil. After calling Backward() on a downstream variable, autograd accumulates the gradient into Grad.
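A minimal end-to-end sketch, assuming a Mul op in the function package (imported here as F); verify the exact name in the functions reference:

```go
package main

import (
	"fmt"

	F "github.com/itsubaki/autograd/function"
	"github.com/itsubaki/autograd/variable"
)

func main() {
	x := variable.New(3)
	y := F.Mul(x, x) // y = x^2

	fmt.Println(x.Grad) // <nil> before the backward pass

	y.Backward()
	fmt.Println(x.Grad) // dy/dx = 2x = 6
}
```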
### Clearing gradients
Gradients accumulate across multiple Backward() calls. Call Cleargrad() before the next forward pass when reusing a variable.
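A sketch showing why Cleargrad matters between passes (F.Mul, as above, is an assumption):

```go
package main

import (
	"fmt"

	F "github.com/itsubaki/autograd/function"
	"github.com/itsubaki/autograd/variable"
)

func main() {
	x := variable.New(3)

	y := F.Mul(x, x)
	y.Backward()
	fmt.Println(x.Grad) // 6

	// Without this, the next Backward would accumulate into the
	// stale gradient, giving 12 instead of 6.
	x.Cleargrad()

	y = F.Mul(x, x)
	y.Backward()
	fmt.Println(x.Grad) // 6 again
}
```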
## Detaching from the graph
### Unchain
Unchain() removes the direct link from a single variable to its creator function, stopping backward propagation at that variable.
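A sketch, assuming Sin and Mul ops in the function package:

```go
package main

import (
	"fmt"

	F "github.com/itsubaki/autograd/function"
	"github.com/itsubaki/autograd/variable"
)

func main() {
	x := variable.New(3)
	t := F.Mul(x, x)
	y := F.Sin(t)

	// Drop t's link to its creator. Backward from y now stops at t.
	t.Unchain()

	y.Backward()
	fmt.Println(t.Grad) // cos(x^2): the gradient still reaches t
	fmt.Println(x.Grad) // <nil>: propagation stopped at t
}
```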
### UnchainBackward
UnchainBackward() walks the entire graph backwards from a variable and removes all creator links, detaching the complete subgraph.
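A sketch of the usual pattern, where the graph is detached after the backward pass (for example in truncated backpropagation through time); F.Sin and F.Mul are assumptions as above:

```go
package main

import (
	"fmt"

	F "github.com/itsubaki/autograd/function"
	"github.com/itsubaki/autograd/variable"
)

func main() {
	x := variable.New(3)
	y := F.Sin(F.Mul(x, x))

	y.Backward()
	fmt.Println(x.Grad) // gradient flows through the full graph

	// Remove every creator link reachable backwards from y,
	// detaching the whole subgraph that produced it.
	y.UnchainBackward()
}
```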
## Next steps

- **Functions**: Learn how functions build the computation graph.
- **Automatic differentiation**: Understand how Backward() traverses the graph.
- **Variable API reference**: Full reference for the variable package.
- **Gradient descent guide**: Put variables and gradients to work in an optimizer loop.