autograd is a pure Go automatic differentiation library inspired by deep-learning-from-scratch-3. It lets you define mathematical functions over tensors and compute their gradients automatically, enabling gradient descent, neural network training, and higher-order differentiation entirely in Go.

Documentation Index
Fetch the complete documentation index at: https://mintlify.com/itsubaki/autograd/llms.txt
Use this file to discover all available pages before exploring further.
Quickstart
Compute your first gradient in under five minutes.
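To show what "your first gradient" means in practice, here is a minimal hand-rolled sketch of reverse-mode differentiation in pure Go. It illustrates the idea only; the `Var` type and `Mul` function below are illustrative and are not the autograd library's actual API. It computes dy/dx of y = x·x at x = 3, which is 2x = 6.

```go
package main

import "fmt"

// Var is a toy scalar variable that records how to propagate
// a gradient back to its inputs. This is a conceptual sketch,
// not the autograd library's actual API.
type Var struct {
	Data, Grad float64
	backward   func(grad float64)
}

// Mul multiplies two variables and records the local derivatives:
// d(a*b)/da = b and d(a*b)/db = a.
func Mul(a, b *Var) *Var {
	out := &Var{Data: a.Data * b.Data}
	out.backward = func(grad float64) {
		a.Grad += grad * b.Data
		b.Grad += grad * a.Data
	}
	return out
}

func main() {
	x := &Var{Data: 3.0}
	y := Mul(x, x) // y = x^2
	y.backward(1)  // seed dy/dy = 1 and propagate
	fmt.Println(y.Data, x.Grad) // 9 6
}
```

Because x appears as both inputs of Mul, its gradient accumulates twice (3 + 3 = 6), which is exactly the product rule at work.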
Core Concepts
Understand variables, the computation graph, and backpropagation.
Guides
Step-by-step walkthroughs for gradient descent, neural networks, and more.
API Reference
Full reference for every package: variable, function, layer, model, optimizer.
Why autograd?
Pure Go
Zero external dependencies: only the Go standard library. Drop it into any Go project with a single go get.

Reverse-mode AD
Efficient backpropagation through arbitrarily deep computation graphs, including higher-order gradients.
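The mechanics of backpropagation through a deeper graph can be sketched in a few lines of pure Go. This is a conceptual illustration under assumed names (`Var`, `Add`, `Mul`, `Backward`), not the library's actual API: each operation records a closure that pushes the upstream gradient to its inputs, and gradients accumulate at shared leaves.

```go
package main

import "fmt"

// Var is a toy scalar with a recorded backward rule — a conceptual
// sketch of reverse-mode AD, not the autograd library's actual API.
type Var struct {
	Data, Grad float64
	backward   func(grad float64)
}

// Backward accumulates the incoming gradient and propagates it
// further down the recorded graph.
func (v *Var) Backward(grad float64) {
	v.Grad += grad
	if v.backward != nil {
		v.backward(grad)
	}
}

func Mul(a, b *Var) *Var {
	out := &Var{Data: a.Data * b.Data}
	out.backward = func(g float64) {
		a.Backward(g * b.Data) // d(ab)/da = b
		b.Backward(g * a.Data) // d(ab)/db = a
	}
	return out
}

func Add(a, b *Var) *Var {
	out := &Var{Data: a.Data + b.Data}
	out.backward = func(g float64) {
		a.Backward(g) // d(a+b)/da = 1
		b.Backward(g) // d(a+b)/db = 1
	}
	return out
}

func main() {
	x := &Var{Data: 2.0}
	z := Add(Mul(x, x), x) // z = x^2 + x
	z.Backward(1)
	fmt.Println(z.Data, x.Grad) // 6 5  (dz/dx = 2x + 1 = 5 at x = 2)
}
```

Because x feeds into the graph at three places (twice in Mul, once in Add), its gradient accumulates to 2 + 2 + 1 = 5, matching dz/dx = 2x + 1 at x = 2.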
Deep learning primitives
Built-in Linear, RNN, and LSTM layers, plus MLP and LSTM model types ready to train.
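At its core, a fully connected (Linear) layer computes y = xW + b. The sketch below shows that forward pass for a single input vector in plain Go; the `Linear` function here is a hypothetical stand-in for illustration, not the library's actual Linear layer type.

```go
package main

import "fmt"

// Linear computes y = xW + b for one input vector — a minimal
// sketch of a fully connected layer's forward pass, not the
// autograd library's actual Linear type.
func Linear(x []float64, w [][]float64, b []float64) []float64 {
	out := make([]float64, len(b))
	for j := range out {
		sum := b[j]
		for i, xi := range x {
			sum += xi * w[i][j] // row i of W maps input i to each output j
		}
		out[j] = sum
	}
	return out
}

func main() {
	x := []float64{1, 2}
	w := [][]float64{{1, 0}, {0, 1}} // identity weights
	b := []float64{0.5, 0.5}
	fmt.Println(Linear(x, w, b)) // [1.5 2.5]
}
```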
Multiple optimizers
SGD, Momentum, Adam, and AdamW — with hooks for weight decay and gradient clipping.
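The simplest of these, SGD, updates each parameter as w ← w − lr·∇w. A minimal sketch of that update rule in plain Go (again an illustration of the rule itself, not the library's optimizer API) minimizing f(w) = w²:

```go
package main

import "fmt"

// SGD applies the plain stochastic gradient descent update
// w <- w - lr * grad, in place. A conceptual sketch, not the
// autograd library's optimizer API.
func SGD(w, grad []float64, lr float64) {
	for i := range w {
		w[i] -= lr * grad[i]
	}
}

func main() {
	// Minimize f(w) = w^2 starting from w = 1.0; grad f = 2w,
	// so each step multiplies w by (1 - 2*lr) = 0.8.
	w := []float64{1.0}
	for i := 0; i < 100; i++ {
		grad := []float64{2 * w[0]}
		SGD(w, grad, 0.1)
	}
	fmt.Printf("%.6f\n", w[0]) // 0.000000 (converged near the minimum)
}
```

Momentum, Adam, and AdamW refine this same loop by keeping running statistics of past gradients before applying the step.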
Get started
Train a model
Use built-in layers, models, and optimizers to train neural networks on your data. See the Deep Learning guide for a complete walkthrough.