
Documentation Index

Fetch the complete documentation index at: https://mintlify.com/itsubaki/autograd/llms.txt

Use this file to discover all available pages before exploring further.

autograd is a pure Go automatic differentiation library inspired by deep-learning-from-scratch-3. It lets you define mathematical functions over tensors and compute their gradients automatically — enabling gradient descent, neural network training, and higher-order differentiation entirely in Go.

Quickstart

Compute your first gradient in under five minutes.

Core Concepts

Understand variables, the computation graph, and backpropagation.

Guides

Step-by-step walkthroughs for gradient descent, neural networks, and more.

API Reference

Full reference for every package: variable, function, layer, model, optimizer.

Why autograd?

Pure Go

Zero external dependencies — only the Go standard library. Drop it into any Go project with a single go get.

Reverse-mode AD

Efficient backpropagation through arbitrarily deep computation graphs, including higher-order gradients (see the double-backprop sketch below).

Deep learning primitives

Built-in Linear, RNN, and LSTM layers, plus MLP and LSTM model types ready to train.

Multiple optimizers

SGD, Momentum, Adam, and AdamW — with hooks for weight decay and gradient clipping.
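
For a taste of the higher-order gradients mentioned above, the sketch below differentiates sin(x) twice. It is illustrative only: it assumes a DeZero-style CreateGraph option on Backward (so the backward pass itself is recorded in the graph) and a Cleargrad method to reset the first gradient; check the variable package reference for the exact names:

x := variable.New(1.0)
y := F.Sin(x)
y.Backward(variable.Opts{CreateGraph: true}) // assumed option: record the backward pass in the graph

gx := x.Grad  // dy/dx = cos(x)
x.Cleargrad() // assumed: reset the gradient before the second pass
gx.Backward()

fmt.Println(x.Grad) // d²y/dx² = -sin(x)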

Get started

1. Install the package

Add autograd to your Go module:
go get github.com/itsubaki/autograd

2. Create a variable and compute a gradient

The snippet below evaluates y = sin(x) at x = 1 and backpropagates to recover dy/dx = cos(1):

import (
    "fmt"

    F "github.com/itsubaki/autograd/function"
    "github.com/itsubaki/autograd/variable"
)

x := variable.New(1.0)
y := F.Sin(x)
y.Backward() // propagate the gradient back through the graph

fmt.Println(y)      // variable(0.8414709848078965)
fmt.Println(x.Grad) // variable(0.5403023058681398)

3. Train a model

Use built-in layers, models, and optimizers to train neural networks on your data. See the Deep Learning guide for a complete walkthrough.
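
As a minimal sketch of that workflow, the loop below fits a small MLP to a toy input/target pair with SGD and mean squared error. The constructors and field names here (model.NewMLP, optimizer.SGD{LearningRate: ...}, F.MeanSquaredError, Cleargrads, Update) follow the packages listed in the API Reference, but treat the exact signatures as assumptions and confirm them there:

import (
    F "github.com/itsubaki/autograd/function"
    "github.com/itsubaki/autograd/model"
    "github.com/itsubaki/autograd/optimizer"
    "github.com/itsubaki/autograd/variable"
)

x := variable.New(1.0) // toy input
t := variable.New(2.0) // toy target

m := model.NewMLP(10, 1)              // assumed: layer output sizes (hidden 10, output 1)
o := optimizer.SGD{LearningRate: 0.2} // assumed field name for the learning rate

for i := 0; i < 1000; i++ {
    y := m.Forward(x)
    loss := F.MeanSquaredError(y, t)

    m.Cleargrads() // reset parameter gradients before backprop
    loss.Backward()
    o.Update(m) // apply one SGD step to the model parameters
}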
