

This page covers every Feynman CLI command and flag. Research workflow commands such as `feynman deepresearch` are also documented in the Slash Commands reference, since they map directly to REPL slash commands.

## Core commands

| Command | Description |
| --- | --- |
| `feynman` | Launch the interactive REPL |
| `feynman chat [prompt]` | Start chat explicitly, optionally with an initial prompt |
| `feynman help` | Show CLI help |
| `feynman setup` | Run the guided setup wizard |
| `feynman doctor` | Diagnose config, auth, Pi runtime, and preview dependencies |
| `feynman status` | Show the current setup summary (model, auth, packages) |
Run `feynman setup` first on a new machine. Feynman also auto-launches the setup wizard when no model is configured and stdin is a TTY.

## Model management

| Command | Description |
| --- | --- |
| `feynman model list` | List available models in Pi auth storage |
| `feynman model login [id]` | Log in to a Pi OAuth model provider |
| `feynman model logout [id]` | Log out of a Pi OAuth model provider |
| `feynman model set <provider/model>` | Set the default model for all sessions |
The `model set` command writes the new default to `~/.feynman/settings.json`. The format is `provider/model-name`:

```shell
feynman model set anthropic/claude-sonnet-4-5
feynman model set openai/gpt-5
```

To see all models you have configured and their authentication status:

```shell
feynman model list
```
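As a rough illustration, the stored default might look like the following. Note that the exact schema of `settings.json` is an assumption here; only the file path and the `provider/model-name` format are documented:

```json
{
  "model": "anthropic/claude-sonnet-4-5"
}
```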

## AlphaXiv commands

| Command | Description |
| --- | --- |
| `feynman alpha login` | Sign in to alphaXiv |
| `feynman alpha logout` | Clear alphaXiv auth |
| `feynman alpha status` | Check alphaXiv auth status |
AlphaXiv authentication enables Feynman to search and retrieve papers, access discussion threads, and pull citation metadata. Once authenticated, the alpha tools are available inside the REPL for paper search, Q&A, and code inspection.

## Package management

| Command | Description |
| --- | --- |
| `feynman packages list` | Show core and optional Pi package presets with install status |
| `feynman packages install <preset>` | Install an optional package preset |
| `feynman update [package]` | Update installed packages, or a specific package by name |
Use `feynman packages list` to see which optional packages are available and which are already installed. Pass a specific package name to `feynman update` to update only that package. See Packages for the full list of presets.
```shell
feynman packages install generative-ui
feynman update
feynman update pi-subagents
```

## Utility commands

| Command | Description |
| --- | --- |
| `feynman search status` | Show Pi web-access status and config path |

## Workflow commands

All research workflow slash commands can also be invoked directly from the CLI. Feynman translates them into the corresponding REPL slash command on launch:
| Command | Description |
| --- | --- |
| `feynman deepresearch <topic>` | Run a thorough, source-heavy investigation and produce a research brief with inline citations |
| `feynman lit <topic>` | Run a literature review using paper search and primary-source synthesis |
| `feynman review <artifact>` | Simulate an AI research peer review with likely objections, severity, and a revision plan |
| `feynman audit <item>` | Compare a paper's claims against its public codebase for mismatches and reproducibility risks |
| `feynman replicate <paper>` | Plan or execute a replication workflow for a paper, claim, or benchmark |
| `feynman compare <topic>` | Compare multiple sources and produce a source-grounded agreement/disagreement matrix |
| `feynman draft <topic>` | Turn research findings into a polished paper-style draft |
| `feynman autoresearch <idea>` | Run an autonomous experiment loop: try ideas, measure results, repeat |
| `feynman watch <topic>` | Set up a recurring or deferred research watch on a topic |
```shell
feynman deepresearch "mechanistic interpretability in transformers"
feynman lit "diffusion models for protein folding"
feynman review outputs/my-paper.md
feynman audit 2401.12345
feynman replicate "GPT-4 MMLU benchmark"
feynman compare "LoRA vs full fine-tuning"
feynman draft "scaling laws for retrieval-augmented generation"
feynman watch "llm reasoning benchmarks"
feynman autoresearch "minimize bundle size of my webapp"
```
These are equivalent to launching the REPL and typing the corresponding slash command. The CLI form is useful for scripting and automation.
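Because each workflow is a plain CLI invocation, it composes naturally with shell scripting. A minimal sketch of a batch runner; the topic list is illustrative, and the `echo` keeps it a dry run:

```shell
#!/usr/bin/env bash
# Hypothetical batch runner: queue one literature review per topic.
topics=(
  "diffusion models for protein folding"
  "llm reasoning benchmarks"
)
for t in "${topics[@]}"; do
  # 'echo' makes this a dry run; remove it to invoke feynman for real.
  echo feynman lit "$t"
done
```

Dropping the `echo` runs each review sequentially, which suits cron jobs or CI pipelines.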

## Flags

| Flag | Description |
| --- | --- |
| `--prompt "<text>"` | Run one prompt and exit (one-shot mode) |
| `--model <provider:model>` | Force a specific model for this session |
| `--thinking <level>` | Set the thinking level: `off`, `minimal`, `low`, `medium`, `high`, or `xhigh` |
| `--cwd <path>` | Set the working directory for all file operations |
| `--session-dir <path>` | Set the session storage directory |
| `--new-session` | Start a new persisted session |
| `--alpha-login` | Sign in to alphaXiv and exit |
| `--alpha-logout` | Clear alphaXiv auth and exit |
| `--alpha-status` | Show alphaXiv auth status and exit |
| `--doctor` | Alias for `feynman doctor` |
| `--setup-preview` | Install preview dependencies (pandoc) |

## Thinking levels

The `--thinking` flag (and the `FEYNMAN_THINKING` environment variable) controls how much extended reasoning the model applies before responding. Higher levels produce more thorough analysis at the cost of latency and token usage.
| Level | Description |
| --- | --- |
| `off` | No extended thinking |
| `minimal` | Minimal reasoning pass |
| `low` | Light thinking |
| `medium` | Balanced reasoning and speed (the default) |
| `high` | Deep reasoning for complex research tasks |
| `xhigh` | Maximum reasoning budget |
```shell
# One-shot mode
feynman --prompt "Summarize the key findings in outputs/my-brief.md"

# Force a model for one session
feynman --model anthropic/claude-opus-4-5 --thinking high

# Per-project session isolation
feynman --session-dir ~/projects/myproject/.sessions
```
`--model` accepts both `/` and `:` as separators (`anthropic/claude-sonnet-4-5` and `anthropic:claude-sonnet-4-5` are both valid).
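The environment variable form of the thinking level is convenient for setting a per-shell default instead of passing `--thinking` on every invocation. A sketch (whether the flag takes precedence over the variable when both are set is an assumption, not confirmed by this page):

```shell
# Set a session-wide default thinking level via the environment.
export FEYNMAN_THINKING=high

# Subsequent runs pick up the level without the flag, e.g.:
#   feynman --prompt "Summarize the key findings in outputs/my-brief.md"

echo "$FEYNMAN_THINKING"   # → high
```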
