Overview
Beacon supports extensive configuration through environment variables and runtime parameters. This guide covers all available configuration options for both CLI and server deployments.

Environment Variables
AI Provider Configuration
Beacon supports multiple AI providers for inferring repository capabilities. Configure your preferred provider using environment variables. The default provider is Gemini (gemini-2.5-flash); you can override it with the --provider flag or the provider field in API requests.

Database Configuration
Beacon uses Supabase (PostgreSQL) for tracking runs and payments when using Beacon Cloud.

Redis Configuration
Redis is required for rate limiting when running the Beacon server.

Payment Configuration
When running Beacon Cloud with payment verification enabled, payments are confirmed through on-chain transaction validation. See Payment Verification for implementation details.
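As a sketch, payment settings could be supplied through the environment. The variable names below are illustrative assumptions, not confirmed configuration keys:

```shell
# Hypothetical variable names -- check the Beacon source for the actual keys
export PAYMENT_RECEIVER_ADDRESS="0x0000000000000000000000000000000000000000"  # wallet receiving payments
export PAYMENT_RPC_URL="https://mainnet.base.org"                             # RPC endpoint used for on-chain checks
```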
Beacon Cloud Endpoint
Override the default Beacon Cloud endpoint. This is useful for:

- Self-hosting a Beacon Cloud instance
- Development and testing
- Using a custom AI provider endpoint
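For instance, a self-hosted deployment might point the CLI at its own instance. BEACON_CLOUD_URL is an assumed variable name for this sketch:

```shell
# Hypothetical variable name -- verify against the Beacon documentation
export BEACON_CLOUD_URL="https://beacon.internal.example.com"
```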
Logging Configuration
Beacon uses tracing for structured logging.
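Since the production checklist below recommends tuning RUST_LOG, here is a minimal sketch of the usual tracing filter syntax (the beacon target name is an assumption):

```shell
# Global log level
export RUST_LOG=info

# Per-target filtering (the `beacon` target name is an assumption)
export RUST_LOG=beacon=debug,tower_http=warn
```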
CLI Configuration
Generate Command
| Option | Description | Default |
|---|---|---|
| --output, -o | Output file path | AGENTS.md |
| --provider | AI provider (gemini, claude, openai, beacon-ai-cloud) | gemini |
| --api-key | API key for the chosen provider | From env var |
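Combining the options above, a typical run might look like this (command shape is inferred from the option table; the API key variable name is an assumption):

```shell
# Default provider (Gemini), default output path (AGENTS.md)
beacon generate

# Claude provider with a custom output file; the key variable name is illustrative
beacon generate --provider claude --api-key "$CLAUDE_API_KEY" -o docs/AGENTS.md
```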
Validate Command
| Option | Description | Default |
|---|---|---|
| --check-endpoints | Verify endpoint reachability | false |
| --provider | Use AI-powered validation | none |
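A sketch of how these options might be combined (inferred from the table above):

```shell
# Basic validation
beacon validate

# Also check endpoint reachability and enable AI-powered validation
beacon validate --check-endpoints --provider gemini
```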
Serve Command
| Option | Description | Default |
|---|---|---|
| --port, -p | Server port | 8080 |
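For example (inferred from the option table):

```shell
# Default port 8080
beacon serve

# Custom port
beacon serve --port 3000
```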
Register Command
| Option | Description | Default |
|---|---|---|
| --chain | Blockchain network (base, solana) | base |
| --agency | Agency identifier | none |
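A sketch based on the table above (the agency value is illustrative):

```shell
# Register on the default Base network
beacon register

# Register on Solana with an agency identifier
beacon register --chain solana --agency my-agency
```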
Configuration File
Beacon automatically loads environment variables from a .env file in the current directory:
.env
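A hypothetical .env combining the settings discussed above; every variable name here is an illustrative assumption rather than a confirmed key:

```shell
# Hypothetical .env -- variable names are illustrative
GEMINI_API_KEY=your-gemini-key
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_ROLE_KEY=your-service-role-key
REDIS_URL=redis://localhost:6379
RUST_LOG=info
```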
Docker Configuration
When running Beacon in Docker, pass environment variables via -e flags or an env file:
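For example (the image name and variable names are illustrative):

```shell
# Pass individual variables with -e
docker run -e RUST_LOG=info -e GEMINI_API_KEY="$GEMINI_API_KEY" -p 8080:8080 beacon

# Or load them all from a .env file
docker run --env-file .env -p 8080:8080 beacon
```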
Production Checklist
Before deploying Beacon to production:

- Set an appropriate RUST_LOG level (recommend info or warn)
- Configure Redis with persistence and replication
- Use managed database services (Supabase, Render Postgres)
- Set up monitoring for rate limit metrics
- Rotate API keys regularly
- Enable TLS for Redis connections (rediss://)
- Configure proper CORS headers if needed
- Set resource limits (CPU, memory) in container orchestration
- Enable request logging and distributed tracing
Advanced Tuning
Rate Limit Configuration
Rate limits are hardcoded in src/main.rs:36-37.
AI Provider Models
The AI models are defined in src/inferrer.rs:

- Gemini: gemini-2.5-flash (line 19)
- Claude: claude-sonnet-4-5 (line 97)
- OpenAI: gpt-4o (line 124)
Payment Amount
The default payment amount is 0.09 USDC and can be overridden. The amount is validated in src/verifier.rs with a tolerance of 0.001 USDC.
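If the override is exposed as an environment variable, setting it might look like the following; BEACON_PAYMENT_AMOUNT is a hypothetical name, so check the source for the actual key:

```shell
# Hypothetical variable name -- the real key may differ
export BEACON_PAYMENT_AMOUNT=0.09
```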
Next Steps
- Payment Verification: Deep dive into blockchain payment verification
- Rate Limiting: Redis-based rate limiting implementation
- Custom Deployment: Production deployment strategies