Documentation Index

Fetch the complete documentation index at: https://mintlify.com/cloudwaddie/lmarenabridge/llms.txt

Use this file to discover all available pages before exploring further.

LMArena Bridge is a self-hosted API server that lets you talk to LMArena models — including experimental stealth models — using the same OpenAI API format your existing tools already understand. Point any OpenAI-compatible client at http://localhost:8000/api/v1 and start chatting immediately.
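For example, a minimal chat call needs nothing beyond the standard library. This is a sketch, not the project's own client: the model id and API key below are placeholders — list real model ids via GET /api/v1/models, and use a key you created in the dashboard.

```python
import json
import urllib.request

BRIDGE_URL = "http://localhost:8000/api/v1"

def build_chat_request(prompt: str, model: str, api_key: str) -> urllib.request.Request:
    # Wrap a single user message in the OpenAI chat-completions shape.
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{BRIDGE_URL}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )

def chat(prompt: str, model: str = "gpt-4o", api_key: str = "sk-local") -> str:
    # Requires a running bridge; model id and key are placeholders.
    with urllib.request.urlopen(build_chat_request(prompt, model, api_key)) as resp:
        reply = json.loads(resp.read())
    return reply["choices"][0]["message"]["content"]
```

Any OpenAI SDK works the same way once you set its base URL to http://localhost:8000/api/v1.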

Quickstart

Install LMArena Bridge and make your first API call in minutes.

Authentication

Get your LMArena auth token and configure it in the dashboard.

API Reference

Full documentation for every endpoint: chat, models, health, and more.

Integrations

Connect LMArena Bridge to OpenWebUI and other OpenAI-compatible frontends.

How it works

LMArena Bridge runs a FastAPI server on localhost:8000 that accepts standard OpenAI /v1/chat/completions requests and translates them into LMArena’s internal API format. Because LMArena uses Cloudflare protection and reCAPTCHA v3/v2 challenges, the bridge automates a real browser session in the background to handle those checks transparently.
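As a rough illustration of that translation step, the sketch below maps an OpenAI-style request onto an arena-style payload. The "modelName" and "turns" field names are invented for this example; LMArena's actual internal schema is undocumented here and subject to change.

```python
def to_arena_payload(openai_req: dict) -> dict:
    # Hypothetical mapping: field names are illustrative stand-ins
    # for LMArena's undocumented internal request format.
    return {
        "modelName": openai_req["model"],
        "turns": [
            {"speaker": m["role"], "text": m["content"]}
            for m in openai_req["messages"]
        ],
    }

payload = to_arena_payload({
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hi"}],
})
```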

Transport overview

Learn how the bridge selects between direct httpx, Chrome, Camoufox, and userscript proxy transports.

Configuration

Set auth tokens, API keys, rate limits, and browser window modes via config.json.
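A config.json along these lines covers the settings above. The key names here are illustrative assumptions, not the confirmed schema — see the Configuration page for the exact fields:

```json
{
  "auth_tokens": ["<arena-auth-prod-v1 token>"],
  "api_keys": ["sk-local-example"],
  "rate_limit_per_minute": 60,
  "browser_window_mode": "headless"
}
```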

Production deployment

Error handling, monitoring, security best practices, and debug mode.

Troubleshooting

Common errors and their solutions for token issues, timeouts, and image uploads.

Key features

Implements POST /api/v1/chat/completions and GET /api/v1/models with the same request and response shapes as the OpenAI API, including streaming via Server-Sent Events.
Automatically switches between direct HTTP, headless Chrome, Firefox-based Camoufox, and a browser userscript proxy depending on the model and current challenge state.
Manages multiple arena-auth-prod-v1 tokens in round-robin, detecting expiry and refreshing them via LMArena's HTTP endpoints or Supabase before requests fail.
Accepts vision messages with base64-encoded images, uploads them to LMArena R2 storage, and includes the signed URLs in the request. Supports PNG, JPEG, GIF, WebP, and SVG up to 10 MB.
A built-in web UI at /dashboard lets you manage auth tokens, create and revoke API keys, view per-key usage stats, and trigger manual token refreshes.
LMArena actively deploys Cloudflare checkpoints, captchas, and reCAPTCHA Enterprise. The bridge works around these automatically, but reliability may vary as LMArena updates its defenses.
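The streaming responses mentioned above arrive as standard OpenAI-style Server-Sent Events. A client only has to read `data:` lines and concatenate the content deltas, stopping at the `[DONE]` sentinel — a minimal parser, shown here against a hard-coded sample stream:

```python
import json

def parse_sse_chunks(raw: str):
    """Yield content deltas from an OpenAI-style SSE stream body."""
    for line in raw.splitlines():
        line = line.strip()
        if not line.startswith("data:"):
            continue
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":  # end-of-stream sentinel
            return
        delta = json.loads(payload)["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

sample = (
    'data: {"choices":[{"delta":{"content":"Hel"}}]}\n'
    'data: {"choices":[{"delta":{"content":"lo"}}]}\n'
    "data: [DONE]\n"
)
print("".join(parse_sse_chunks(sample)))  # Hello
```

In real use, `raw` would be the chunked response body from POST /api/v1/chat/completions with `"stream": true`.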