Documentation Index

Fetch the complete documentation index at: https://mintlify.com/cloudwaddie/lmarenabridge/llms.txt

Use this file to discover all available pages before exploring further.

OpenWebUI is a user-friendly web interface for large language models. LMArena Bridge exposes an OpenAI-compatible API at http://localhost:8000/api/v1, which OpenWebUI can use as a connection backend to give you access to all LMArena models through a polished chat interface.
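To make "OpenAI-compatible" concrete, here is a minimal sketch of the request shape any OpenAI-style client (OpenWebUI included) sends to the bridge. Only the standard library is used; the model name "gpt-4o" is purely illustrative, and /chat/completions is the standard OpenAI chat endpoint path appended to the base URL.

```python
import json
import urllib.request

# The bridge's OpenAI-compatible base URL from this guide.
BASE_URL = "http://localhost:8000/api/v1"

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a standard OpenAI-style chat completion request.

    OpenWebUI constructs the same shape internally: the base URL you
    configure plus the /chat/completions route, with a JSON body.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request(BASE_URL, "gpt-4o", "Hello!")
print(req.full_url)  # http://localhost:8000/api/v1/chat/completions
```

Note that the configured base URL already ends in /api/v1; the client appends the endpoint route itself, which is why the base URL in step 5 below must not include /chat/completions.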

Connect OpenWebUI to LMArena Bridge

1. Start LMArena Bridge

Make sure LMArena Bridge is running before opening OpenWebUI:

python -m src.main

The server starts on http://localhost:8000.
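Before moving on, you can confirm something is actually listening at the bridge address. This is a stdlib-only sketch; it probes /api/v1/models, the standard OpenAI model-listing route, though any HTTP response at all is enough to prove the server is up.

```python
import urllib.error
import urllib.request

def bridge_is_running(base_url: str = "http://localhost:8000",
                      timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at the bridge address."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/v1/models", timeout=timeout):
            return True
    except urllib.error.HTTPError:
        # The server answered, even if with an error status: it is running.
        return True
    except (urllib.error.URLError, OSError):
        # Connection refused / timed out: nothing is listening.
        return False

if bridge_is_running():
    print("LMArena Bridge is reachable")
else:
    print("Start the bridge first: python -m src.main")
```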
2. Open OpenWebUI in your browser

Navigate to your OpenWebUI instance in a web browser.
3. Go to the Admin Panel settings

Open your Profile, then select Admin Panel → Settings → Connections.
4. Modify the OpenAI connection

Locate the existing OpenAI connection entry and open it for editing.
5. Set the API base URL

Set the API Base URL to:

http://localhost:8000/api/v1
6. Set the API key

The API Key field can be left empty or set to any value. LMArena Bridge does not use this field for authentication unless you have configured API keys in the dashboard.
7. Save and select a model

Save the connection. All LMArena models will appear in the model selector. Choose one and start chatting.
Always start LMArena Bridge before launching or using OpenWebUI. If OpenWebUI reports a connection error, confirm the bridge is running at http://localhost:8000.
