EcliPanel’s AI feature routes requests through a managed model registry, handling failover across multiple upstream endpoints automatically. Users and organisations are linked to specific models by an administrator. The OpenAI-compatible endpoints let you point standard OpenAI client libraries at EcliPanel without code changes.

Documentation Index
Fetch the complete documentation index at: https://mintlify.com/thenoname-gurl/EcliPanel/llms.txt
Use this file to discover all available pages before exploring further.
All AI endpoints return 503 with {"error":"feature_disabled"} when the ai feature flag is off. Enable the flag in Admin → Settings before using these routes.

Chat
Send a chat message
POST /api/ai/chat
Sends a single message to the AI model assigned to the authenticated user or their organisation. Conversation history can be passed in to maintain context across turns.
Request body parameters:
- The user’s message text.
- Explicit model ID to use. If omitted, the user’s assigned model is used.
- System-level instruction prepended to the conversation.
- Previous conversation turns.

Response:
- The AI model’s response text. If no model is configured, a user-friendly fallback message is returned instead of an error.
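As a sketch, the chat endpoint can be called with only the Python standard library. The field names (message, model, system, history), the bearer-token auth scheme, and the base URL below are assumptions for illustration, not confirmed by this page; check your panel’s actual request schema.

```python
import json
import urllib.request

def build_chat_request(base_url, token, message, model=None, system=None, history=None):
    """Build a POST request for /api/ai/chat (constructed, not yet sent)."""
    body = {"message": message}      # assumed field name for the user's message text
    if model:
        body["model"] = model        # explicit model ID override
    if system:
        body["system"] = system     # system-level instruction
    if history:
        body["history"] = history   # previous conversation turns
    return urllib.request.Request(
        f"{base_url}/api/ai/chat",
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("https://panel.example.com", "TOKEN", "Hello!")
# resp = urllib.request.urlopen(req)
# A 503 with {"error":"feature_disabled"} means the ai feature flag is off.
```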
AI Studio
POST /api/ai/studio
Advanced AI invocation endpoint with extended configuration options for power users. Requires the paid portal tier or higher.
Request body parameters:
- The user’s message text.
- Explicit model ID.
- System prompt override.
- Maximum tokens in the response.
- Sampling temperature (0.0–2.0).
- Conversation history in the same format as /api/ai/chat.

Model discovery
List all models
GET /api/ai/models
Returns all AI models registered in the panel. API keys and endpoint credentials are stripped from the response. Available to all authenticated users.
Response fields (per model):
- Model ID.
- Model display name.
- Descriptive tags assigned by the administrator.
List your accessible models
GET /api/ai/my-models
Returns the models linked to the authenticated user or to any organisation they belong to, including per-link usage limits.
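Both discovery routes are plain authenticated GETs. A minimal sketch, assuming bearer-token auth and a hypothetical base URL:

```python
import json
import urllib.request

def build_get(base_url, token, path):
    """Build an authenticated GET request for a panel API path (not yet sent)."""
    return urllib.request.Request(
        f"{base_url}{path}",
        headers={"Authorization": f"Bearer {token}"},
    )

all_models = build_get("https://panel.example.com", "TOKEN", "/api/ai/models")
my_models = build_get("https://panel.example.com", "TOKEN", "/api/ai/my-models")
# models = json.loads(urllib.request.urlopen(my_models).read())
```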
OpenAI-compatible proxy
These endpoints accept the standard OpenAI request format. Point OpenAI-compatible clients or libraries at your EcliPanel instance by changing the baseURL.
Chat completions
POST /api/ai/openai/v1/chat/completions
Proxies a standard OpenAI chat completions request to the model assigned to the authenticated user. The model field in the request body is replaced with the provider model ID from the panel configuration.
Request body parameters (standard OpenAI field names):
- messages: Conversation messages in OpenAI format.
- model: Optional EcliPanel model ID override. If omitted, the user’s assigned model is used.
- temperature: Sampling temperature.
- max_tokens: Maximum completion tokens.
- stream: Whether to stream the response using server-sent events.
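The request body follows the standard OpenAI chat-completions format; only the base URL and credential are EcliPanel-specific. A sketch that builds such a request with the standard library (the base URL and bearer-token scheme are assumptions):

```python
import json
import urllib.request

def build_completion_request(base_url, token, messages, model=None,
                             temperature=None, max_tokens=None, stream=False):
    """Build a standard OpenAI chat-completions POST for the EcliPanel proxy."""
    body = {"messages": messages, "stream": stream}
    if model is not None:
        body["model"] = model            # optional EcliPanel model ID; the panel
                                         # swaps in the provider model ID server-side
    if temperature is not None:
        body["temperature"] = temperature
    if max_tokens is not None:
        body["max_tokens"] = max_tokens
    return urllib.request.Request(
        f"{base_url}/api/ai/openai/v1/chat/completions",
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

req = build_completion_request(
    "https://panel.example.com", "TOKEN",
    [{"role": "user", "content": "Hello!"}], temperature=0.7)
```

With official OpenAI SDKs, the usual approach is instead to set the client’s base URL (baseURL in the Node library, base_url in the Python library) to https://your-panel/api/ai/openai/v1 and pass your EcliPanel credential as the API key.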
Text completions
POST /api/ai/openai/v1/completions
Proxies an OpenAI-style text completion request (non-chat format).
Request body parameters (standard OpenAI field names):
- prompt: The prompt text to complete.
- model: Optional EcliPanel model ID override.
- max_tokens: Maximum completion tokens.
- temperature: Sampling temperature.
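A minimal request body for this route, using the legacy OpenAI completions field names; the model value shown is a hypothetical placeholder:

```python
import json

# Legacy OpenAI-style text-completion body (non-chat format).
body = {
    "prompt": "Write a haiku about servers.",
    "max_tokens": 64,        # cap on completion length
    "temperature": 0.8,      # sampling temperature
    # "model": "my-ecli-model-id",  # optional EcliPanel model override
}
payload = json.dumps(body).encode()  # POST this to /api/ai/openai/v1/completions
```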