

Every built-in provider ships with a default model. You can change it in two ways.

## One-time override

Pass `--model` directly when launching. The flag is forwarded to the Claude CLI.

```shell
clother-zai --model glm-4.7
clother-alibaba --model kimi-k2.5
clother-ollama --model qwen3-coder
```

The override applies only to that single session; the configured default is unchanged.
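The precedence rule is simple: a `--model` flag wins over the configured default for that one launch. A minimal sketch in plain shell (illustrative only; the function and variable names here are not clother internals):

```shell
# Illustrative sketch of the precedence rule, not clother's actual code:
# an explicit --model flag overrides the configured default for one launch.
resolve_model() {
  configured_default="$1"
  shift
  # Scan the remaining launch arguments for a --model override.
  while [ "$#" -gt 0 ]; do
    case "$1" in
      --model)
        echo "$2"   # flag present: use the one-time override
        return
        ;;
    esac
    shift
  done
  echo "$configured_default"   # no flag: fall back to the configured default
}

resolve_model glm-5                   # -> glm-5 (configured default)
resolve_model glm-5 --model glm-4.7   # -> glm-4.7 (this session only)
```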

## Permanent override

Run `clother config <provider>` and choose a different default model. Clother writes the override to `config.json` and uses it for all future launches of that provider.

```shell
clother config zai
# Choose model:
#   1. glm-5     GLM-5
#   2. glm-4.7   GLM-4.7
# Model [glm-5]: 2
```
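For illustration only, the stored override might look something like the fragment below. The actual `config.json` schema is not documented here, so treat the key names as hypothetical:

```json
{
  "zai": {
    "model": "glm-4.7"
  }
}
```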
To see which model is currently in effect for a provider:

```shell
clother info zai
```

## Provider default models

| Provider | Launcher | Default model | Other available models |
| --- | --- | --- | --- |
| Anthropic (native) | `clother-native` | (Claude default) | |
| Z.AI | `clother-zai` | glm-5 | glm-4.7 |
| Z.AI China | `clother-zai-cn` | glm-5 | glm-4.7 |
| MiniMax | `clother-minimax` | MiniMax-M2.5 | MiniMax-M2.5-highspeed, MiniMax-M2.1, MiniMax-M2.1-highspeed, MiniMax-M2 |
| MiniMax China | `clother-minimax-cn` | MiniMax-M2.5 | MiniMax-M2.5-highspeed, MiniMax-M2.1, MiniMax-M2.1-highspeed, MiniMax-M2 |
| Kimi | `clother-kimi` | kimi-k2.5 | |
| Moonshot AI | `clother-moonshot` | kimi-k2.5 | |
| DeepSeek | `clother-deepseek` | deepseek-chat | |
| Xiaomi MiMo | `clother-mimo` | mimo-v2-flash | |
| Alibaba (Singapore) | `clother-alibaba` | qwen3.5-plus | kimi-k2.5, glm-5, MiniMax-M2.5, qwen3-coder-next, qwen3-coder-plus, qwen3-max-2026-01-23, glm-4.7 |
| Alibaba (US) | `clother-alibaba-us` | qwen3.5-plus | kimi-k2.5, glm-5, MiniMax-M2.5, qwen3-coder-next, qwen3-coder-plus, qwen3-max-2026-01-23, glm-4.7 |
| Alibaba China | `clother-alibaba-cn` | qwen3.5-plus | kimi-k2.5, glm-5, MiniMax-M2.5, qwen3-coder-next, qwen3-coder-plus, qwen3-max-2026-01-23, glm-4.7 |
| VolcEngine | `clother-ve` | doubao-seed-code-preview-latest | |
| Ollama | `clother-ollama` | (specify via `--model`) | any model pulled locally |
| LM Studio | `clother-lmstudio` | (specify via `--model`) | any model loaded in LM Studio |
| llama.cpp | `clother-llamacpp` | (specify via `--model`) | any model served by `llama-server` |

## OpenRouter

OpenRouter launchers follow the `clother-or-<alias>` naming pattern. The model is bound to the alias at configuration time, not at launch time.

1. Run the OpenRouter configurator:

   ```shell
   clother config openrouter
   ```

2. Enter a model ID and alias:

   ```text
   Model ID (empty to stop): moonshotai/kimi-k2.5
   Alias [kimi-k2-5]: kimi-k25
   ```

   Find model IDs at openrouter.ai/models.

3. Use the launcher:

   ```shell
   clother-or-kimi-k25
   ```

To use a different model through the same alias, re-run `clother config openrouter` and provide a new model ID for the same alias name.
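The naming rule above is mechanical: the launcher name is the alias prefixed with `clother-or-`. A trivial sketch (illustrative helper, not part of clother):

```shell
# Illustrative: build the launcher name from an alias, per the
# clother-or-<alias> pattern described above.
alias_to_launcher() {
  printf 'clother-or-%s\n' "$1"
}

alias_to_launcher kimi-k25   # -> clother-or-kimi-k25
```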
If a model doesn't behave as expected with tool calls, try the `:exacto` variant, for example `moonshotai/kimi-k2-0905:exacto`.
