

QuantAgent supports three LLM providers. Each analysis run uses two separate models: an agent_llm for the Indicator, Pattern, and Trend agents, and a graph_llm for the Decision agent that synthesises the final LONG/SHORT trade directive.
The Pattern and Trend agents pass chart images to the model. Your chosen provider and model must support vision (image) input. All default models listed below satisfy this requirement.

Supported providers

OpenAI

Default provider. Uses the OpenAI Chat Completions API.
| Role | Default model |
| --- | --- |
| Agent LLM | gpt-4o-mini |
| Graph LLM | gpt-4o |

Set the API key

Environment variable (recommended)
export OPENAI_API_KEY="sk-..."
Config dict
config = {
    "agent_llm_provider": "openai",
    "graph_llm_provider": "openai",
    "agent_llm_model": "gpt-4o-mini",
    "graph_llm_model": "gpt-4o",
    "api_key": "sk-...",
}
Web UI

Open Settings, select OpenAI, paste your key, and click Save. The key is applied immediately via POST /api/update-api-key.

Obtain a key at platform.openai.com/api-keys.
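The same update can also be scripted against the backend endpoint mentioned above. This is a rough sketch: the endpoint path comes from the text, but the host, port, and JSON field names are assumptions — check the QuantAgent server code for the real schema.

```python
import json
import urllib.request

# Hypothetical request to the key-update endpoint. The path is documented
# above; the base URL and payload field names are guesses.
payload = json.dumps({"provider": "openai", "api_key": "sk-..."}).encode()
req = urllib.request.Request(
    "http://localhost:8000/api/update-api-key",
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # send it with the Web UI server running
```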

Switching providers

Web UI

Open Settings, choose the provider from the dropdown, and click Apply. This sends a POST /api/update-provider request that updates both agent_llm_provider and graph_llm_provider simultaneously and swaps the model names to the defaults for that provider.
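In config terms, that switch amounts to the following. This is a sketch of the state change, not the server's actual code; the helper name is hypothetical, and the Anthropic default model name is taken from the programmatic example in this page.

```python
# Starting state: the OpenAI defaults.
config = {
    "agent_llm_provider": "openai",
    "graph_llm_provider": "openai",
    "agent_llm_model": "gpt-4o-mini",
    "graph_llm_model": "gpt-4o",
}

def switch_provider(cfg: dict, provider: str, default_model: str) -> dict:
    # Hypothetical helper mirroring what /api/update-provider is described
    # as doing: both provider keys flip together, and both models reset
    # to that provider's default.
    return {
        **cfg,
        "agent_llm_provider": provider,
        "graph_llm_provider": provider,
        "agent_llm_model": default_model,
        "graph_llm_model": default_model,
    }

config = switch_provider(config, "anthropic", "claude-haiku-4-5-20251001")
```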

Programmatically

Update the config and call refresh_llms() to rebuild the agent and graph LLMs without recreating the full TradingGraph object:
from trading_graph import TradingGraph

tg = TradingGraph()

# Switch to Anthropic
tg.config["agent_llm_provider"] = "anthropic"
tg.config["graph_llm_provider"] = "anthropic"
tg.config["agent_llm_model"] = "claude-haiku-4-5-20251001"
tg.config["graph_llm_model"] = "claude-haiku-4-5-20251001"
tg.config["anthropic_api_key"] = "sk-ant-..."
tg.refresh_llms()
Or use update_api_key(), which calls refresh_llms() for you:
tg.update_api_key("sk-ant-...", provider="anthropic")

Configuration reference

All provider settings live in default_config.py:
DEFAULT_CONFIG = {
    "agent_llm_model": "gpt-4o-mini",
    "graph_llm_model": "gpt-4o",
    "agent_llm_provider": "openai",   # "openai", "anthropic", or "qwen"
    "graph_llm_provider": "openai",
    "agent_llm_temperature": 0.1,
    "graph_llm_temperature": 0.1,
    "api_key": "sk-",                 # OpenAI key
    "anthropic_api_key": "sk-",       # Anthropic key
    "qwen_api_key": "sk-",            # Qwen / DashScope key
}
You can pass a full or partial override dict to TradingGraph(config=...). Any key not present in your dict falls back to the default.
