QuantAgent ships with a DEFAULT_CONFIG dictionary that controls which LLM models and providers are used across the multi-agent system. You can override any or all of these values by passing a config dict when instantiating TradingGraph.
from trading_graph import TradingGraph

graph = TradingGraph(config={
    "agent_llm_provider": "anthropic",
    "agent_llm_model": "claude-haiku-4-5-20251001",
    "graph_llm_provider": "anthropic",
    "graph_llm_model": "claude-haiku-4-5-20251001",
    "agent_llm_temperature": 0.1,
    "graph_llm_temperature": 0.1,
    "anthropic_api_key": "sk-ant-...",
})
A supplied config replaces DEFAULT_CONFIG entirely rather than merging with it (see Passing config to TradingGraph below), so include every key your run needs, not just the ones you are changing. The defaults are shown below.

Default configuration

DEFAULT_CONFIG = {
    "agent_llm_model": "gpt-4o-mini",
    "graph_llm_model": "gpt-4o",
    "agent_llm_provider": "openai",
    "graph_llm_provider": "openai",
    "agent_llm_temperature": 0.1,
    "graph_llm_temperature": 0.1,
    "api_key": "sk-",
    "anthropic_api_key": "sk-",
    "qwen_api_key": "sk-",
}

Config options

Model selection

agent_llm_model
string
default:"gpt-4o-mini"
The model used by the Indicator, Pattern, and Trend agents. These agents perform focused sub-tasks and can use a lighter, cheaper model. Must support image inputs because Pattern and Trend agents pass chart images. See Model selection for per-provider defaults.
graph_llm_model
string
default:"gpt-4o"
The model used by the Decision agent (graph-level logic). This model synthesizes all sub-agent reports into a final trade decision and should be a more capable model. See Model selection for per-provider defaults.

Provider selection

agent_llm_provider
string
default:"openai"
LLM provider for the agent model. Accepted values: "openai", "anthropic", "qwen".
graph_llm_provider
string
default:"openai"
LLM provider for the graph/decision model. Accepted values: "openai", "anthropic", "qwen".
You can mix providers — for example, use "qwen" for agents and "openai" for graph logic. Each provider is initialized with its own API key.
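One way to mix providers safely is to spread a copy of the defaults into your override dict, so every key is present regardless of how the config is consumed. This is a sketch: the DEFAULT_CONFIG literal is inlined from the reference above so the snippet stands alone, and "qwen-vl-max" is only an example of a vision-capable Qwen model; check your provider's current model list.

```python
# DEFAULT_CONFIG inlined from the reference above so this snippet stands alone.
DEFAULT_CONFIG = {
    "agent_llm_model": "gpt-4o-mini",
    "graph_llm_model": "gpt-4o",
    "agent_llm_provider": "openai",
    "graph_llm_provider": "openai",
    "agent_llm_temperature": 0.1,
    "graph_llm_temperature": 0.1,
    "api_key": "sk-",
    "anthropic_api_key": "sk-",
    "qwen_api_key": "sk-",
}

# Start from the defaults, then override only the agent-side keys:
# a Qwen vision model for the sub-agents, OpenAI for the Decision agent.
config = {
    **DEFAULT_CONFIG,
    "agent_llm_provider": "qwen",
    "agent_llm_model": "qwen-vl-max",  # example; must support image inputs
    "qwen_api_key": "sk-...",
    "api_key": "sk-...",  # still needed: the graph/decision model runs on OpenAI
}
```

The resulting dict can be passed as `TradingGraph(config=config)` exactly as in the example above.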

Temperature

agent_llm_temperature
number
default:"0.1"
Sampling temperature for agent LLM responses. Kept low (0.1) by default to produce consistent, deterministic technical analysis. Valid range: 0.0 to 2.0.
graph_llm_temperature
number
default:"0.1"
Sampling temperature for graph/decision LLM responses. Kept low (0.1) by default. Valid range: 0.0 to 2.0.
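An out-of-range temperature would typically only surface as a provider API error at request time, so it can help to check values up front. A minimal sketch; validate_temperature is a hypothetical helper, not part of QuantAgent:

```python
def validate_temperature(value: float) -> float:
    # The docs specify a valid range of 0.0 to 2.0.
    if not 0.0 <= value <= 2.0:
        raise ValueError(f"temperature {value} outside the valid range 0.0-2.0")
    return value

# Low temperature for deterministic sub-agent analysis, slightly higher
# for the decision-level synthesis.
overrides = {
    "agent_llm_temperature": validate_temperature(0.1),
    "graph_llm_temperature": validate_temperature(0.3),
}
```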

API keys

api_key
string
Your OpenAI API key. If omitted or set to the placeholder "sk-", QuantAgent falls back to the OPENAI_API_KEY environment variable. See API keys.
anthropic_api_key
string
Your Anthropic API key. Falls back to the ANTHROPIC_API_KEY environment variable. Required only when agent_llm_provider or graph_llm_provider is "anthropic". See API keys.
qwen_api_key
string
Your Qwen/DashScope API key. Falls back to the DASHSCOPE_API_KEY environment variable. Required only when agent_llm_provider or graph_llm_provider is "qwen". See API keys.

Passing config to TradingGraph

TradingGraph accepts an optional config dict. When provided, it replaces the entire default config rather than merging with it, so supply every key your run needs, not just the ones you want to customize:
from trading_graph import TradingGraph

config = {
    "agent_llm_model": "gpt-4o-mini",
    "graph_llm_model": "gpt-4o",
    "agent_llm_provider": "openai",
    "graph_llm_provider": "openai",
    "agent_llm_temperature": 0.1,
    "graph_llm_temperature": 0.1,
    "api_key": "sk-...",
}

trading_graph = TradingGraph(config=config)
When config is None, TradingGraph uses a copy of DEFAULT_CONFIG:
# Uses DEFAULT_CONFIG
trading_graph = TradingGraph()
