DeepWiki Open is configured entirely through environment variables and JSON config files. Set these variables in a .env file at the project root or pass them directly to your Docker container. All variables are optional unless explicitly marked as required for a specific feature.
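As a starting point, a minimal .env for the default setup (OpenAI for both text generation and embeddings) might look like the sketch below. The variable names follow the conventions described in the sections that follow; all values shown are placeholders.

```shell
# Example .env at the project root — values are placeholders.
# Only set keys for the providers you actually use.
OPENAI_API_KEY=sk-your-key-here        # text generation and the default embedder
PORT=8001                              # API server port (default)
SERVER_BASE_URL=http://localhost:8001  # must match PORT so the frontend can reach the backend
```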
API keys
Each model provider and embedding provider requires its own API key. You only need the keys for the providers you intend to use.

GOOGLE_API_KEY
API key for Google Gemini models and Google AI embeddings. Required when using the google provider for text generation or when DEEPWIKI_EMBEDDER_TYPE=google. Obtain from Google AI Studio.

OPENAI_API_KEY
API key for OpenAI models and embeddings. Required when using the openai provider for text generation or when DEEPWIKI_EMBEDDER_TYPE=openai (the default embedder). Also required when using OpenAI-compatible embedding endpoints via OPENAI_BASE_URL. Obtain from the OpenAI Platform.

OPENROUTER_API_KEY
API key for OpenRouter, which provides access to models from OpenAI, Anthropic, Google, Meta, Mistral, and more through a single API. Required only when using the openrouter provider. Obtain from openrouter.ai.

AZURE_OPENAI_API_KEY
API key for your Azure OpenAI resource. Required when using the azure provider. Must be combined with AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_VERSION. Obtain from the Azure Portal.

AZURE_OPENAI_ENDPOINT
The endpoint URL for your Azure OpenAI resource (e.g. https://my-resource.openai.azure.com/). Required when using the azure provider.

AZURE_OPENAI_VERSION
The Azure OpenAI API version string (e.g. 2024-02-01). Required when using the azure provider.

AWS_ACCESS_KEY_ID
AWS access key ID for authenticating with Amazon Bedrock. Required for Bedrock unless you are using instance-level or role-based IAM credentials.

AWS_SECRET_ACCESS_KEY
AWS secret access key paired with AWS_ACCESS_KEY_ID. Required for Bedrock when not using instance or role-based credentials.

AWS_SESSION_TOKEN
AWS session token for temporary STS credentials. Required only when using short-lived credentials obtained via AssumeRole or similar mechanisms.

AWS_REGION
AWS region where Bedrock is accessed. Defaults to us-east-1 when not set.

AWS_ROLE_ARN
ARN of an IAM role to assume before calling Bedrock. When set, the Bedrock client calls STS AssumeRole automatically. Leave unset if your credentials already have the necessary permissions.

OLLAMA_HOST
Base URL of the Ollama server. Defaults to http://localhost:11434. Set this only when your Ollama instance runs on a remote host or a non-default port.

Embedder configuration
DEEPWIKI_EMBEDDER_TYPE
Selects the embedding provider used to create vector representations of repository code. Accepted values are openai, google, ollama, and bedrock. Defaults to openai. Changing this value after a repository has already been indexed requires re-generating the embeddings, because different models produce incompatible vector spaces.

OPENAI_BASE_URL
Overrides the base URL for OpenAI API calls. Use this to point the OpenAI client at a private or enterprise endpoint, or at any OpenAI-compatible third-party service (such as Alibaba Qwen). When set, the same base URL is also used by the embedder if DEEPWIKI_EMBEDDER_TYPE=openai. Example: https://custom-api-endpoint.com/v1

Server configuration
PORT
Port that the API server listens on. Defaults to 8001. If you change this, update SERVER_BASE_URL to match so the frontend can reach the backend.

SERVER_BASE_URL
Full base URL the frontend uses to reach the API. Defaults to http://localhost:8001. Update this whenever you change PORT or deploy the API to a remote host.

Authorization
DEEPWIKI_AUTH_MODE
Enables authorization mode when set to true or 1. In this mode the frontend displays an input field for a secret code before allowing wiki generation. Defaults to false.

DEEPWIKI_AUTH_CODE
The secret code users must supply when DEEPWIKI_AUTH_MODE is enabled. Has no effect when auth mode is disabled.

Logging
LOG_LEVEL
Controls the verbosity of application logs. Accepted values are DEBUG, INFO, WARNING, ERROR, and CRITICAL. Defaults to INFO.

LOG_FILE_PATH
Path where log output is written. Defaults to api/logs/application.log. The application enforces that this path resides within the project's api/logs directory to prevent path traversal. When running via Docker Compose, api/logs is bind-mounted to ./api/logs on the host so logs persist across container restarts.

Configuration directory
DEEPWIKI_CONFIG_DIR
Path to a directory containing custom generator.json, embedder.json, and repo.json config files. When not set, DeepWiki looks for these files in the api/config/ directory bundled with the source code. Set this when you want to supply your own model or embedder configuration without modifying the source.
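To tie these pieces together, here is a hedged sketch of passing the variables directly to a Docker container with a custom config directory. The image name, container paths, and key values are illustrative assumptions, not the canonical invocation.

```shell
# Illustrative docker run invocation — image name, paths, and key values
# are placeholders; adjust them to your deployment.
docker run -p 8001:8001 \
  -e OPENAI_API_KEY=sk-your-key-here \
  -e DEEPWIKI_EMBEDDER_TYPE=openai \
  -e DEEPWIKI_CONFIG_DIR=/app/custom-config \
  -v "$(pwd)/my-config:/app/custom-config" \
  ghcr.io/asyncfuncai/deepwiki-open:latest
```

The bind mount makes your local my-config directory (containing generator.json, embedder.json, and repo.json) visible inside the container at the path DEEPWIKI_CONFIG_DIR points to.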