Docker is the fastest way to run DeepWiki Open. A single image bundles the Python API server and the Next.js frontend together, so you only need to supply your API keys and run one command. All cloned repositories, embeddings, and cached wiki content are stored in ~/.adalflow on your host machine, so your data persists across container restarts.
There are three ways to run it in Docker: Docker Compose (recommended), a direct docker run against the pre-built image, or building the image locally.
Docker Compose is the recommended approach. It reads your API keys from a .env file and handles port mapping, volume mounts, and health checks automatically.
Create a .env file in the project root with your API keys:
.env
```
GOOGLE_API_KEY=your_google_api_key
OPENAI_API_KEY=your_openai_api_key

# Optional: Use Google AI embeddings instead of OpenAI
DEEPWIKI_EMBEDDER_TYPE=google

# Optional: OpenRouter models
OPENROUTER_API_KEY=your_openrouter_api_key

# Optional: Azure OpenAI models
AZURE_OPENAI_API_KEY=your_azure_openai_api_key
AZURE_OPENAI_ENDPOINT=your_azure_openai_endpoint
AZURE_OPENAI_VERSION=your_azure_openai_version

# Optional: External Ollama server
OLLAMA_HOST=your_ollama_host
```
Start the stack:
```bash
docker-compose up
```
Docker Compose will build the image (first run only), start the container, and expose the API on port 8001 and the frontend on port 3000. Open http://localhost:3000 once the health check passes.

The docker-compose.yml pre-configures the ~/.adalflow volume mount for data persistence and binds ./api/logs so log files survive restarts.
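The repository's own docker-compose.yml is authoritative; as a rough sketch of what it wires together (service name and the container-side log path here are illustrative assumptions), the relevant parts look like:

```yaml
# Illustrative sketch only -- see the project's docker-compose.yml for the real file.
services:
  deepwiki:
    image: ghcr.io/asyncfuncai/deepwiki-open:latest
    ports:
      - "8001:8001"   # Python API server
      - "3000:3000"   # Next.js frontend
    env_file:
      - .env          # API keys are read from here
    volumes:
      - ~/.adalflow:/root/.adalflow   # repos, embeddings, wiki cache persist on the host
      - ./api/logs:/app/api/logs      # log files survive restarts (container path assumed)
```

The env_file entry is why the .env file from the previous step is all the configuration you need.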
Pull the pre-built image from GitHub Container Registry and run it directly with environment variables:
```bash
# Pull the image
docker pull ghcr.io/asyncfuncai/deepwiki-open:latest

# Run the container
docker run -p 8001:8001 -p 3000:3000 \
  -e GOOGLE_API_KEY=your_google_api_key \
  -e OPENAI_API_KEY=your_openai_api_key \
  -e OPENROUTER_API_KEY=your_openrouter_api_key \
  -e OLLAMA_HOST=your_ollama_host \
  -e AZURE_OPENAI_API_KEY=your_azure_openai_api_key \
  -e AZURE_OPENAI_ENDPOINT=your_azure_openai_endpoint \
  -e AZURE_OPENAI_VERSION=your_azure_openai_version \
  -v ~/.adalflow:/root/.adalflow \
  ghcr.io/asyncfuncai/deepwiki-open:latest
```
Only include the -e flags for the providers you actually use. At minimum you need one of GOOGLE_API_KEY or OPENAI_API_KEY.
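For example, a minimal invocation using only a Google API key, with the persistence mount, might look like this (a sketch, not the only valid form):

```bash
# Minimal run: one provider key plus the persistence mount.
docker run -p 8001:8001 -p 3000:3000 \
  -e GOOGLE_API_KEY=your_google_api_key \
  -v ~/.adalflow:/root/.adalflow \
  ghcr.io/asyncfuncai/deepwiki-open:latest
```

You can add further -e flags later without losing data, since the volume mount keeps embeddings and cached wikis on the host.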
The -v ~/.adalflow:/root/.adalflow mount is what makes your data persist between container runs. DeepWiki writes three categories of data to this path:
| Path | Contents |
| --- | --- |
| `~/.adalflow/repos/` | Cloned repository source code |
| `~/.adalflow/databases/` | Embeddings and vector indexes |
| `~/.adalflow/wikicache/` | Cached generated wiki pages |
Without this volume mount the container starts fresh on every run — all previously generated wikis and their embeddings are lost when the container stops.
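A quick way to convince yourself the mount is working: create a marker file on the host side and check that it appears inside the container. This is just a sanity-check sketch; the marker filename is arbitrary.

```bash
# Sanity check: the host side of the bind mount is ~/.adalflow.
# Anything written here is visible at /root/.adalflow inside the container.
mkdir -p "$HOME/.adalflow"
touch "$HOME/.adalflow/.persistence-check"
ls "$HOME/.adalflow/.persistence-check"
```

With the container running, `docker exec <container> ls /root/.adalflow` should show the same marker file; if it does not, the -v flag was dropped or points at the wrong path.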
Embeddings are model-specific. If you switch DEEPWIKI_EMBEDDER_TYPE from openai to google (or to ollama), the existing vectors in ~/.adalflow/databases/ are incompatible with the new model and must be regenerated. Delete the databases directory or re-generate wikis for any repository you want to keep using.
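One way to do that reset is sketched below. Stop the container first; `ADALFLOW_DIR` is a hypothetical override used only so this snippet can be pointed at a scratch directory for a dry run, while DeepWiki itself always uses ~/.adalflow.

```bash
# Sketch: clear model-specific embeddings after changing DEEPWIKI_EMBEDDER_TYPE.
# Stop the container first so nothing is writing to the databases directory.
# ADALFLOW_DIR is a hypothetical override for this script only.
ADALFLOW_DIR="${ADALFLOW_DIR:-$HOME/.adalflow}"
rm -rf "$ADALFLOW_DIR/databases"
echo "Cleared $ADALFLOW_DIR/databases; embeddings rebuild on the next wiki generation."
```

Note that repos/ and wikicache/ are left untouched, so cloned source and cached pages survive; only the vector indexes are rebuilt against the new embedder.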