Quick Start Guide
This guide will walk you through setting up Genie Helper on your VPS. The platform is designed to run on a self-hosted server with all services managed via PM2.

Prerequisites:
- Ubuntu/Debian VPS with at least 8GB RAM
- Node.js 18+ installed
- Docker and Docker Compose installed
- Redis server running
- Nginx or similar reverse proxy
Installation Steps
Configure Environment Variables
Create a .env file in the project root with the required credentials:
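The exact keys are not listed in this guide; as an illustration only, a .env might look like the sketch below. Every variable name here is a placeholder assumption — replace it with the real configuration for your deployment.

```
# Placeholder variable names — NOT the real keys for this deployment
DIRECTUS_URL=http://localhost:8055
REDIS_URL=redis://127.0.0.1:6379
OLLAMA_BASE_URL=http://localhost:11434
```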
Configure MCP Servers

The MCP server configuration is located at storage/plugins/anythingllm_mcp_servers.json. It should auto-boot on startup via the patched server/utils/boot/index.js.

Verify the three MCP servers are configured:

- Directus MCP — scripts/directus-mcp-server.mjs
- Ollama MCP — scripts/ollama-mcp-server.mjs
- Stagehand MCP — scripts/stagehand-mcp-server.mjs
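The file generally follows the common MCP "mcpServers" shape (launching each .mjs script with node). A sketch — the exact keys, environment entries, and any extra options are assumptions, not copied from this deployment:

```json
{
  "mcpServers": {
    "directus": {
      "command": "node",
      "args": ["scripts/directus-mcp-server.mjs"]
    },
    "ollama": {
      "command": "node",
      "args": ["scripts/ollama-mcp-server.mjs"]
    },
    "stagehand": {
      "command": "node",
      "args": ["scripts/stagehand-mcp-server.mjs"]
    }
  }
}
```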
Install Ollama Models
Pull the required Ollama models for AI operations:
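The full model list isn't reproduced in this guide; the general pattern is shown below, with a placeholder model name to substitute for each model your deployment requires:

```shell
# <model-name> is a placeholder — substitute each required model
ollama pull <model-name>

# Confirm which models are installed
ollama list
```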
The scout-fast-tag:latest model is a custom SmolLM model for taxonomy classification. You'll need to build this separately if using the taxonomy features.

Build the Dashboard
Build the React SPA for production. The build output will be in dashboard/dist/ and should be served by your web server.
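Assuming a standard npm workflow for the SPA (the actual script names in this repo may differ):

```shell
cd dashboard
npm install      # only needed on first build or after dependency changes
npm run build    # emits the production bundle to dashboard/dist/
```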
Verify Installation

Check that all services are running:

| Service | Port | Status |
|---|---|---|
| anything-llm | 3001 | online |
| agentx-cms | 8055 | online |
| stagehand-server | 3002 | online |
| genie-dashboard | 3100 | online |
| media-worker | — | online |
| anything-collector | — | online |
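Since all services run under PM2, the table above corresponds to PM2's process list:

```shell
pm2 status   # alias of `pm2 list`; shows each process name, status, CPU and memory
```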
Access the Platform
Creator Dashboard
Access the main dashboard at https://yourdomain.com/app/

Admin Panel
Access the admin panel at https://yourdomain.com/admin

Directus CMS
Direct access at http://localhost:8055/admin

AnythingLLM
Direct access at http://localhost:3001

Default Credentials
| Service | Username | Password |
|---|---|---|
| Directus | admin@geniehelper.com | password |
| AnythingLLM | poweradmin@geniehelper.com | (MY)P@$$w3rd |
38KEHYS-NVPMBSX-GVVJNYH-VQHAN9S
Common Operations
Restart All Services
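With every service managed by PM2, a full restart is:

```shell
pm2 restart all
```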
Rebuild Dashboard After Code Changes
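Assuming the same npm build script as in the installation step (script names may differ):

```shell
cd dashboard && npm run build   # new output in dashboard/dist/ is picked up by the web server
```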
Restart AnythingLLM After Server Changes
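Using the PM2 process name from the service table above:

```shell
pm2 restart anything-llm
```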
View Service Logs
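Logs are available per service through PM2:

```shell
pm2 logs <service-name>              # stream logs for one service
pm2 logs <service-name> --lines 200  # include recent history
```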
Monitor Service Status
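PM2 provides both a live monitor and a one-shot overview:

```shell
pm2 monit    # live CPU/memory dashboard per process
pm2 status   # one-shot process table
```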
Troubleshooting
Services won't start
Check the logs with pm2 logs <service-name> to identify the error. Common issues:

- Missing environment variables in .env
- Port conflicts (check if ports 3001, 8055, 3002, 3100 are available)
- Redis not running (required for BullMQ)
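Two quick checks for the last two issues, using standard Debian/Ubuntu tooling:

```shell
# Is anything already listening on the platform's ports?
ss -ltnp | grep -E ':(3001|8055|3002|3100)'

# Is Redis answering? Should print PONG.
redis-cli ping
```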
MCP servers not loading
Verify that storage/plugins/anythingllm_mcp_servers.json exists and contains valid JSON. Check AnythingLLM logs for MCP boot messages.

Nginx 502 errors
Ensure all upstream services are running:
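For example (ports taken from the service table above; adjust the paths if your apps expose a dedicated health route):

```shell
pm2 status
curl -s -o /dev/null -w '%{http_code}\n' http://localhost:3001/   # anything-llm
curl -s -o /dev/null -w '%{http_code}\n' http://localhost:8055/   # agentx-cms
curl -s -o /dev/null -w '%{http_code}\n' http://localhost:3002/   # stagehand-server
curl -s -o /dev/null -w '%{http_code}\n' http://localhost:3100/   # genie-dashboard
```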
Ollama models not responding
Verify Ollama is running. If models are missing, re-run the ollama pull commands from Step 5 (Install Ollama Models).
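A quick liveness check, assuming Ollama on its default port 11434:

```shell
ollama list                               # installed models
curl -s http://localhost:11434/api/tags   # same list via the HTTP API
```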
Next Steps

Now that your services are running:

Review Architecture
Learn how services communicate in the Architecture guide
