
Quick Start Guide

This guide will walk you through setting up Genie Helper on your VPS. The platform is designed to run on a self-hosted server with all services managed via PM2.
Prerequisites:
  • Ubuntu/Debian VPS with at least 8GB RAM
  • Node.js 18+ installed
  • Docker and Docker Compose installed
  • Redis server running
  • Nginx or similar reverse proxy

Installation Steps

Step 1: Clone the Repository

Clone the Genie Helper source code to your server:
git clone <repository-url> agentx
cd agentx

Step 2: Install Dependencies

Install Node.js dependencies for all services:
# Install main server dependencies
npm install

# Install dashboard dependencies
cd dashboard
npm install
cd ..

# Install media worker dependencies
cd media-worker
npm install
cd ..

Step 3: Configure Environment Variables

Create a .env file in the project root with the required credentials:
# Credentials encryption key (base64 of 32 bytes)
CREDENTIALS_ENC_KEY_B64=<your-base64-key>

# Directus admin token
DIRECTUS_ADMIN_TOKEN=<your-directus-token>

# RBAC sync webhook secret
RBAC_SYNC_WEBHOOK_SECRET=<your-webhook-secret>

# AnythingLLM configuration
SERVER_PORT=3001
STORAGE_DIR=./storage

# Directus configuration
DIRECTUS_URL=http://localhost:8055

# Ollama configuration
OLLAMA_BASE_URL=http://localhost:11434
Generate a secure 32-byte key for CREDENTIALS_ENC_KEY_B64:
node -e "console.log(require('crypto').randomBytes(32).toString('base64'))"
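As a sanity check (and as an alternative generator if Node.js is not installed on the machine where you create keys), openssl produces an equivalent value. Either way, the decoded key must be exactly 32 bytes:

```shell
# Generate a key with openssl and confirm its decoded length
KEY=$(openssl rand -base64 32)
printf '%s' "$KEY" | base64 -d | wc -c   # should print 32
```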

Step 4: Configure MCP Servers

The MCP server configuration lives at storage/plugins/anythingllm_mcp_servers.json and is auto-loaded on startup by the patched server/utils/boot/index.js. Verify that the three MCP servers are configured:
  • Directus MCP: scripts/directus-mcp-server.mjs
  • Ollama MCP: scripts/ollama-mcp-server.mjs
  • Stagehand MCP: scripts/stagehand-mcp-server.mjs
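For reference, a minimal sketch of what that file typically looks like. The exact schema is defined by AnythingLLM, so treat the field names ("mcpServers", "command", "args") as assumptions and check them against the file shipped with your installation:

```json
{
  "mcpServers": {
    "directus": {
      "command": "node",
      "args": ["scripts/directus-mcp-server.mjs"]
    },
    "ollama": {
      "command": "node",
      "args": ["scripts/ollama-mcp-server.mjs"]
    },
    "stagehand": {
      "command": "node",
      "args": ["scripts/stagehand-mcp-server.mjs"]
    }
  }
}
```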

Step 5: Install Ollama Models

Pull the required Ollama models for AI operations:
ollama pull dolphin3:8b-llama3.1-q4_K_M
ollama pull dolphin-mistral:7b
ollama pull qwen-2.5:latest
ollama pull phi-3.5:latest
ollama pull llama3.2:3b
ollama pull bge-m3:latest
The scout-fast-tag:latest model is a custom SmolLM model for taxonomy classification. You’ll need to build this separately if using the taxonomy features.
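To confirm that every required model actually landed, a small helper can diff the required list against what Ollama reports as installed. This is a sketch, not part of the platform; it assumes model names appear in the first column of `ollama list` output:

```shell
# Print any required model that is absent from an installed-models list.
# $1: newline-separated installed model names
# $2: space-separated required model names
missing_models() {
  for m in $2; do
    echo "$1" | grep -qx "$m" || echo "missing: $m"
  done
}

# Usage against a live Ollama install (first column, header row skipped):
# missing_models "$(ollama list | awk 'NR>1 {print $1}')" \
#   "dolphin3:8b-llama3.1-q4_K_M dolphin-mistral:7b qwen-2.5:latest phi-3.5:latest llama3.2:3b bge-m3:latest"
```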

Step 6: Build the Dashboard

Build the React SPA for production:
cd dashboard
npm run build
cd ..
The build output will be in dashboard/dist/ and should be served by your web server.

Step 7: Configure Nginx Reverse Proxy

Set up Nginx to proxy requests to the various services:
# SPA fallback for React Router
error_page 404 =200 /index.html;

# Directus API proxy
location /api/directus/ {
    proxy_pass http://127.0.0.1:8055/;
    proxy_set_header Host $host;
}

# AnythingLLM proxy (with WebSocket support)
location /api/llm/ {
    proxy_pass http://127.0.0.1:3001/;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_set_header Host $host;
    proxy_read_timeout 86400;
}
Plesk Users: If using Plesk, add these rules via Domains → Apache & nginx Settings → Additional nginx directives. NEVER add location / — Plesk generates it automatically.

Step 8: Start Services with PM2

Use PM2 to manage all services:
# Start AnythingLLM
pm2 start npm --name "anything-llm" -- start

# Start Directus CMS
pm2 start npm --name "agentx-cms" --cwd ./cms -- start

# Start Stagehand server
pm2 start npm --name "stagehand-server" --cwd ./stagehand -- start

# Start Dashboard (serve static files)
pm2 start npx --name "genie-dashboard" -- serve dashboard/dist -l 3100

# Start Media Worker
pm2 start npm --name "media-worker" --cwd ./media-worker -- start

# Start Document Collector
pm2 start npm --name "anything-collector" -- run collector

# Save PM2 process list
pm2 save
pm2 startup

Verify Installation

Check that all services are running:
pm2 status
You should see the following services:
Service              Port   Status
anything-llm         3001   online
agentx-cms           8055   online
stagehand-server     3002   online
genie-dashboard      3100   online
media-worker         -      online
anything-collector   -      online

Access the Platform

Creator Dashboard

Access the main dashboard at https://yourdomain.com/app/

Admin Panel

Access the admin panel at https://yourdomain.com/admin

Directus CMS

Direct access at http://localhost:8055/admin

AnythingLLM

Direct access at http://localhost:3001

Default Credentials

Change these credentials before public launch!
Service       Email                        Password
Directus      admin@geniehelper.com        password
AnythingLLM   poweradmin@geniehelper.com   (MY)P@$$w3rd
AnythingLLM API Key: 38KEHYS-NVPMBSX-GVVJNYH-VQHAN9S

Common Operations

Restart All Services

pm2 restart all

Rebuild Dashboard After Code Changes

cd dashboard && npm run build

Restart AnythingLLM After Server Changes

pm2 restart anything-llm

View Service Logs

# View AnythingLLM logs
pm2 logs anything-llm --lines 50

# View Media Worker logs
pm2 logs media-worker --lines 50

# View all logs
pm2 logs

Monitor Service Status

# Real-time monitoring
pm2 monit

# Service metrics
pm2 status

Troubleshooting

If a service fails to start, check the logs with pm2 logs <service-name> to identify the error. Common issues:
  • Missing environment variables in .env
  • Port conflicts (check that ports 3001, 8055, 3002, and 3100 are free)
  • Redis not running (required for BullMQ)
If MCP tools are unavailable, verify that storage/plugins/anythingllm_mcp_servers.json exists and contains valid JSON, then check the AnythingLLM logs for MCP boot messages.
If the dashboard or API returns errors, ensure all upstream services are running:
curl http://localhost:3001  # AnythingLLM
curl http://localhost:8055  # Directus
If AI operations fail, verify that Ollama is running and the models are installed:
ollama list
curl http://localhost:11434/api/tags
If models are missing, re-run ollama pull commands from Step 5.
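The individual curl checks above can be rolled into one sweep. A minimal sketch, with endpoints taken from this guide (adjust to your ports):

```shell
# Report UP/DOWN for each local service endpoint (2s timeout per check)
check_service() {
  if curl -sf -o /dev/null --max-time 2 "$1"; then
    echo "UP   $1"
  else
    echo "DOWN $1"
  fi
}

check_service http://localhost:3001            # AnythingLLM
check_service http://localhost:8055            # Directus
check_service http://localhost:11434/api/tags  # Ollama
```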

Next Steps

Now that your services are running:
  1. Review Architecture: learn how services communicate in the Architecture guide.
  2. Connect Platforms: link your OnlyFans and other creator platforms via the dashboard.
  3. Configure AI Agent: customize the AI workspace and system prompts in AnythingLLM.
