This guide walks you through cloning the repository, configuring your environment variables, and launching the complete NextAudit AI stack. By the end you will have all services running locally and the FleetDM, Flowise, and n8n dashboards accessible in your browser.

## Documentation Index
Fetch the complete documentation index at: https://mintlify.com/Kevin2523/nextAuditAi/llms.txt
Use this file to discover all available pages before exploring further.
You need Docker and Docker Compose installed before proceeding. The stack includes eight containers, so ensure your machine has at least 8 GB of available RAM.
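A quick way to confirm both prerequisites are available from a terminal (these commands assume the Compose v2 plugin; older installs ship a separate `docker-compose` binary instead):

```shell
# Confirm Docker and the Compose plugin are installed and on PATH
docker --version
docker compose version
```

If either command fails, install Docker Desktop or the Docker Engine packages for your platform before continuing.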
## Configure environment variables
Copy the environment variable template for your target environment. Three templates are provided, one for each deployment tier. These files live in `src/ai-sentinel/` alongside the compose files.

Note: the `.env.example` template files are distributed on the `develop` branch. Switch to `develop` to access them, or create `src/ai-sentinel/.env` manually using the Environment variables reference as a guide.

Open the copied `.env` file and fill in the required values. Key variables to set include:

| Variable | Description |
|---|---|
| `POSTGRES_USER` / `POSTGRES_PASSWORD` / `POSTGRES_DB` | PostgreSQL credentials for the AI layer |
| `MYSQL_USER` / `MYSQL_PASSWORD` / `MYSQL_DATABASE` | MySQL credentials for FleetDM |
| `FLEET_SERVER_PRIVATE_KEY` | Generate with `openssl rand -base64 32` |
| `OLLAMA_MODELS` | Comma-separated list of models to pull on startup |
| `FLOWISE_PORT` / `N8N_PORT` / `OLLAMA_PORT` | Host ports for each service dashboard |
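For example, the `FLEET_SERVER_PRIVATE_KEY` value can be generated exactly as the table describes. A minimal sketch, printing the line to paste into your `.env`:

```shell
# Generate a 32-byte random key, base64-encoded, for FLEET_SERVER_PRIVATE_KEY
KEY=$(openssl rand -base64 32)
echo "FLEET_SERVER_PRIVATE_KEY=$KEY"
```

Copy the printed line into `src/ai-sentinel/.env`, or redirect the `echo` output with `>>` to append it directly.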
## Launch the development stack
Start all services in detached mode using the development compose file. For other environments, substitute the appropriate compose file.
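A sketch of the launch commands, assuming compose files named `docker-compose.dev.yml`, `docker-compose.test.yml`, and `docker-compose.prod.yml` under `src/ai-sentinel/` (the actual filenames may differ in the repository):

```shell
cd src/ai-sentinel

# Development: builds ollama and postgres from local Dockerfiles
docker compose -f docker-compose.dev.yml up -d

# Test or production: substitute the appropriate compose file, e.g.
# docker compose -f docker-compose.test.yml up -d
# docker compose -f docker-compose.prod.yml up -d
```

The first development launch takes longer because the local images are built and any models listed in `OLLAMA_MODELS` are pulled.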
In development, the `ollama` and `postgres` services are built from local Dockerfiles under `src/ai-sentinel/ollama/` and `src/ai-sentinel/postgres/`. In test and production, versioned images from `jjsotom2k4/ollama-ai` and `jjsotom2k4/postgres-ai` are used instead.

## Verify all services are running
Check that all eight containers started successfully. You should see the following services with a `running` or `healthy` status:

| Service | Role |
|---|---|
| `fleet` | FleetDM API and UI |
| `fleet-init` | One-time volume permission setup (exits after completion) |
| `mysql` | FleetDM database |
| `redis` | FleetDM session cache |
| `ollama` | Local LLM inference |
| `postgres` | AI layer database with pgvector |
| `flowise` | AI agent flow builder |
| `n8n` | Workflow automation engine |
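One way to check the status column for all services at once, assuming the same `docker-compose.dev.yml` filename as above:

```shell
cd src/ai-sentinel

# Show container states; services should report running or healthy
docker compose -f docker-compose.dev.yml ps

# Note: fleet-init performs one-time setup and exits, so it may appear
# as "exited (0)" rather than running; that is expected.
```

A non-zero exit code on any other service is worth investigating with `docker compose logs <service>`.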
## Access the dashboards
Once all services are healthy, open the following dashboards in your browser. The exact ports depend on the values you set in your `.env` file.

| Service | Default variable | URL pattern |
|---|---|---|
| FleetDM | `FLEET_SERVER_PORT` | `http://localhost:$FLEET_SERVER_PORT` |
| Flowise | `FLOWISE_PORT` | `http://localhost:$FLOWISE_PORT` |
| n8n | `N8N_PORT` | `http://localhost:$N8N_PORT` |
| Ollama API | `OLLAMA_PORT` | `http://localhost:$OLLAMA_PORT` |

Log in to FleetDM to start enrolling endpoints, then open n8n to configure your first audit workflow and Flowise to build AI agent flows against your fleet data.
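As a quick reachability check, the ports can be read from the same `.env` file and probed with `curl` (variable names taken from the table above; this sketch assumes the dashboards answer plain HTTP on localhost):

```shell
# Load the .env values into the shell, then probe each dashboard port
set -a
. src/ai-sentinel/.env
set +a

for port in "$FLEET_SERVER_PORT" "$FLOWISE_PORT" "$N8N_PORT" "$OLLAMA_PORT"; do
  curl -fsS -o /dev/null "http://localhost:${port}" && echo "port ${port}: up"
done
```

If FleetDM is served over TLS in your configuration, substitute `https://` and add `-k` for a self-signed certificate.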
## Next steps
- **Architecture**: Understand the service dependencies and data flows in the stack.
- **Environment variables**: Full reference for all configuration variables across environments.
- **Fleet management**: Enroll endpoints and configure osquery policies in FleetDM.
- **AI analysis**: Build AI agent flows in Flowise to analyze your fleet data.