This guide walks through deploying every component of the BLE People Counter system to a production environment. The three components — the Raspberry Pi scanner agent, the cloud backend, and the frontend dashboard — can be deployed independently, but they must all be configured to talk to each other.
## Raspberry Pi scanner

### Installation
Run `install.sh` from the `raspberry-pi/` directory on each Pi. The script installs system dependencies, creates a Python virtual environment, grants the required Linux capabilities to the Python binary, and writes a systemd service unit. It requires `sudo` at runtime. After installation, edit the `.env` file with your production settings before starting the service.
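A typical first-boot sequence might look like this (exact flags and prompts depend on the script; the unit name follows the installation notes above):

```shell
cd raspberry-pi
sudo ./install.sh                     # system deps, venv, capabilities, systemd unit
cp .env.example .env                  # then edit with production values
sudo systemctl start ble-scanner.service
```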
### Configuration

Copy `.env.example` to `.env` and fill in at minimum the variables below. The settings are defined in `src/config.py`:
| Variable | Default | Description |
|---|---|---|
| `DEVICE_ID` | (required) | Unique identifier for this Pi (min 3 chars) |
| `COMMUNICATION_MODE` | `mqtt` | `http` or `mqtt` |
| `HTTP_BASE_URL` | `http://localhost:8000` | Backend URL |
| `HTTP_API_KEY` | — | API key for backend authentication |
| `NEAR_THRESHOLD` | `-60` | RSSI boundary for NEAR zone (dBm) |
| `MEDIUM_THRESHOLD` | `-75` | RSSI boundary for MEDIUM zone (dBm) |
| `SCAN_DURATION` | `10` | Seconds per scan (1–60) |
| `SCAN_INTERVAL` | `30` | Seconds between scans (min 5) |
| `LOG_LEVEL` | `INFO` | `DEBUG`, `INFO`, `WARNING`, or `ERROR` |
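A minimal production `.env` might look like this (all values are illustrative):

```ini
# raspberry-pi/.env
DEVICE_ID=pi-entrance-01
COMMUNICATION_MODE=http
HTTP_BASE_URL=https://your-backend.fly.dev
HTTP_API_KEY=change-me
NEAR_THRESHOLD=-60
MEDIUM_THRESHOLD=-75
SCAN_DURATION=10
SCAN_INTERVAL=30
LOG_LEVEL=INFO
```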
### systemd auto-start

`install.sh` writes the service unit to `/etc/systemd/system/ble-scanner.service`. Enable it to start on boot:
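With the unit installed, the standard systemd workflow applies (using the unit name written by the script):

```shell
sudo systemctl daemon-reload
sudo systemctl enable --now ble-scanner.service
sudo systemctl status ble-scanner.service   # verify it is active (running)
journalctl -u ble-scanner.service -f        # follow live output
```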
### Log rotation

The agent writes logs to `./logs/iot-agent.log` with automatic rotation. The `RotatingFileHandler` keeps up to 3 files of 5 MB each (15 MB total on disk):
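A minimal sketch of a rotation setup matching those numbers; the handler wiring, function name, and log format here are assumptions, not the agent's exact code:

```python
import logging
from logging.handlers import RotatingFileHandler

def build_logger(path: str = "./logs/iot-agent.log") -> logging.Logger:
    """Logger with rotation: 5 MB per file, 3 files total (15 MB cap)."""
    handler = RotatingFileHandler(
        path,
        maxBytes=5 * 1024 * 1024,  # rotate once the active file reaches 5 MB
        backupCount=2,             # active file + 2 backups = 3 files on disk
    )
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
    logger = logging.getLogger("iot-agent")
    logger.setLevel(logging.INFO)
    logger.addHandler(handler)
    logger.propagate = False
    return logger
```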
### OTA updates

The `ota/ota_update.py` module provides over-the-air updates via Git. When enabled, it checks `origin/main` on the configured interval and runs `git pull` when a new commit is available. With `auto_restart=True`, the updater calls `systemctl restart iot-agent` after a successful pull. The current deployed commit hash is tracked in `ota/version.json`.
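The update check itself can be sketched with plain `git` subprocess calls. The function names and structure below are illustrative, not the module's actual API; see `ota/ota_update.py` for the real implementation:

```python
import subprocess

def check_for_update(repo_dir: str) -> bool:
    """Return True when origin/main has a commit the local checkout lacks."""
    subprocess.run(["git", "fetch", "origin", "main"],
                   cwd=repo_dir, check=True, capture_output=True)
    local = subprocess.check_output(
        ["git", "rev-parse", "HEAD"], cwd=repo_dir, text=True).strip()
    remote = subprocess.check_output(
        ["git", "rev-parse", "origin/main"], cwd=repo_dir, text=True).strip()
    return local != remote

def apply_update(repo_dir: str) -> None:
    """Fast-forward to the new commit; the real module can also restart the service."""
    subprocess.run(["git", "pull", "--ff-only", "origin", "main"],
                   cwd=repo_dir, check=True, capture_output=True)
```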
The OTA updater requires that the Pi has network access to the GitHub repository and that `git` is installed (handled by `install.sh`).

## Backend
### Docker Compose (self-hosted)

The `docker-compose.yml` in `backend-cloud/` defines a PostgreSQL 14 database and the FastAPI application:
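The compose file is roughly of this shape (an illustrative sketch, not verbatim; consult `backend-cloud/docker-compose.yml` for the real service names and options):

```yaml
services:
  db:
    image: postgres:14
    environment:
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: tfg_detections
    volumes:
      - pgdata:/var/lib/postgresql/data
  api:
    build: .
    ports:
      - "8000:8000"
    environment:
      DATABASE_URL: postgresql+asyncpg://postgres:postgres@db:5432/tfg_detections
    depends_on:
      - db
volumes:
  pgdata:
```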
For production, override these settings in a `docker-compose.prod.yml` or use an `.env` file:
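An override file might pin production values like this (a sketch; the variable names follow the settings table below, and the `api` service name is an assumption):

```yaml
# docker-compose.prod.yml (illustrative)
services:
  api:
    environment:
      ENVIRONMENT: production
      DEBUG: "false"
      SECRET_KEY: ${SECRET_KEY}   # supplied via .env or the shell
      API_KEY: ${API_KEY}
```

Compose merges the two files when both are passed: `docker compose -f docker-compose.yml -f docker-compose.prod.yml up -d --build`.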
### Environment variables

Key backend settings from `backend-cloud/src/config.py`:
| Variable | Default | Description |
|---|---|---|
| `SECRET_KEY` | `dev-secret-key-change-in-production` | Change this in production |
| `DATABASE_URL` | `postgresql+asyncpg://postgres:postgres@localhost:5432/tfg_detections` | PostgreSQL connection string |
| `ENVIRONMENT` | `development` | Set to `production` in prod |
| `DEBUG` | `true` | Set to `false` in prod |
| `API_KEY` | — | Shared secret used by Pi agents |
| `DEVICES_PER_PERSON` | `1.5` | Divisor for people estimate |
| `NEAR_THRESHOLD` | `-60` | RSSI boundary for NEAR zone |
| `MEDIUM_THRESHOLD` | `-75` | RSSI boundary for MEDIUM zone |
| `CORS_ORIGINS` | localhost variants + `*.vercel.app` | Allowed frontend origins |
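A production environment file for the backend might look like this (illustrative values; generate your own secrets):

```ini
# backend-cloud/.env
SECRET_KEY=<long-random-string>
ENVIRONMENT=production
DEBUG=false
API_KEY=<same-value-as-HTTP_API_KEY-on-the-pis>
DATABASE_URL=postgresql+asyncpg://postgres:postgres@db:5432/tfg_detections
```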
### Fly.io deployment

The project includes a `fly.toml` targeting the `cdg` (Paris) region:
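With the `fly.toml` in place, deployment follows the usual flyctl flow. The secret names below mirror the settings table above; the `openssl` invocation is just one way to generate a random key:

```shell
fly auth login
fly secrets set SECRET_KEY="$(openssl rand -hex 32)" API_KEY="<key-shared-with-the-pis>"
fly deploy
```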
### Health check

The backend exposes a `/health` endpoint. Use it to verify the deployment:
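For example (replace the hostname with your deployment's URL; a healthy deployment should answer with HTTP 200):

```shell
curl -fsS https://<your-app>.fly.dev/health
```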
## Frontend

### Vercel deployment
1. **Connect the repository to Vercel.** Import the repository in the Vercel dashboard or run `vercel` from the `frontend/` directory.
2. **Set the backend URL environment variable.** Add `VITE_API_URL` in the Vercel project settings (Environment Variables). This variable must be set for both Preview and Production environments.

### CORS configuration
The backend’s allowed origins list in `backend-cloud/src/config.py` includes `https://*.vercel.app` by default. If the dashboard is served from a custom domain, add that origin to the list (or set the `CORS_ORIGINS` environment variable, if that is wired up) and redeploy the backend.
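A sketch of what the allowed-origins setting might look like; the exact list in `backend-cloud/src/config.py` may differ:

```python
# Assumed shape of the setting; check backend-cloud/src/config.py
# for the authoritative list.
CORS_ORIGINS = [
    "http://localhost:5173",
    "http://localhost:3000",
    "https://*.vercel.app",
    # Add your custom dashboard domain here, then redeploy:
    # "https://dashboard.example.com",
]
```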
## Multi-Pi setup

To monitor multiple areas simultaneously, deploy one Pi per zone and give each a unique `DEVICE_ID`. All Pis point to the same backend. Every report carries the `device_id` field, so the dashboard can display per-zone counts by Pi. There is no additional backend configuration required for multi-Pi operation — the backend handles data from any number of devices as long as they share the same `API_KEY`.
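For example, two zones might use `.env` files like these (values illustrative; only `DEVICE_ID` differs, while the backend URL and API key are shared):

```ini
# Zone A Pi: raspberry-pi/.env
DEVICE_ID=pi-zone-a
HTTP_BASE_URL=https://your-backend.fly.dev
HTTP_API_KEY=shared-key

# Zone B Pi: raspberry-pi/.env
DEVICE_ID=pi-zone-b
HTTP_BASE_URL=https://your-backend.fly.dev
HTTP_API_KEY=shared-key
```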