## Prerequisites
Before you start, make sure you have:

- macOS with Xcode and iOS Simulator installed
- Node.js 18+ (run `node --version` to check)
- Facebook idb (recommended — enables cursor-free touch injection)
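A quick way to check the prerequisites from a terminal (a sketch; the exact versions you see will differ):

```shell
node --version      # expect v18 or later
xcodebuild -version # confirms Xcode is installed
idb list-targets    # optional: verifies idb is on PATH and can see simulators
```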
## Install idb
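The usual idb setup, per Facebook's idb project, is the companion daemon via Homebrew plus the Python client via pip; a sketch:

```shell
# idb-companion talks to the simulator; fb-idb is the CLI client
brew tap facebook/fb
brew install idb-companion
pip3 install fb-idb
```

The pip client lands in your Python user bin directory, which is why the PATH note in the configuration section below matters.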
## Build from Source
### Install dependencies
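Assuming the standard npm workflow, install from the repository root:

```shell
cd Preflight   # your clone of the repository
npm install
```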
If idb is not found during install, the postinstall script prints a reminder with the install commands. This is informational — the install still succeeds.
### Build the project
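Assuming the project wires its build to the conventional npm script (the script name is an assumption):

```shell
npm run build
```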
The build produces the server entry point at `dist/index.js` and builds the native Swift binary at `dist/mouse-events` (used as the CGEvent fallback for touch injection).

## Connect to an MCP Client
Add the server to your MCP client configuration, saved as `.mcp.json` in your project root. Replace `/path/to/Preflight` with the absolute path to where you cloned the repository. If you installed idb via pip, also add your Python bin directory to PATH — for example, `~/Library/Python/3.x/bin`. After saving the config, your MCP client will automatically start the Preflight server when needed.
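A minimal `.mcp.json` in the standard MCP stdio-server shape might look like the following (the server name `preflight` and the `node` launch command are assumptions; keep the `/path/to/Preflight` placeholder until you substitute your own path):

```json
{
  "mcpServers": {
    "preflight": {
      "command": "node",
      "args": ["/path/to/Preflight/dist/index.js"]
    }
  }
}
```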
## Your First Automation
With Preflight connected, ask your AI agent to run the following sequence. This boots a simulator and takes a screenshot to verify everything is working.

### Boot a simulator
Ask your agent:

> Boot the iPhone 16 Pro simulator.

Preflight calls `simulator_boot` with the device name. It polls `simulator_list_devices` until the device state is `Booted`.

### Take a screenshot
Ask your agent:

> Take a screenshot of the simulator.

Preflight calls `simulator_screenshot` and returns a JPEG image directly in chat — no files saved to disk.

### Inspect the screen without vision
For a more AI-efficient approach, use the accessibility snapshot instead of a screenshot:

> Take a snapshot of the current screen and describe what you see.

Preflight calls `simulator_snapshot`, which returns a structured accessibility tree — element roles, labels, values, and coordinates — without needing a vision model.

## What’s Next
### Installation

Detailed installation guide including Xcode setup, idb PATH configuration, and build output details.

### Tools Reference

All 57 tools across 10 categories — observation, interaction, device management, debugging, and more.
