Preflight MCP is an MCP server that gives AI agents full control over iOS Simulators — tap, swipe, screenshot, inspect accessibility trees, manage devices, and more. This guide gets you running your first automation.

Prerequisites

Before you start, make sure you have:
  • macOS with Xcode and iOS Simulator installed
  • Node.js 18+ (node --version to check)
  • Facebook idb (recommended — enables cursor-free touch injection)
idb is optional but strongly recommended. Without it, Preflight falls back to CGEvent mouse injection, which briefly moves your Mac cursor during touch events.

Install idb

1. Add the Facebook Homebrew tap

brew tap facebook/fb
2. Install idb-companion

brew install idb-companion
3. Install the idb Python client

pip3 install fb-idb
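
After the three steps above, you can confirm that both halves of idb are on your PATH. This is a generic shell check, not a Preflight command; the binary names assumed here are `idb` (from the pip package) and `idb_companion` (from Homebrew):

```shell
# Check that both idb pieces are installed and reachable on PATH.
for tool in idb idb_companion; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found at $(command -v "$tool")"
  else
    echo "$tool: missing"
  fi
done
```

If `idb` is reported missing after `pip3 install fb-idb`, your Python user bin directory (often ~/Library/Python/3.x/bin) is probably not on PATH.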

Build from Source

1. Clone the repository

git clone https://github.com/EthanAckerman-git/Preflight.git
cd Preflight
2. Install dependencies

npm install
If idb is not found during install, the postinstall script prints a reminder with the install commands. This is informational — the install still succeeds.
3. Build the project

npm run build
This compiles TypeScript to dist/index.js and builds the native Swift binary at dist/mouse-events (used as the CGEvent fallback for touch injection).
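
A quick way to confirm both build artifacts exist, run from the repository root (plain shell, nothing Preflight-specific):

```shell
# Verify the two build outputs described above are present.
for artifact in dist/index.js dist/mouse-events; do
  if [ -e "$artifact" ]; then
    echo "$artifact: ok"
  else
    echo "$artifact: missing (re-run npm run build)"
  fi
done
```
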
4. Verify the build

node dist/index.js
The server starts and listens on stdio. You should see no errors. Press Ctrl+C to stop — your MCP client will manage this process in normal use.
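
To go one step further than "no errors", you can hand-feed the server the first message an MCP client would send: a JSON-RPC 2.0 initialize request over stdio. The payload below follows the standard MCP handshake; the protocolVersion string and client name are generic assumptions, not Preflight-specific values:

```shell
# A minimal MCP initialize request (newline-delimited JSON-RPC over stdio).
# To smoke-test the server, pipe it in:  printf '%s\n' "$INIT" | node dist/index.js
INIT='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.1"}}}'
# Validate the payload is well-formed JSON before sending it:
printf '%s\n' "$INIT" | python3 -m json.tool
```

A healthy server replies on stdout with its own JSON-RPC result message; your MCP client performs this same handshake automatically.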

Connect to an MCP Client

For Claude Code, register the server from the command line:

claude mcp add preflight node /path/to/Preflight/dist/index.js
Replace /path/to/Preflight with the absolute path to where you cloned the repository. If you installed idb via pip, also add your Python bin directory to PATH — for example, ~/Library/Python/3.x/bin.
Alternatively, for Claude Code you can place a .mcp.json file in your project root. After saving the config, your MCP client starts the Preflight server automatically whenever it is needed.
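
For the .mcp.json route, a minimal config might look like the following. The server name preflight is arbitrary, and the path is the same placeholder as above, so replace it with your actual clone location:

```json
{
  "mcpServers": {
    "preflight": {
      "command": "node",
      "args": ["/path/to/Preflight/dist/index.js"]
    }
  }
}
```
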

Your First Automation

With Preflight connected, ask your AI agent to run the following sequence. This boots a simulator and takes a screenshot to verify everything is working.
1. Boot a simulator

Ask your agent:
Boot the iPhone 16 Pro simulator.
Preflight calls simulator_boot with the device name. It polls simulator_list_devices until the device state is Booted.
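
Behind the scenes this maps onto Apple's simulator tooling. The sketch below shows a hypothetical manual equivalent using standard xcrun simctl commands; this is an assumption for illustration, not Preflight's actual implementation (it may drive the simulator through idb instead):

```shell
# Roughly what booting and checking a device look like with plain simctl
# (illustrative assumption, not Preflight internals).
DEVICE="iPhone 16 Pro"
if command -v xcrun >/dev/null 2>&1; then
  xcrun simctl boot "$DEVICE" 2>/dev/null || true   # no-op if already booted
  xcrun simctl list devices | grep "$DEVICE" \
    || echo "$DEVICE not found (check xcrun simctl list devices)"
else
  echo "xcrun not available: macOS with Xcode is required"
fi
```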
2. Take a screenshot

Ask your agent:
Take a screenshot of the simulator.
Preflight calls simulator_screenshot and returns a JPEG image directly in chat — no files saved to disk.
3. Inspect the screen without vision

For a more AI-efficient approach, use the accessibility snapshot instead of a screenshot:
Take a snapshot of the current screen and describe what you see.
Preflight calls simulator_snapshot, which returns a structured accessibility tree — element roles, labels, values, and coordinates — without needing a vision model.

What’s Next

Installation

Detailed installation guide including Xcode setup, idb PATH configuration, and build output details.

Tools Reference

All 57 tools across 10 categories — observation, interaction, device management, debugging, and more.
