

Architecture Editor

The Architecture Editor is where every ProtoPulse project begins. It gives you an interactive block diagram canvas — powered by React Flow — where you lay out the major components of your system and define how they communicate before touching a single schematic wire.

The block diagram canvas

The canvas is a freeform, pannable, zoomable surface. Each block represents a high-level system component (an MCU, a sensor module, a power regulator). Edges between blocks carry a signal type that defines what protocol or power domain connects them. This is intentionally high-level. The Architecture view answers “what components are in my design and how do they talk?” — the Schematic view handles the detailed wiring.
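The node/edge model described above can be sketched in TypeScript. ProtoPulse's actual data schema is not documented here, so the field names and category strings below are illustrative assumptions, not the real API:

```typescript
// Hypothetical shapes for architecture blocks and typed edges.
// Field names are illustrative, not ProtoPulse's real schema.
type SignalType = "SPI" | "I2C" | "UART" | "USB" | "Power" | "GPIO";

type Category =
  | "MCU" | "Sensor" | "Power" | "Communication"
  | "Connector" | "Memory" | "Actuator";

interface ArchitectureNode {
  id: string;
  label: string;     // e.g. "ESP32-S3"
  category: Category;
}

interface ArchitectureEdge {
  id: string;
  source: string;      // id of the source node
  target: string;      // id of the destination node
  signal: SignalType;  // drives the edge's color and label
}

// A two-block system: an MCU reading an IMU over SPI.
const mcu: ArchitectureNode = { id: "n1", label: "ESP32-S3", category: "MCU" };
const imu: ArchitectureNode = { id: "n2", label: "BMI270 IMU", category: "Sensor" };
const link: ArchitectureEdge = { id: "e1", source: "n1", target: "n2", signal: "SPI" };
```

Note that the edge carries only a signal type, not individual wires — pin-level detail belongs to the Schematic view.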

Component categories

Open the Asset Manager panel (left sidebar) to browse the built-in component library. Components are organized into seven categories:
  • MCU: Arduino Mega, ESP32, STM32, ATmega328
  • Sensor: Temperature, IMU, distance, light sensors
  • Power: LDO regulators, DC-DC converters, battery management
  • Communication: Wi-Fi, Bluetooth, CAN, RS-485 modules
  • Connector: USB-C, JST, barrel jack, pin headers
  • Memory: Flash, EEPROM, SD card, FRAM
  • Actuator: Motor drivers, servos, relays, solenoids
Drag any component from the panel onto the canvas to place it.

Signal edge types

When you connect two components by drawing a line from one port to another, ProtoPulse asks you to choose a signal type. The edge color and label reflect the protocol:
  • SPI (blue): High-speed sensor readout, SD cards, flash memory
  • I2C (purple): Low-pin-count sensors, EEPROMs, displays
  • UART (orange): Debug output, GPS modules, serial peripherals
  • USB (green): USB-C connectivity, DFU programming
  • Power (red): Supply rails — 3.3 V, 5 V, VBAT
  • GPIO (gray): Digital control lines, interrupts, LEDs
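The color scheme above maps naturally to a lookup table. This is a minimal sketch of such a mapping — the names `edgeColors` and `colorFor` are assumptions for illustration, not ProtoPulse internals:

```typescript
// Hypothetical signal-to-color mapping, mirroring the list above.
const edgeColors: Record<string, string> = {
  SPI: "blue",
  I2C: "purple",
  UART: "orange",
  USB: "green",
  Power: "red",
  GPIO: "gray",
};

// Resolve a color for an edge; unknown signal types fall back to gray.
function colorFor(signal: string): string {
  return edgeColors[signal] ?? "gray";
}
```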

Adding and connecting components

1. Open the Asset Manager
   Click the Assets tab in the left sidebar to expand the component library browser.

2. Drag a component onto the canvas
   Find the component you want and drag it onto the canvas. It appears as a labeled block with connection ports on its edges.

3. Draw a connection
   Hover over a port handle on the source component until it highlights, then drag to a port on the destination component. Release to create the edge. A dialog prompts you to choose the signal type.

4. Label and annotate
   Double-click any node or edge to rename it or add a description. Use descriptive labels — the AI assistant uses these names when generating schematics.
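Step 3 can be sketched as a connect handler in React Flow style. ProtoPulse's actual implementation isn't shown in these docs; `promptForSignal` stands in for the signal-type dialog, and the types are simplified assumptions:

```typescript
// Sketch of the connect flow: a drag between two ports yields a Connection,
// and the signal-type dialog supplies the edge's protocol.
interface Connection {
  source: string; // id of the node the drag started on
  target: string; // id of the node the drag ended on
}

interface TypedEdge extends Connection {
  id: string;
  signal: string; // SPI, I2C, UART, USB, Power, or GPIO
}

// `promptForSignal` is a hypothetical stand-in for ProtoPulse's dialog.
function createEdge(conn: Connection, promptForSignal: () => string): TypedEdge {
  const signal = promptForSignal();
  return { id: `${conn.source}->${conn.target}`, ...conn, signal };
}
```

In a real React Flow app the equivalent logic would live in an `onConnect` callback that appends the new edge to state.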

Context menu actions

Right-click any node or edge to open the context menu:
  • Edit properties — change name, description, component category
  • Duplicate — copy the node with its configuration
  • Delete — remove the node and all its edges
  • Set pin map — assign schematic pin assignments to architecture ports (useful before AI schematic generation)
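A pin map ties an architecture-level port to a concrete pin on the schematic symbol. The shape below is a hypothetical sketch of what such an assignment might look like; ProtoPulse's real pin-map format is not documented here:

```typescript
// Hypothetical pin-map entry: one architecture port mapped to one schematic pin.
interface PinMapEntry {
  port: string; // port on the architecture block, e.g. "I2C.SDA"
  pin: string;  // pin on the schematic component, e.g. "GPIO21"
}

// Example: routing an ESP32's I2C port to its conventional default pins.
const esp32PinMap: PinMapEntry[] = [
  { port: "I2C.SDA", pin: "GPIO21" },
  { port: "I2C.SCL", pin: "GPIO22" },
];
```

Setting these assignments before AI schematic generation gives the generator concrete pins to wire instead of leaving it to guess.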

Working with the AI assistant

The AI Assistant has full read/write access to the architecture diagram. You can describe a system in plain English and watch it build:
“Add an ESP32 as the main controller, a BME280 temperature sensor connected via I2C, a 3.3 V LDO regulator powered from USB-C, and a status LED on GPIO.”
The AI will place each node and draw the typed edges automatically. See AI Overview for more.
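A prompt like the one above might translate into a node/edge set along these lines. This is a hypothetical illustration of the resulting diagram data, not ProtoPulse's actual AI output format:

```typescript
// Hypothetical diagram data the AI might produce for the example prompt.
const generated = {
  nodes: [
    { id: "esp32", label: "ESP32", category: "MCU" },
    { id: "bme280", label: "BME280", category: "Sensor" },
    { id: "ldo", label: "3.3 V LDO", category: "Power" },
    { id: "usbc", label: "USB-C", category: "Connector" },
    { id: "led", label: "Status LED", category: "Actuator" },
  ],
  edges: [
    { source: "esp32", target: "bme280", signal: "I2C" },
    { source: "usbc", target: "ldo", signal: "Power" },
    { source: "ldo", target: "esp32", signal: "Power" },
    { source: "esp32", target: "led", signal: "GPIO" },
  ],
};
```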

Transitioning to the schematic

Once your block diagram captures the system intent, switch to Schematic view to start detailed circuit capture. The architecture nodes serve as a reference — the AI can use your block diagram to generate a starting schematic, preserving the component choices and signal types you defined.

Tip: Label your architecture nodes with specific part numbers (e.g., “ESP32-S3” instead of just “MCU”) before asking the AI to generate the schematic. It uses these names to select appropriate library components.

Next steps

  • Schematic Capture: Move from block diagram to detailed circuit schematic.
  • AI Assistant: Let the AI generate architecture diagrams from a text description.
