Documentation Index

Fetch the complete documentation index at: https://mintlify.com/alphaleaks60-maker/docs2/llms.txt

Use this file to discover all available pages before exploring further.

Alpha Leak is a production-grade, multi-phase trading intelligence system built for Pump.fun on Solana. It ingests every on-chain event in real time, runs each through layered analysis — wallet scoring, adversarial detection, ML inference, market regime classification — and executes trades via Jito bundles when high-conviction signals emerge. The system is not a simple copy-trading bot or a rule-based screener. It is a continuously learning intelligence pipeline that builds a persistent, ever-growing understanding of every wallet, token, and creator it observes, and uses that knowledge to make probabilistic bets on which tokens will hit specific price targets within specific time windows.

What it does

Real-time ingestion

Subscribes directly to the Pump.fun program via WebSocket, processing every trade, creation, and graduation event as it lands on-chain. No polling. No delays.
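A subscription like this is typically established with a `logsSubscribe` JSON-RPC request over the Solana RPC WebSocket. The sketch below builds that payload for the commonly cited Pump.fun mainnet program ID (verify the address before relying on it); the actual connection and event decoding used by the pipeline are not shown here.

```python
import json

# Commonly cited Pump.fun mainnet program ID (assumption — verify before use).
PUMP_FUN_PROGRAM = "6EF8rrecthR5Dkzon8Nwu78hRvfCKubJ14M5uBEwF6P"

def logs_subscribe_request(program_id: str, request_id: int = 1) -> str:
    """Build the JSON-RPC logsSubscribe payload sent over the RPC WebSocket."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "logsSubscribe",
        "params": [
            {"mentions": [program_id]},
            # "processed" is the lowest-latency commitment level.
            {"commitment": "processed"},
        ],
    })
```

Sending this payload on a WebSocket connection to your RPC endpoint yields a notification for every transaction mentioning the program, which is what makes poll-free ingestion possible.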

Wallet intelligence

Builds a scored, feature-rich profile for every wallet it observes — graduation rate, win rate, entry timing, hold behaviour, return consistency — updated on a rolling 30-minute cycle across up to 5,000 wallets per pass.
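A wallet profile of this shape can be sketched as a small record plus a weighted aggregate. The field set mirrors the metrics named above; the weights are purely illustrative, since the system's actual scoring formula is not documented here.

```python
from dataclasses import dataclass

@dataclass
class WalletProfile:
    graduation_rate: float    # fraction of bought tokens that graduated
    win_rate: float           # fraction of closed positions in profit
    entry_timing: float       # 0..1, earlier entries score higher
    hold_consistency: float   # 0..1, variance-penalised hold behaviour
    return_consistency: float # 0..1, stability of realised returns

def wallet_score(p: WalletProfile) -> float:
    """Illustrative weighted score in [0, 1]; real weights are not public."""
    weights = (0.30, 0.25, 0.20, 0.10, 0.15)
    feats = (p.graduation_rate, p.win_rate, p.entry_timing,
             p.hold_consistency, p.return_consistency)
    return sum(w * f for w, f in zip(weights, feats))
```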

Adversarial detection

Detects bots, coordinated bundles, copy-traders, serial ruggers, wash traders, and exit-liquidity setups in real time. Adversarial signals feed directly into ML features and live position management.
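One of these detectors, bundle detection, can be illustrated with a sliding-window check: flag a token when several distinct wallets buy inside the same 5-second window (the threshold of four wallets below is an assumption, not the system's actual parameter).

```python
def detect_bundle(trades, window_s=5.0, min_wallets=4):
    """Return True if >= min_wallets distinct wallets bought the same token
    within any window_s-second span. trades: iterable of (unix_ts, wallet)."""
    trades = sorted(trades)
    lo = 0
    for hi in range(len(trades)):
        # Advance the left edge until the window spans at most window_s seconds.
        while trades[hi][0] - trades[lo][0] > window_s:
            lo += 1
        wallets = {w for _, w in trades[lo:hi + 1]}
        if len(wallets) >= min_wallets:
            return True
    return False
```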

ONNX ML inference

Runs LightGBM models compiled to ONNX for sub-millisecond inference. Models are calibrated with Platt scaling and scored every 5 seconds against a 68-feature vector. Both standard and genesis (first 60 seconds of a token’s life) models run concurrently.
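The calibration step can be sketched independently of the model: Platt scaling maps a raw model margin to a probability with a fitted sigmoid. The function below is the standard form; the `onnxruntime` call shown in comments indicates roughly how the raw score would be produced, with the input name and model path as assumptions.

```python
import math

N_FEATURES = 68  # standard model; genesis models use 75

def platt_calibrate(raw_score: float, a: float, b: float) -> float:
    """Platt scaling: p = 1 / (1 + exp(a * raw + b)), with a, b fitted
    on a held-out set. Typically a < 0 so higher raw scores map higher."""
    return 1.0 / (1.0 + math.exp(a * raw_score + b))

# Hedged sketch of the inference side (model path and input name assumed):
# import onnxruntime as ort
# sess = ort.InferenceSession("standard_model.onnx")
# raw = sess.run(None, {"input": [[0.0] * N_FEATURES]})[0][0]
# prob = platt_calibrate(raw, a=-1.7, b=0.0)
```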

Genesis scoring

A dedicated subsystem observes every newly created token for 60 seconds, accumulates early buyer behaviour and price dynamics in memory, then runs a separate family of genesis models to identify tokens likely to 3x, 5x, or 10x shortly after launch.
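The in-memory accumulation described above can be sketched as a small per-token window object: buys inside the first 60 seconds are recorded, later ones ignored, and a feature snapshot is taken when the window closes. Field names and the snapshot contents are illustrative.

```python
from dataclasses import dataclass, field

GENESIS_WINDOW_S = 60.0

@dataclass
class GenesisWindow:
    """Accumulates the first 60 seconds of a token's life in memory."""
    created_at: float
    buys: list = field(default_factory=list)  # (ts, wallet, sol_amount)

    def record_buy(self, ts: float, wallet: str, sol: float) -> None:
        # Only events inside the genesis window contribute to features.
        if ts - self.created_at <= GENESIS_WINDOW_S:
            self.buys.append((ts, wallet, sol))

    def snapshot(self) -> dict:
        """Illustrative early-buyer features fed to the genesis models."""
        wallets = {w for _, w, _ in self.buys}
        total = sum(s for _, _, s in self.buys)
        return {
            "unique_buyers": len(wallets),
            "total_sol": total,
            "avg_buy_sol": total / len(self.buys) if self.buys else 0.0,
        }
```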

Live execution

Executes buys and sells on the Pump.fun bonding curve via Jito bundles. Position sizing, strategy selection, take-profit, stop-loss, and forced exits on anti-signals are all handled automatically, with full trade history persisted to PostgreSQL.
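The exit side of position management reduces to a decision function like the one below. The take-profit and stop-loss multipliers are placeholders, not the system's actual strategy parameters, and the anti-signal override reflects the forced-exit behaviour described above.

```python
def exit_decision(entry_price: float, current_price: float,
                  take_profit: float = 2.0, stop_loss: float = 0.6,
                  anti_signal: bool = False) -> str:
    """Decide whether a live position should close.
    Multipliers are illustrative defaults, not production values."""
    if anti_signal:
        return "FORCED_EXIT"   # anti-signals override price-based exits
    ratio = current_price / entry_price
    if ratio >= take_profit:
        return "TAKE_PROFIT"
    if ratio <= stop_loss:
        return "STOP_LOSS"
    return "HOLD"
```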

System scale

The table below reflects the operational depth of the system at steady state.
| Dimension | Detail |
| --- | --- |
| Pipeline services | 30+ concurrent background services |
| ML feature vector | 68 features (standard), 75 features (genesis) |
| Wallet scoring cadence | Up to 5,000 wallets every 30 minutes |
| Bundle detection window | 5-second clustering, 10-minute interval |
| Anti-signal scan | Every 30 seconds, across all active tokens |
| Signal crowding check | Every 60 seconds, cached in Redis |
| Alpha decay profiling | 8 delay buckets per wallet, hourly |
| Market regime | 4 states, reclassified every 10 minutes |
| Copy-trade classification | 3 types (bot, alert, manual), 15-minute cycle |
| Data archive | Automatic GCS archive after 60 days |

How to navigate these docs

Architecture

Start here to understand the full pipeline as a single diagram. Every service is mapped to its phase and its role in the system.

Quickstart

Connect to the live SSE signal feed and REST API. You can receive real-time signals in under five minutes.
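Consuming an SSE feed mostly comes down to parsing `event:`/`data:` lines from a streaming HTTP response. The parser below follows the core of the SSE wire format; the feed URL and event names are not specified here, so hook it up to whatever endpoint the Quickstart gives you.

```python
def parse_sse_events(stream_lines):
    """Minimal SSE parser: yields (event, data) pairs from an iterable of
    text lines, e.g. the body of a streaming HTTP response."""
    event, data = "message", []
    for line in stream_lines:
        line = line.rstrip("\n")
        if line.startswith("event:"):
            event = line[6:].strip()
        elif line.startswith("data:"):
            data.append(line[5:].strip())
        elif line == "":
            # A blank line terminates one event.
            if data:
                yield event, "\n".join(data)
            event, data = "message", []
```

With `requests`, for example, you would feed `resp.iter_lines(decode_unicode=True)` into this generator and dispatch on the event name.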

The Pipeline

Deep-dives into ingestion, Phase 1 scoring, Phase 2 intelligence, and Phase 3 ML processing — one page per phase.

Intelligence

How wallet scoring works, how bundles are detected, how the genesis subsystem operates, and how market regime is classified.

ML System

ONNX model architecture, the full 68-feature vector, Platt calibration, and the training pipeline.

Live Trader

Strategies, position sizing, circuit breakers, and real-time monitoring for the live execution layer.

Each subsystem has its own dedicated page. If you want to understand a specific component — how wallet scoring is computed, how bundles are detected, or how the ML models are built and served — navigate directly to that page under The Pipeline, Intelligence, or ML System.