
The data sync script uploads CoW Protocol order data for the current month to Dune Analytics. This data powers the solver rewards dashboard used to cross-check payout computations each week.

Running the sync

python -m src.data_sync.sync_data --sync-table order_data
This uploads order data for the current month to Dune. The script reads from the configured orderbook databases (BARN_DB_URL, PROD_DB_URL, ANALYTICS_DB_URL) and pushes the result to your Dune account via the Dune API.
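Conceptually the sync boils down to: query the orderbook databases, serialize the month's orders to CSV, and POST that CSV to Dune. A minimal stdlib-only sketch of the upload step, assuming Dune's public CSV-upload endpoint; the table name and CSV columns here are illustrative, not the script's actual schema:

```python
import json
import os
import urllib.request

DUNE_UPLOAD_URL = "https://api.dune.com/api/v1/table/upload/csv"

def build_upload_request(api_key: str, table_name: str, csv_data: str) -> urllib.request.Request:
    """Build (but do not send) the HTTP request that uploads a CSV to Dune."""
    body = json.dumps({"table_name": table_name, "data": csv_data}).encode()
    return urllib.request.Request(
        DUNE_UPLOAD_URL,
        data=body,
        headers={
            "X-Dune-API-Key": api_key,   # from DUNE_API_KEY in .env
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Example: a two-row CSV for an illustrative order_data table.
csv_data = "order_uid,solver\n0xabc,solver-a\n0xdef,solver-b\n"
req = build_upload_request(os.environ.get("DUNE_API_KEY", "test-key"), "order_data", csv_data)
# urllib.request.urlopen(req) would perform the actual upload.
```

The real script delegates this to the Dune API client rather than raw HTTP, but the request shape is the same.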

When to run

Run the data sync once per month, before the weekly payout analysis. The solver rewards dashboard queries data uploaded to Dune, so the payout validation step (cross-checking computed payouts against the dashboard) depends on a recent sync.
The data sync is separate from the payout computation. The payout script (src.fetch.transfer_file) queries both the orderbook database and Dune directly. The data sync script specifically refreshes the Dune-hosted copy of order data that the dashboard visualizes.

Prerequisites

The same .env configuration used for payout generation applies here. Ensure the following are set:
DUNE_API_KEY=          # Required to upload data to Dune
ANALYTICS_DB_URL=      # Source database for order data
BARN_DB_URL=           # Orderbook (barn) environment
PROD_DB_URL=           # Orderbook (production) environment
NETWORK=mainnet        # Target network
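A quick way to catch a missing variable before a sync run is to validate the environment up front. A small sketch whose required-key list mirrors the table above (the check itself is mine, not part of the script):

```python
REQUIRED = ["DUNE_API_KEY", "ANALYTICS_DB_URL", "BARN_DB_URL", "PROD_DB_URL", "NETWORK"]

def missing_vars(env: dict) -> list:
    """Return the required keys that are unset or empty."""
    return [k for k in REQUIRED if not env.get(k)]

# Example against a partially populated environment:
example = {"DUNE_API_KEY": "key", "NETWORK": "mainnet"}
print(missing_vars(example))  # ['ANALYTICS_DB_URL', 'BARN_DB_URL', 'PROD_DB_URL']
# In practice you would pass os.environ and abort if the list is non-empty.
```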

Relationship to payout computation

The payout pipeline has two data paths:

Dune Analytics

Used for on-chain data queries: trade events, block ranges, solver activity. The order_data table synced by this script feeds the solver rewards dashboard at dune.com/cowprotocol/cow-solver-rewards.

Orderbook database

Used for off-chain data: order details, settlement data from the CoW Protocol backend. Queried directly by the payout script via ANALYTICS_DB_URL, BARN_DB_URL, and PROD_DB_URL.
The data_sync step keeps the Dune-hosted copy of order data fresh so that the dashboard reflects the latest trades. Without a recent sync, the dashboard totals shown during payout validation may lag behind the orderbook.

Dashboard

After syncing, the data is visualized at:
https://dune.com/cowprotocol/cow-solver-rewards
The payout script generates a pre-filtered URL for each run that includes the accounting period, blockchain, quote reward, and quote cap parameters. This makes it straightforward to verify that the computed transfers match what Dune shows for the same period.
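Building such a pre-filtered URL is just query-string encoding over the dashboard link. A sketch with illustrative parameter names (the actual names emitted by the payout script may differ):

```python
from urllib.parse import urlencode

DASHBOARD = "https://dune.com/cowprotocol/cow-solver-rewards"

def dashboard_url(start: str, end: str, blockchain: str, quote_reward: str, quote_cap: str) -> str:
    """Append dashboard filter parameters; parameter names are illustrative."""
    params = {
        "start_time": start,
        "end_time": end,
        "blockchain": blockchain,
        "quote_reward": quote_reward,
        "quote_cap": quote_cap,
    }
    return f"{DASHBOARD}?{urlencode(params)}"

url = dashboard_url("2024-05-07", "2024-05-14", "ethereum", "6", "0.0006")
```

Opening the generated link shows the dashboard already scoped to the accounting period being validated.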

Docker

You can also run the data sync via Docker:
docker run --pull=always -it --rm \
  --env-file .env \
  -v $PWD:/app/out \
  ghcr.io/cowprotocol/solver-rewards:main \
  src.data_sync.sync_data \
  --sync-table order_data
Run the data sync at the start of each month before the first weekly payout of the month. This ensures the dashboard is fully up to date when you perform cross-checks during payout validation.
