

The solver rewards pipeline requires Python >= 3.10. If your system Python is older, or you want a fully reproducible environment without managing a local virtualenv, use the official Docker image.

Docker image

The image is published to the GitHub Container Registry:
ghcr.io/cowprotocol/solver-rewards:main
Always use --pull=always to ensure you are running the latest build from the main branch.

Setup and run

1. Copy the environment file

The Docker container reads credentials from a .env file passed via --env-file. Start by copying the sample:
cp .env.sample .env
Then fill in your credentials. The minimum required fields are:
# .env
FILE_OUT_PATH=./out
NETWORK=mainnet           # mainnet | gnosis | arbitrum | base | avalanche | polygon | bnb | linea | plasma | ink
NODE_URL=                 # RPC endpoint for the target network
NODE_URL_MAINNET=         # RPC endpoint for mainnet
DUNE_API_KEY=             # Dune Analytics API key
PAYOUTS_SAFE_ADDRESS=0xA03be496e67Ec29bC62F01a428683D7F9c204930
PAYOUTS_SAFE_ADDRESS_MAINNET=0xA03be496e67Ec29bC62F01a428683D7F9c204930
BARN_DB_URL=              # Orderbook (barn) database connection string
PROD_DB_URL=              # Orderbook (production) database connection string
ANALYTICS_DB_URL=         # Analytics database connection string
PROPOSER_PK, SAFE_API_KEY, SLACK_TOKEN, and SLACK_CHANNEL are only required when using --post-tx or Slack notification flags.
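As a sanity check before invoking Docker, the minimum required fields above can be verified with a short script. This is an illustrative sketch only: the `parse_env` and `missing_keys` helpers and the `REQUIRED` list are not part of the pipeline (Docker's `--env-file` does the real parsing), and `REQUIRED` here covers just the network-independent keys listed above.

```python
REQUIRED = ["FILE_OUT_PATH", "NETWORK", "NODE_URL", "DUNE_API_KEY", "PAYOUTS_SAFE_ADDRESS"]

def parse_env(text: str) -> dict[str, str]:
    """Parse KEY=VALUE lines, ignoring blanks and # comments (simplified .env reader)."""
    env = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop inline comments
        if "=" in line:
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env

def missing_keys(env: dict[str, str]) -> list[str]:
    """Return required keys that are absent or left empty."""
    return [k for k in REQUIRED if not env.get(k)]

sample = "FILE_OUT_PATH=./out\nNETWORK=mainnet   # mainnet | gnosis\nNODE_URL=\n"
print(missing_keys(parse_env(sample)))
# NODE_URL is present but empty; DUNE_API_KEY and PAYOUTS_SAFE_ADDRESS are absent.
```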
2. Run the container

Mount your current directory as /app/out so the output CSV files are written to your local filesystem:
docker run --pull=always -it --rm \
  --env-file .env \
  -v $PWD:/app/out \
  ghcr.io/cowprotocol/solver-rewards:main \
  src.fetch.transfer_file \
  --start 'YYYY-MM-DD'
Replace YYYY-MM-DD with the accounting period start date (e.g. 2023-03-14). After roughly 30 seconds, the two transfer CSVs appear in your current directory:
transfers-mainnet-2023-03-14-to-2023-03-21-COW.csv
transfers-mainnet-2023-03-14-to-2023-03-21-NATIVE.csv
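The output filenames follow a predictable pattern. The sketch below reconstructs it for illustration; the helper name is hypothetical, and the one-week period length is inferred from the example dates above (2023-03-14 to 2023-03-21), not from the pipeline's source.

```python
from datetime import date, timedelta

def transfer_filenames(network: str, start: date, period_days: int = 7) -> list[str]:
    """Build the expected CSV names: transfers-<network>-<start>-to-<end>-{COW,NATIVE}.csv"""
    end = start + timedelta(days=period_days)
    stem = f"transfers-{network}-{start.isoformat()}-to-{end.isoformat()}"
    return [f"{stem}-COW.csv", f"{stem}-NATIVE.csv"]

print(transfer_filenames("mainnet", date(2023, 3, 14)))
# ['transfers-mainnet-2023-03-14-to-2023-03-21-COW.csv',
#  'transfers-mainnet-2023-03-14-to-2023-03-21-NATIVE.csv']
```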
3. (Optional) Post the transaction directly to Safe

To auto-propose the multisend instead of generating CSV files, add PROPOSER_PK to your .env and pass --post-tx:
docker run --pull=always -it --rm \
  --env-file .env \
  -v $PWD:/app/out \
  ghcr.io/cowprotocol/solver-rewards:main \
  src.fetch.transfer_file \
  --start 'YYYY-MM-DD' \
  --post-tx

Docker vs local Python

# No Python environment setup required.
# Always runs with Python 3.12 (as defined in the Dockerfile).
docker run --pull=always -it --rm \
  --env-file .env \
  -v $PWD:/app/out \
  ghcr.io/cowprotocol/solver-rewards:main \
  src.fetch.transfer_file \
  --start 'YYYY-MM-DD'

Volume mount

The -v $PWD:/app/out flag maps your current working directory into the container at /app/out, which is the default FILE_OUT_PATH inside the image. This means output CSVs are written directly to wherever you ran the docker run command on the host. If you set a different FILE_OUT_PATH in your .env, adjust the mount target accordingly.
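For example, if your .env points FILE_OUT_PATH at a different directory inside the container, change the mount target to match. The paths below are illustrative, not defaults from the image:

```shell
# Assuming .env contains FILE_OUT_PATH=/data (illustrative)
docker run --pull=always -it --rm \
  --env-file .env \
  -v $PWD/reports:/data \
  ghcr.io/cowprotocol/solver-rewards:main \
  src.fetch.transfer_file \
  --start 'YYYY-MM-DD'
```

With this invocation, the CSVs land in ./reports on the host rather than the current directory.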

Building the test database container

Some tests require a local PostgreSQL instance seeded with fixture data. The repo provides a dedicated Dockerfile.db for this:
# Build the test DB image
docker build -t test_db -f Dockerfile.db .

# Start it as a background container on port 5432
docker run -d --name testDB -p 5432:5432 test_db

# Run the full test suite against it
python -m pytest tests/
The test DB container is only needed for tests that require a live database connection. Unit tests (make test-unit) do not require it.
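When you are done, or after editing the fixture data, the container can be torn down and rebuilt. These commands just reuse the image tag and container name from the step above:

```shell
# Stop and remove the running test DB container
docker rm -f testDB

# Rebuild the image and start a fresh container after fixture changes
docker build -t test_db -f Dockerfile.db .
docker run -d --name testDB -p 5432:5432 test_db
```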
