Overview
Leaderboards track forecaster performance across the platform and within specific tournaments. Metaculus uses sophisticated scoring algorithms to measure forecast accuracy and reward skilled predictors.

Get Global Leaderboard
GET /api/leaderboards/global/
Retrieve the global Metaculus leaderboard showing top forecasters across all questions.
Query Parameters
- Show the leaderboard position for a specific user
- Score type to display: peer, baseline, spot_peer, or spot_baseline
- Number of entries to return
- Pagination offset
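As a sketch, a global leaderboard request could be assembled like this. The query parameter names used here (`user_id`, `score_type`, `limit`, `offset`) are assumptions inferred from the descriptions above, not confirmed names; check a live response or the full API reference before relying on them.

```python
# Sketch: build a request URL for the global leaderboard endpoint.
# The query parameter names (user_id, score_type, limit, offset) are
# assumed for illustration only.
from urllib.parse import urlencode

BASE = "https://www.metaculus.com/api/leaderboards/global/"

def build_global_leaderboard_url(score_type="peer", limit=20, offset=0, user_id=None):
    """Construct the request URL for the global leaderboard endpoint."""
    params = {"score_type": score_type, "limit": limit, "offset": offset}
    if user_id is not None:
        params["user_id"] = user_id  # highlight this user's position
    return f"{BASE}?{urlencode(params)}"

url = build_global_leaderboard_url(score_type="spot_peer", limit=10)
```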
Response
Array of leaderboard entry objects
Leaderboard Entry Object
- Position on the leaderboard (1 = first place)
- Total forecasting score
- Lower bound of the 95% confidence interval for the score
- Upper bound of the 95% confidence interval for the score
- Proportion of scored questions the user forecasted (0-1)
- Number of questions contributing to this score
- Medal earned: gold, silver, bronze, or null
- Prize amount earned (for tournaments)
- Whether this entry is excluded from rankings
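Client code typically rank-orders these entries and filters out excluded ones. A minimal sketch, where the key names (`rank`, `score`, `medal`, `excluded`) are guesses based on the field descriptions rather than confirmed response keys:

```python
# Sketch: sort leaderboard entries by rank and drop excluded entries.
# The key names (rank, score, medal, excluded) are assumptions; verify
# them against an actual API response.
entries = [
    {"rank": 2, "score": 310.5, "medal": "silver", "excluded": False},
    {"rank": 1, "score": 402.1, "medal": "gold", "excluded": False},
    {"rank": 3, "score": 250.0, "medal": None, "excluded": True},
]

visible = sorted(
    (e for e in entries if not e["excluded"]),
    key=lambda e: e["rank"],
)
medalists = [e for e in visible if e["medal"] is not None]
```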
Get Project Leaderboard
GET /api/leaderboards/project/{projectId}/
Retrieve the leaderboard for a specific project or tournament.
Path Parameters
- The project ID
Query Parameters
Same as the global leaderboard, plus:
- Only show the primary leaderboard for this project
Response
Returns the same structure as the global leaderboard, but includes additional fields:
- The project this leaderboard belongs to
- Project name
- Project slug
- Total prize pool for this tournament
- Whether this is the project's primary leaderboard
- For tournaments: the user's tournament-specific score
- For tournaments: the percentage of the prize pool won
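Since the project ID travels in the path rather than the query string, a request builder for this endpoint might look like the following sketch. The `primary=true` flag name is an assumption standing in for the primary-leaderboard parameter described above:

```python
# Sketch: build the project leaderboard URL. The projectId goes in the
# path; the "primary" query flag name is assumed for illustration.
def project_leaderboard_url(project_id: int, primary_only: bool = False) -> str:
    url = f"https://www.metaculus.com/api/leaderboards/project/{project_id}/"
    if primary_only:
        url += "?primary=true"  # assumed name for the primary-leaderboard flag
    return url

url = project_leaderboard_url(3349, primary_only=True)
```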
Get User Medals
GET /api/medals/
Retrieve medal counts for a user.
Query Parameters
- User ID to get medals for
Response
- Number of gold medals
- Number of silver medals
- Number of bronze medals
Get Metaculus Track Record
GET /api/metaculus_track_record/
Retrieve Metaculus’s overall forecasting track record and performance statistics.
Response
- Calibration curve data showing how often predictions match outcomes
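A calibration curve of this kind is built by binning forecast probabilities and comparing each bin's forecasts with the observed resolution rate. A minimal sketch of the idea (not Metaculus's exact binning methodology):

```python
# Sketch: compute a simple calibration curve from (probability, outcome)
# pairs by binning forecasts and measuring how often each bin resolved Yes.
# Illustrative only; Metaculus's actual aggregation may differ.
from collections import defaultdict

def calibration_curve(forecasts, n_bins=10):
    """forecasts: iterable of (p, outcome) with p in [0, 1], outcome 0/1."""
    bins = defaultdict(lambda: [0, 0])  # bin index -> [resolved_yes, total]
    for p, outcome in forecasts:
        b = min(int(p * n_bins), n_bins - 1)
        bins[b][0] += outcome
        bins[b][1] += 1
    return {
        (b + 0.5) / n_bins: yes / total  # bin midpoint -> observed frequency
        for b, (yes, total) in sorted(bins.items())
    }

curve = calibration_curve([(0.9, 1), (0.9, 1), (0.95, 0), (0.1, 0), (0.15, 0)])
```

A well-calibrated forecaster's curve lies close to the diagonal: events forecast at 90% should resolve Yes about 90% of the time.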
Understanding Score Types
Peer Score vs Baseline Score
Metaculus uses multiple scoring methods:
- Peer Score: Measures performance relative to the community aggregate
- Baseline Score: Measures performance relative to a baseline prior
- Spot Score: Evaluated at a specific time (CP reveal time) rather than continuously
Score Calculation
Scores are calculated using:
- Log Score: Rewards accuracy with proper scoring rules
- Coverage: Weights scores by how many questions you forecasted
- Recency: More recent forecasts may have higher weight
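To make the log-score idea concrete, here is a sketch of a baseline-style log score for a binary question, scoring a forecast against a 50% prior. This illustrates the general shape of such scores (positive for confident correct forecasts, zero at the prior, increasingly negative for confident wrong ones); it is not claimed to be Metaculus's exact formula:

```python
# Sketch: log-score comparison against a baseline prior for a binary
# question: score = 100 * log2(p_outcome / p_baseline_outcome).
# Illustrative only, not Metaculus's exact scoring formula.
import math

def baseline_log_score(p: float, resolved_yes: bool, p_baseline: float = 0.5) -> float:
    """p is the forecast probability of Yes; score the outcome that occurred."""
    p_outcome = p if resolved_yes else 1 - p
    b_outcome = p_baseline if resolved_yes else 1 - p_baseline
    return 100 * math.log2(p_outcome / b_outcome)

s_good = baseline_log_score(0.9, True)   # confident and correct: positive
s_even = baseline_log_score(0.5, True)   # matches the prior: zero
s_bad = baseline_log_score(0.9, False)   # confident and wrong: negative
```

Because the log score is a proper scoring rule, the expected score is maximized by reporting your true belief, which is what makes it suitable for rewarding accuracy.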
