

Toolbox persists all subreddit-level moderation data in Reddit wiki pages. Every data store follows the same pattern: a JSON object with a ver field for schema validation, written to a specific named wiki page within the target subreddit.

Wiki pages

| Data store | Wiki page | Format |
| --- | --- | --- |
| Usernotes | `usernotes` | JSON with zlib-compressed `blob` field |
| Subreddit config | `toolbox` | Plain JSON |
| Ban macros | `banmacros` | Plain JSON |
Toolbox sets each wiki page to mod-only access after writing. Your application must authenticate as a subreddit moderator to read or write these pages.

Reddit API endpoints

Reading a page

GET /r/{subreddit}/wiki/{page}.json
The response is a Reddit API envelope. The wiki content is a string at data.content_md:
{
  "data": {
    "content_md": "{\"ver\":6,\"constants\":{...},\"blob\":\"...\"}",
    "revision_id": "abc123",
    "revision_date": 1700000000
  }
}
Parse content_md as JSON to get the Toolbox data object.
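As a minimal sketch, the envelope shown above can be unwrapped with a small helper. The function name `parseWikiEnvelope` is hypothetical, not part of Toolbox:

```javascript
// Hypothetical helper: extract and parse the Toolbox data object from a
// Reddit wiki API envelope of the shape shown above.
function parseWikiEnvelope(envelope) {
  // data.content_md is the wiki page body, itself a JSON string
  return JSON.parse(envelope.data.content_md);
}

// Example using the envelope structure from above:
const envelope = {
  data: {
    content_md: '{"ver":6,"constants":{},"blob":""}',
    revision_id: "abc123",
    revision_date: 1700000000,
  },
};
const toolboxData = parseWikiEnvelope(envelope);
// toolboxData.ver === 6
```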

Writing a page

POST /r/{subreddit}/api/wiki/edit
Required form body parameters:
| Parameter | Description |
| --- | --- |
| `page` | Wiki page name (e.g. `usernotes`, `toolbox`) |
| `content` | The full page content as a string |
| `reason` | Edit reason shown in wiki revision history |
| `uh` | Modhash for CSRF protection |
Toolbox appends " via toolbox" to all edit reasons. Third-party tools should similarly identify themselves in the reason parameter so moderators can trace edits in the wiki history.
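A sketch of assembling that form body, assuming the parameters in the table above; the tool name `mytool` and the modhash value are placeholders:

```javascript
// Build the form body for POST /r/{subreddit}/api/wiki/edit.
// Parameter names come from the table above; "mytool" is a placeholder
// for whatever identifier your application uses.
function buildWikiEditBody({ page, content, reason, modhash }) {
  return new URLSearchParams({
    page,
    content,
    reason: `${reason} via mytool`, // identify your tool in the revision history
    uh: modhash,
  });
}

const body = buildWikiEditBody({
  page: "usernotes",
  content: '{"ver":6}',
  reason: "update notes",
  modhash: "MODHASH",
});
// body.get("reason") === "update notes via mytool"
```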

Schema versioning

Every Toolbox data object includes a ver integer field. Toolbox validates this version before reading or writing to prevent data corruption from incompatible schema changes.
{
  "ver": 6
}
Supported version ranges (from tbcore.js):
| Data store | Min supported | Deprecated | Current | Max accepted |
| --- | --- | --- | --- | --- |
| Usernotes | 4 | 4 | 6 | 6 |
| Config | 1 | — | 1 | 1 |
Reject data objects whose ver is below the minimum or above the maximum your implementation supports. Writing back data in an incompatible format will break Toolbox for all moderators of that subreddit.
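A version gate along those lines might look like the following sketch. The ranges are taken from the table above and should be confirmed against tbcore.js; the constant and function names are hypothetical:

```javascript
// Hypothetical version gate: accept only data objects whose ver falls
// inside the supported range for that store (values per the table above).
const SCHEMA_RANGES = {
  usernotes: { min: 4, max: 6 },
  config: { min: 1, max: 1 },
};

function isSupportedVersion(store, data) {
  const range = SCHEMA_RANGES[store];
  if (!range) return false; // unknown store: refuse rather than guess
  return (
    Number.isInteger(data?.ver) &&
    data.ver >= range.min &&
    data.ver <= range.max
  );
}
```

Rejecting out-of-range data before writing is what prevents the corruption scenario described above: an older tool never overwrites a newer schema it cannot round-trip.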

Usernotes compression

Usernotes data can be large for active subreddits. Toolbox compresses the notes payload before storing it to stay within Reddit’s wiki page size limits.
1. Structure before compression

The notes data is a JSON object mapping lowercase usernames to arrays of note objects:
{
  "username": {
    "notes": [
      {"n": "note text", "t": 1700000000, "m": 0, "l": "l,abc123", "w": 1}
    ]
  }
}
2. Compress with zlib

Serialize the notes object to a JSON string, then compress it using zlib (deflate). Toolbox uses the pako library in the browser:
import pako from 'pako';

const jsonString = JSON.stringify(notesObject);
const compressed = pako.deflate(jsonString);
3. Base64-encode the result

Encode the compressed bytes as a base64 string. This becomes the blob field in the stored JSON:
const blob = btoa(String.fromCharCode(...compressed));
4. Wrap in the top-level envelope

Store the blob alongside the ver and constants fields:
{
  "ver": 6,
  "constants": {
    "users": ["moderatorname"],
    "warnings": ["none", "spamwatch", "spamwarn"]
  },
  "blob": "<base64-encoded zlib data>"
}
To read usernotes, reverse the process: base64-decode the blob, inflate with zlib, then parse the resulting JSON string.

Error handling

When readFromWiki in tbapi.ts encounters a failure, it resolves (not rejects) with one of two sentinel values:
| Value | Meaning |
| --- | --- |
| `NO_WIKI_PAGE` | The page does not exist or the wiki is disabled |
| `WIKI_PAGE_UNKNOWN` | The page exists but the request failed for an unknown reason |
Third-party implementations should treat NO_WIKI_PAGE as an empty data store (safe to initialize) and WIKI_PAGE_UNKNOWN as a transient error that should not trigger a write.
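That policy can be expressed as a small sketch. The constant values mirror the sentinel names in the table above; how your own wiki-reading wrapper surfaces them is an assumption:

```javascript
// Sentinel values as described above.
const NO_WIKI_PAGE = 'NO_WIKI_PAGE';
const WIKI_PAGE_UNKNOWN = 'WIKI_PAGE_UNKNOWN';

// Hypothetical policy helper: decide whether a read result is usable
// data, an empty store that is safe to initialize, or a transient error
// that must not trigger a write.
function interpretWikiResult(result) {
  if (result === NO_WIKI_PAGE) {
    return { data: null, safeToWrite: true };  // treat as empty data store
  }
  if (result === WIKI_PAGE_UNKNOWN) {
    return { data: null, safeToWrite: false }; // transient error: do not write
  }
  return { data: result, safeToWrite: true };  // normal page content
}
```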
