Instructors author exercises by writing a plain-text problem prompt — the same instruction a student would read. Opus reads it and returns the scaffolding that will gate the student’s specification: a set of concrete questions the student must answer before the code editor unlocks, the divergence patterns Opus expects novices to produce, and an inferred difficulty level. You review each field, edit anything that needs adjustment, then publish. The exercise appears in the student exercise list immediately.

## Documentation Index
Fetch the complete documentation index at: https://mintlify.com/bcanata/maieutic/llms.txt
Use this file to discover all available pages before exploring further.
## The authoring workflow
### Navigate to /authoring

Open /authoring in your browser. The instructor navigation bar links here from any instructor page.

### Enter a title and problem prompt

Write the exercise title and instructions in plain text, exactly as a student would read them.

### Select the curriculum unit

Select the curriculum unit this exercise belongs to. The unit tells Opus which Python tools the student has been taught so far and constrains the dimensions it generates accordingly.
### Generate scaffolding

Click Generate scaffolding. Opus returns a `ScaffoldingOutput` object — the response is validated against the schema at the boundary before the UI renders it. Latency is displayed next to the button so you know how long the call took.

### Review and edit each field
Three sections appear: specification-gate dimensions, expected divergences, and unit/student level. Every field is editable inline. Each item shows a source badge — Opus, Edited, or Added — so the authoring trace is preserved.

If Opus notes ambiguity in your prompt, a Prompt quality note banner appears above the sections. You can publish anyway, but the note describes why the scaffolding may have lower pedagogical value.

## What Opus generates
The `ScaffoldingOutput` schema defines the four top-level fields Opus returns:
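These four fields can be sketched as plain dataclasses. This is a minimal sketch for orientation, not the actual schema: beyond the documented names (`ScaffoldingOutput`, `student_level`, `prompt_quality_note`, the level values, and the divergence categories), the class and field names here are assumptions.

```python
from dataclasses import dataclass
from typing import Literal, Optional

# Hypothetical sketch of the ScaffoldingOutput shape these docs describe.
# Field names not documented on this page are assumptions for illustration.

@dataclass
class SpecDimension:
    id: str           # snake_case slug, e.g. "empty_input_behavior"
    description: str  # the concrete question the student's spec must answer
    rationale: str    # pedagogical reason; used by Opus, not shown to students

@dataclass
class ExpectedDivergence:
    category: Literal["drift", "revision", "bug"]
    description: str

@dataclass
class ScaffoldingOutput:
    spec_dimensions: list[SpecDimension]
    expected_divergences: list[ExpectedDivergence]
    student_level: Literal["week_1_2", "week_3_6", "week_7_plus"]
    prompt_quality_note: Optional[str] = None  # set only for ambiguous prompts
```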
### Spec gate dimensions
Each dimension is a concrete question the student’s natural-language specification must answer before the code editor unlocks. A dimension has three fields:

- `id` — a `snake_case` slug used to track which dimensions a student has addressed across spec iterations.
- `description` — the question itself, specific enough that “assume valid input” or a concrete answer are both acceptable commitments. Generic labels like “handle edge cases” are explicitly forbidden by Opus’s prompt.
- `rationale` — why this question matters pedagogically. The rationale is used by Opus when asking follow-up questions; it is not shown verbatim to students.
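The `id` slugs are what the tracker keys on across iterations. A minimal sketch of that bookkeeping — the function and the sample slugs are assumptions, not Maieutic’s actual code:

```python
# Illustrative only: how snake_case dimension ids could be used to track
# which questions a student's spec has addressed across iterations.

def unaddressed(dimension_ids: list[str], addressed: set[str]) -> list[str]:
    """Return the dimension ids the student has not yet committed to."""
    return [d for d in dimension_ids if d not in addressed]

dims = ["empty_input_behavior", "duplicate_handling", "output_format"]
# After a first spec iteration the student has answered one question:
print(unaddressed(dims, {"empty_input_behavior"}))
# ['duplicate_handling', 'output_format']
```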
Editing a dimension changes its source badge from Opus to Edited. When you add a dimension yourself, the badge shows Added. The original Opus output is preserved separately so the authoring trace is never lost.

### Expected divergences
Each divergence is a specific pattern Opus anticipates will appear in student code. Divergences are categorised as:

- `drift` — code does less than the spec required (the most common category).
- `revision` — code implements a coherent alternative that still satisfies the spec — a genuine refactor.
- `bug` — code attempts what was specified but fails.
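To make the three categories concrete, here is an invented spec — “return the sum of only the positive numbers in a list” — with one hypothetical student version per category. All of this code is an illustration, not taken from Maieutic:

```python
# Invented spec for illustration: return the sum of only the positive
# numbers in a list.

def spec_compliant(nums):
    return sum(n for n in nums if n > 0)

def drift_version(nums):
    # drift: does less than the spec required — drops the positivity filter
    return sum(nums)

def revision_version(nums):
    # revision: a coherent alternative that still satisfies the spec
    total = 0
    for n in nums:
        if n > 0:
            total += n
    return total

def bug_version(nums):
    # bug: attempts the spec but fails — overwrites instead of accumulating
    total = 0
    for n in nums:
        if n > 0:
            total = n
    return total
```

On `[3, -1, 4]` the compliant and revision versions agree (7), while drift returns 6 and the bug returns 4 — the kind of observable mismatch the divergence list is meant to anticipate.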
### Student level and unit
Opus infers a `student_level` from the prompt — one of `week_1_2`, `week_3_6`, or `week_7_plus`. The level feeds into the curriculum unit the exercise is assigned to, and from there into how Opus calibrates dimensions: a `unit_1` exercise (Python fundamentals, no loops yet) gets different dimensions from a `unit_4` exercise (user-defined functions). You can override the unit using the radio buttons. Changing the unit updates the student level automatically. The four units map to:

| Unit | Title | Key additions |
|---|---|---|
| I | Python Fundamentals | Variables, math, type casting, strings, try/except |
| II | Control Structures | if/elif/else, for/while loops |
| III | Data Structures | Lists and dictionaries |
| IV | Functions | def, parameters, return, scope |
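Since a unit constrains Opus to the tools taught so far, the table above can be read as a cumulative lookup. A sketch under that assumption — the dict name and string representations are invented for illustration:

```python
# Sketch of the unit → key-additions mapping from the table above.
# Names and representation are assumptions, not Maieutic's data model.
UNIT_KEY_ADDITIONS = {
    1: ["variables", "math", "type casting", "strings", "try/except"],
    2: ["if/elif/else", "for/while loops"],
    3: ["lists", "dictionaries"],
    4: ["def", "parameters", "return", "scope"],
}

def constructs_available(unit: int) -> list[str]:
    """Everything taught up to and including the given unit (cumulative)."""
    return [item for u in range(1, unit + 1) for item in UNIT_KEY_ADDITIONS[u]]

print(constructs_available(2))
```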
### Prompt quality note
When a prompt is vague or ambiguous, Opus sets `prompt_quality_note` to a string describing the problem. Maieutic renders this as a warning banner above the generated scaffolding. Opus never refuses to generate scaffolding on an ambiguous prompt — it produces the best output it can and flags the issue. You decide whether to publish.

## Source tracking
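The banner decision reduces to a null check on the field. A minimal sketch, with an assumed helper name and banner wording (not Maieutic’s actual rendering code):

```python
# Assumed sketch: render a warning only when prompt_quality_note is set.

def banner_text(prompt_quality_note):
    """Return the warning to render, or None when the prompt was clear."""
    if prompt_quality_note:
        return f"Prompt quality note: {prompt_quality_note}"
    return None

print(banner_text(None))                      # no banner rendered
print(banner_text("Ambiguous input format"))  # warning banner text
```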
Every dimension and divergence carries a `source` field throughout its lifecycle:
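The badge transitions described above (Opus, Edited, Added) can be sketched as a small state function. The constants and function here are assumptions for illustration, not Maieutic’s implementation:

```python
# Assumed sketch of source-badge transitions: an instructor edit moves an
# Opus-generated item to "edited"; instructor-created items stay "added".

OPUS, EDITED, ADDED = "opus", "edited", "added"

def source_after_edit(current_source: str) -> str:
    """Source value after an instructor edits the item."""
    return EDITED if current_source == OPUS else current_source

print(source_after_edit(OPUS))   # an Opus item becomes edited
print(source_after_edit(ADDED))  # an instructor-added item stays added
```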