PRMS (Planning, Reporting & Management System) is the unified web platform that solves fragmented CGIAR result reporting. Before PRMS, Initiative and Center teams spread their results across spreadsheets, free-text documents, and bespoke tools — producing inconsistent typologies, weak quality controls, and significant manual effort at every submission cycle boundary. PRMS replaces that fragmentation with a single structured system: typed result capture, a defined quality assurance workflow, Theory of Change alignment, and stable payload surfaces for downstream consumers, all scoped to a reporting phase.
Documentation Index
Fetch the complete documentation index at: https://mintlify.com/AllianceBioversityCIAT/onecgiar_pr/llms.txt
Use this file to discover all available pages before exploring further.
The problem PRMS solves
Fragmented data across initiatives
Result data was scattered across spreadsheets and Initiative-specific tools, making portfolio-level aggregation unreliable. PRMS captures all results in a single MySQL-backed system of record with a consistent schema.
Inconsistent result typology
The same research output could be reported as different types by different teams, breaking downstream BI and bilateral reporting. PRMS enforces 11 canonical result types drawn from a shared enum, ensuring uniform classification across the portfolio.
Weak quality controls
Evidence, Theory of Change alignment, partner attribution, and DAC cross-cutting scores were captured inconsistently and reviewed too late. PRMS validates these fields server-side before a result can leave the Editing status.
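The server-side gate described above can be sketched as a simple validation pass. This is a hypothetical illustration, not the actual PRMS implementation: the interface shape, field names, and function name are all assumptions.

```typescript
// Hypothetical sketch of the pre-submission check; field and function
// names are illustrative, not the real PRMS schema.
interface ResultDraft {
  evidenceLinks: string[];
  tocOutcomeId: number | null;          // Theory of Change alignment
  partners: number[];                   // contributing partner institution IDs
  dacScores: Record<string, number>;    // DAC cross-cutting scores
}

// Returns the list of blocking problems; an empty list means the
// result may leave the Editing status.
function validateBeforeSubmission(draft: ResultDraft): string[] {
  const problems: string[] = [];
  if (draft.evidenceLinks.length === 0) {
    problems.push("at least one evidence link is required");
  }
  if (draft.tocOutcomeId === null) {
    problems.push("result must be aligned to a Theory of Change outcome");
  }
  if (draft.partners.length === 0) {
    problems.push("at least one contributing partner is required");
  }
  if (Object.keys(draft.dacScores).length === 0) {
    problems.push("DAC cross-cutting scores are required");
  }
  return problems;
}
```

The point is the timing: the checks run server-side before the status transition, not after submission, so a result cannot reach review with these fields missing.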
Friction at phase boundaries
Phase rollover, version snapshots, and bilateral exports required manual reconciliation each reporting cycle. PRMS handles this with a dedicated versioning module that snapshots prior-phase data without mutation and opens new phases cleanly.
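The "snapshot without mutation" idea can be illustrated with a minimal sketch, assuming a flat result row shape; the types and function here are hypothetical, not the versioning module's actual code.

```typescript
// Illustrative only: a phase snapshot copies the closing phase's rows
// into frozen records, leaving the originals untouched.
interface ResultRow {
  id: number;
  phaseId: number;
  title: string;
}

function snapshotPhase(rows: ResultRow[], closingPhaseId: number): ReadonlyArray<ResultRow> {
  return rows
    .filter((r) => r.phaseId === closingPhaseId)
    .map((r) => Object.freeze({ ...r })); // copy, then freeze the copy
}
```

Because the snapshot holds frozen copies rather than references, later edits to live rows cannot alter the prior-phase record.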
Who uses PRMS
PRMS serves four primary personas, each with a distinct role in the reporting lifecycle.
Result submitter
Initiative and Center staff who create typed results, attach evidence, align results to Theory of Change outcomes, mark contributing partners and geography, and resolve QA feedback before resubmission.
QA reviewer
Reviewers who open a result-review drawer, assess all submitted fields and evidence in one place, add structured comments tied to specific fields or sections, and advance or return results in the workflow.
PMU / portfolio lead
Portfolio managers who run Type-One Reports, oversee IPSR pathways across their four steps, edit global narratives and impact-area scoring, and monitor submission progress across initiatives per phase.
Platform admin
Administrators who manage roles and AD users, open and close reporting phases, trigger CLARISA catalog syncs, configure global parameters, and recover soft-deleted results through the manage-data surface.
Bilateral funders and platform-report consumers are secondary, non-interactive personas. They read the typed payload APIs at /api/bilateral/* and /api/platform-report/*, which are JWT-excluded headless surfaces designed for downstream systems and BI dashboards.
Result types
PRMS captures results across 11 canonical types defined in ResultTypeEnum. Each type carries shared common fields (title, ToC alignment, geography, contributing centers and partners, DAC scores, evidence) plus a type-specific section.
- Outcomes and outputs
- Knowledge and innovation
- IPSR types
| Type ID | Name | Description |
|---|---|---|
| 1 | Policy change | Changes to laws, regulations, policies, or institutional arrangements influenced by CGIAR research |
| 2 | Innovation use | Adoption or use of an innovation by end users or scaling partners |
| 3 | Capacity change | Strengthened capacities of individuals, organizations, or systems |
| 4 | Other outcome | Outcomes that do not fit the above categories |
| 8 | Other output | Outputs that do not fit a more specific output type |
| 9 | Impact contribution | Documented contributions to impacts at scale |
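The six type IDs listed above can be sketched as a slice of the enum. The name ResultTypeEnum comes from the docs, but the member names and the remaining five type IDs are not listed here, so this fragment is illustrative rather than the library's actual definition.

```typescript
// Illustrative fragment of ResultTypeEnum covering only the six IDs
// shown in the table above; the full enum defines 11 canonical types.
enum ResultTypeEnum {
  PolicyChange = 1,
  InnovationUse = 2,
  CapacityChange = 3,
  OtherOutcome = 4,
  OtherOutput = 8,
  ImpactContribution = 9,
}
```

Note the gaps in the numbering (5–7 are absent from this table): type IDs are stable identifiers, not a dense sequence, so consumers should match on the ID rather than assume contiguity.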
Submission status flow
Every result moves through a defined status lifecycle. Status transitions are enforced server-side, and every transition is recorded in the review history with the acting user, timestamp, and optional comment.
Editing (status 1)
All results start here. The submitter fills required common fields, type-specific fields, evidence, ToC alignment, geography, and partner attribution. The result can be saved and returned to at any time during an open phase.
Quality Assessed (status 2)
The submitter requests QA review. The QA reviewer opens the result in the review drawer, assesses all fields, and either advances the result or returns it with structured field-level comments. A returned result goes back to Editing for the submitter to resolve.
Additional statuses exist beyond the primary pair: Discontinued (4) for results that are no longer being pursued, plus Pending Review (5), Approved (6), and Rejected (7) for extended review pipeline configurations. The primary workflow for most results is Editing → Quality Assessed → Submitted.
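A server-side transition guard for the primary workflow might look like the following sketch. Status names follow the text above; "Submitted" appears without a numeric ID because none is stated here, and the Editing → Discontinued edge is an assumption, not a documented transition.

```typescript
// Hypothetical transition guard; the allowed-edge map is illustrative,
// not the actual PRMS state machine.
type Status = "Editing" | "QualityAssessed" | "Submitted" | "Discontinued";

const allowed: Record<Status, Status[]> = {
  Editing: ["QualityAssessed", "Discontinued"], // Discontinued edge assumed
  QualityAssessed: ["Submitted", "Editing"],    // "Editing" is the return-to-submitter path
  Submitted: [],
  Discontinued: [],
};

function canTransition(from: Status, to: Status): boolean {
  return allowed[from].includes(to);
}
```

Encoding the lifecycle as an explicit edge map keeps the rule in one place, so the same check can gate the API endpoint and drive the review-history record for each transition.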
Explore PRMS
Quick start
Step-by-step guide to logging in, creating a result, filling required fields, and submitting for QA review
Architecture
Full-stack design: Angular 19 SPA, NestJS 11 backend, MySQL, and all external integrations
Core features
Detailed coverage of result capture, QA workflow, IPSR pathways, and phase management
User guides
Role-specific guides for submitters, QA reviewers, PMU leads, and platform admins