Documentation Index

Fetch the complete documentation index at: https://mintlify.com/TrustifAI/trustifai/llms.txt

Use this file to discover all available pages before exploring further.

Welcome to the TrustifAI documentation. TrustifAI is a Python SDK for evaluating the trustworthiness of LLM and RAG responses through multi-dimensional trust scoring, interactive visualizations, and extensible custom metrics.

Introduction

Learn what TrustifAI is and how it evaluates AI trustworthiness.

Quickstart

Score your first RAG response in under five minutes.

Core Concepts

Understand how Trust Score is computed from four independent trust signals.
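The precise aggregation is defined in Core Concepts; as a rough illustration only (not TrustifAI's actual formula), combining four independent signals into one score might look like a weighted mean, with each signal normalized to [0, 1]. The signal names and weights below are assumptions for the sketch, not TrustifAI's real metrics:

```python
# Illustrative sketch only: a weighted mean over four hypothetical trust
# signals. Signal names and equal weights are assumptions, not the
# aggregation TrustifAI actually implements.

def trust_score(signals: dict[str, float], weights: dict[str, float]) -> float:
    """Combine per-signal scores (each in [0, 1]) into a single score."""
    total_weight = sum(weights.values())
    return sum(signals[name] * weights[name] for name in signals) / total_weight

signals = {
    "faithfulness": 0.9,  # does the answer stick to the retrieved context?
    "relevance": 0.8,     # does it address the question asked?
    "consistency": 0.7,   # is it stable across rephrasings?
    "safety": 1.0,        # is it free of harmful content?
}
weights = {name: 1.0 for name in signals}  # equal weighting as a default

print(round(trust_score(signals, weights), 3))  # 0.85
```

Independent signals make the score easy to debug: a low overall value can be traced back to the individual dimension that dragged it down.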

API Reference

Explore the full public API: Trustifai, AsyncTrustifai, MetricContext, and more.
