
Overview

pyrig is built around three interconnected systems that enable powerful extensibility and automation:

Config System

Declarative file management with automatic discovery and merging

CLI System

Automatic command registration across package dependencies

Inheritance System

Multi-package inheritance with automatic discovery (.I and .L patterns)

Config File System

The ConfigFile Architecture

Every generated file in pyrig is backed by a ConfigFile subclass. This provides:
  • Automatic discovery: All ConfigFile subclasses are found across dependent packages
  • Subset validation: User files only need required keys, not all keys
  • Intelligent merging: pyrig adds missing keys while preserving user additions
  • Priority-based validation: Files validate in order (e.g., pyproject.toml before dependent configs)
  • Parallel execution: File validation runs concurrently for performance

How ConfigFile Works

Here’s the complete lifecycle:
1. Define expected configuration

Create a ConfigFile subclass that declares what should exist:
from pathlib import Path
from pyrig.rig.configs.base.toml import TomlConfigFile

class MyConfigFile(TomlConfigFile):
    def parent_path(self) -> Path:
        return Path()  # Project root

    def _configs(self) -> dict:
        return {
            "tool": {
                "myapp": {
                    "version": "1.0.0",
                    "debug": False
                }
            }
        }

    def priority(self) -> float:
        return 50  # Validate after pyproject.toml
2. Run validation

When pyrig mkroot runs, for each ConfigFile:
  1. Load existing file (if it exists)
  2. Merge configurations: user_config | expected_config
  3. Validate subset: Ensure expected_config ⊆ merged_config
  4. Write back: Save merged config (user additions preserved)
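
The merge in step 2 can be sketched in plain Python. This is a hypothetical helper, not pyrig's actual implementation; it mirrors the `user_config | expected_config` semantics described above, applied recursively:

```python
def deep_merge(user_config: dict, expected_config: dict) -> dict:
    """Recursively merge two config dicts: expected keys/values are
    enforced, while keys only the user added are preserved."""
    merged = dict(user_config)
    for key, value in expected_config.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            # Expected values win on conflicts.
            merged[key] = value
    return merged

user = {"tool": {"myapp": {"debug": True, "extra": "kept"}}}
expected = {"tool": {"myapp": {"version": "1.0.0", "debug": False}}}
merged = deep_merge(user, expected)
# Required keys are enforced; the user-only "extra" key survives.
```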
3. Opt-out behavior

Users can opt out by creating an empty file:
# Opt out of my-config.toml
touch my-config.toml
Empty files are left unchanged by validation.
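
The check behind this behavior is easy to sketch (the helper name here is hypothetical; only the described behavior is assumed):

```python
from pathlib import Path

def is_opted_out(path: Path) -> bool:
    """An empty existing file signals opt-out, so validation skips it."""
    return path.exists() and path.stat().st_size == 0
```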

ConfigFile Hierarchy

The system has four layers. At the foundation is ConfigFile (base.py), an abstract base defining the config lifecycle:
from abc import ABC, abstractmethod
from pathlib import Path

class ConfigFile[ConfigT](ABC):
    @abstractmethod
    def parent_path(self) -> Path:
        """Directory containing the file."""

    @abstractmethod
    def _configs(self) -> ConfigT:
        """Expected configuration."""

    @abstractmethod
    def _load(self) -> ConfigT:
        """Load and parse file."""

    @abstractmethod
    def _dump(self, config: ConfigT) -> None:
        """Write configuration."""

Subset Validation

The key innovation is subset validation rather than exact matching:
def nested_structure_is_subset(subset: dict, superset: dict) -> bool:
    """Check if subset structure exists within superset.

    e.g. nested_structure_is_subset({"a": 1}, {"a": 1, "b": 2}) -> True
    """
    for key, value in subset.items():
        if key not in superset:
            return False
        if isinstance(value, dict):
            # Guard: the matching value must also be a dict before recursing.
            if not isinstance(superset[key], dict):
                return False
            if not nested_structure_is_subset(value, superset[key]):
                return False
        elif superset[key] != value:
            return False
    return True
This allows:
  • User additions: Extra keys beyond expected config
  • Required structure: Expected keys must exist with correct values
  • Validation: Final file must contain all expected configuration

Example: pyproject.toml

pyrig/rig/configs/pyproject.py
class PyprojectToml(TomlConfigFile):
    def parent_path(self) -> Path:
        return Path()

    def _configs(self) -> ConfigDict:
        return {
            "project": {
                "name": PackageName.I.name,
                "version": "0.1.0",
                "dependencies": ["typer>=0.21.1"],
            },
            "build-system": {
                "requires": ["uv_build"],
                "build-backend": "uv_build",
            },
        }

    def priority(self) -> float:
        return 100  # Validate first
When you run pyrig mkroot:
  1. Loads existing pyproject.toml
  2. Merges with expected structure
  3. Preserves your custom dependencies, scripts, etc.
  4. Ensures required fields exist
  5. Writes back merged result
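
As a concrete (hypothetical) illustration, a hand-written pyproject.toml that started with only a custom dependency could end up like this after mkroot, with the required fields from the expected config added and the user's entry kept:

```toml
[project]
name = "my-project"
version = "0.1.0"
dependencies = [
    "typer>=0.21.1",   # required by pyrig's expected config
    "requests>=2.31",  # user addition, preserved
]

[build-system]
requires = ["uv_build"]
build-backend = "uv_build"
```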

CLI System

Automatic Command Discovery

The CLI system automatically discovers and registers commands from three sources:
1. Main entry point

main() from <package>.main
my_project/main.py
def main() -> None:
    """Run the main entrypoint for the project."""
    print("Hello from my project!")
Registered as the default command (package name).
2. Project-specific commands

Functions from <package>.rig.cli.subcommands
my_project/rig/cli/subcommands.py
def deploy() -> None:
    """Deploy the application."""
    # Deployment logic
uv run my-project deploy
3. Shared commands

Functions from <package>.rig.cli.shared_subcommands across all packages
my_project/rig/cli/shared_subcommands.py
def version() -> None:
    """Display project version."""
    project_name = project_name_from_argv()
    print(f"{project_name} version {_version(project_name)}")
Available in all dependent projects automatically.

CLI Registration Flow

Here’s how the CLI discovers and registers commands:
pyrig/rig/cli/cli.py (simplified)
from importlib import import_module

import typer

app = typer.Typer()

def add_subcommands() -> None:
    # Extract package name from sys.argv[0]
    package_name = package_name_from_argv()

    # 1. Register main() from <package>.main
    main_module = import_module(f"{package_name}.main")
    app.command()(main_module.main)

    # 2. Register functions from <package>.rig.cli.subcommands
    subcommands = import_module(f"{package_name}.rig.cli.subcommands")
    for func in all_functions_from_module(subcommands):
        app.command()(func)

def add_shared_subcommands() -> None:
    package_name = package_name_from_argv()
    package = import_module(package_name)

    # 3. Find all packages in dependency chain (pyrig -> ... -> current)
    all_modules = discover_equivalent_modules_across_dependents(
        shared_subcommands, pyrig, until_package=package
    )

    # Register all functions from each module
    for module in all_modules:
        for func in all_functions_from_module(module):
            app.command()(func)
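
The all_functions_from_module helper used above can be sketched as follows. This is an assumption about its behavior (collect public functions defined in the module itself, skipping imported names and _private helpers), not pyrig's actual source:

```python
import inspect
from types import ModuleType
from typing import Any, Callable

def all_functions_from_module(module: ModuleType) -> list[Callable[..., Any]]:
    """Public functions defined in `module` itself; imported names
    and _private helpers are excluded."""
    return [
        obj
        for name, obj in vars(module).items()
        if inspect.isfunction(obj)
        and obj.__module__ == module.__name__
        and not name.startswith("_")
    ]
```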

Logging Configuration

The CLI includes flexible verbosity control. By default, uv run pyrig mkroot logs at INFO level with clean formatting (just messages).
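
A hedged sketch of such a verbosity mapping (the function name and the exact mapping are assumptions, not pyrig's implementation):

```python
import logging

def configure_logging(verbosity: int = 0) -> None:
    """Map a -v count to a log level; the default mirrors the
    behavior above: INFO level with message-only formatting."""
    level = logging.DEBUG if verbosity > 0 else logging.INFO
    logging.basicConfig(level=level, format="%(message)s", force=True)
```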

Multi-Package Inheritance System

The .I and .L Patterns

The most powerful feature of pyrig is automatic discovery of implementations across package boundaries.

DependencySubclass Base

All extensible classes inherit from DependencySubclass:
pyrig/src/subclass.py
class DependencySubclass(ABC):
    @classmethod
    @abstractmethod
    def definition_package(cls) -> ModuleType:
        """Package where implementations live."""

    @classmethod
    @abstractmethod
    def sorting_key(cls, subclass: type[T]) -> Any:
        """Sort key for ordering discovered subclasses."""

    @classproperty
    @cache
    def L(cls: type[Self]) -> type[Self]:
        """Get the final leaf subclass (deepest in inheritance tree)."""
        # Discovery logic...

    @classproperty
    @cache
    def I(cls: type[Self]) -> Self:
        """Get an instance of the final leaf subclass."""
        return cls.L()
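
The leaf lookup behind .L can be sketched with Python's built-in __subclasses__(). pyrig's real discovery also imports candidate modules across the dependency chain first; this shows only the tree walk, with toy classes standing in for real tools:

```python
def leaf_subclass(cls: type) -> type:
    """Return the deepest subclass in the inheritance tree rooted at cls."""
    children = cls.__subclasses__()
    if not children:
        return cls
    # Walk every branch and keep the deepest class found.
    return max(
        (leaf_subclass(child) for child in children),
        key=lambda c: len(c.__mro__),
    )

# Toy hierarchy for illustration:
class Tool: ...
class Linter(Tool): ...
class CompanyLinter(Linter): ...
```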

How It Works

1. Define base class

pyrig/rig/tools/package_manager.py
class PackageManager(Tool):
    def name(self) -> str:
        return "uv"

    def install_dependencies_args(self) -> Args:
        return self.args("sync")
2. Override in your package

my_project/rig/tools/package_manager.py
from pyrig.rig.tools.package_manager import PackageManager

class MyPackageManager(PackageManager):
    def install_dependencies_args(self) -> Args:
        return self.args("sync", "--frozen")
3. Access automatically

# Anywhere in the codebase
PackageManager.I.install_dependencies_args()
# Uses MyPackageManager implementation automatically!

Discovery Process

The discovery process:
  1. Searches the base dependency (typically pyrig)
  2. Searches all dependent packages in the dependency chain
  3. Scopes the search to the definition package (e.g., rig.tools)
  4. Returns the leaf (the most specific implementation)

Practical Example: Tool Wrappers

pyrig/rig/tools/base/base.py
class Tool(DependencySubclass):
    @abstractmethod
    def name(self) -> str:
        """Tool command name."""

    def args(self, *args: str) -> Args:
        """Build Args object from command parts."""
        return Args((self.name(), *args))

    @classmethod
    def definition_package(cls) -> ModuleType:
        return tools  # pyrig.rig.tools
All tools inherit this pattern:
class Linter(Tool):
    def name(self) -> str:
        return "ruff"

    def check_args(self) -> Args:
        return self.args("check")

# Usage:
Linter.I.check_args().run()
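
The Args object is pyrig's own class; as a rough stand-in for illustration, one can imagine it as an immutable sequence of command parts with a run() helper (purely an assumption, not the real API):

```python
import subprocess

class Args(tuple):
    """Hypothetical stand-in: an immutable command line with run()."""

    def run(self) -> subprocess.CompletedProcess:
        # Execute the command parts as a subprocess.
        return subprocess.run(list(self), check=True)

args = Args(("ruff", "check"))
# args.run() would then execute `ruff check` in a subprocess.
```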

Creating Organization Standards

The real power emerges when creating organization-wide standards:
company-pyrig/rig/tools/linter.py
from pyrig.rig.tools.linter import Linter

class CompanyLinter(Linter):
    def check_args(self) -> Args:
        # Add company-specific rules
        return self.args("check", "--config", "company-rules.toml")
Now any project that depends on company-pyrig automatically uses company rules:
any-project/pyproject.toml
[project]
dependencies = [
    "company-pyrig",  # All company standards applied
]

Benefits of This Architecture

Declarative

Define what should exist, not how to create it. pyrig handles the implementation.

Idempotent

Safe to run repeatedly. Changes are preserved, missing structure is added.

Extensible

Override any behavior by subclassing. Changes propagate automatically.

Discoverable

All implementations found automatically. No manual registration needed.

Composable

Build on existing standards. Company package extends pyrig, project extends company package.

Type-Safe

Full type checking support. IDE autocomplete works throughout.

Design Principles

1. Separation of Concerns

  • ConfigFile: Declares what files should exist
  • Tool: Constructs command arguments
  • Builder: Creates build artifacts
  • CLI: Routes commands to implementations
Each system is independent but composable.

2. Discovery Over Registration

No manual registration required. Define a subclass and it’s automatically discovered:
# This is all you need
class MyConfig(TomlConfigFile):
    ...  # Implementation
Compare to manual registration:
# What you DON'T need to do
register_config(MyConfig)  # Not needed!
CONFIG_REGISTRY.append(MyConfig)  # Not needed!

3. Priority-Based Ordering

Configs validate in priority order (high to low):
class PyprojectToml(TomlConfigFile):
    def priority(self) -> float:
        return 100  # Validates first

class MyConfig(TomlConfigFile):
    def priority(self) -> float:
        return 50  # Validates after pyproject.toml
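
The ordering itself is just a descending sort on priority(); a toy driver (hypothetical, with a stand-in class) could look like:

```python
class Cfg:
    """Toy stand-in for a discovered ConfigFile."""

    def __init__(self, name: str, prio: float) -> None:
        self.name = name
        self._prio = prio

    def priority(self) -> float:
        return self._prio

discovered = [Cfg("my-config", 50), Cfg("pyproject", 100)]
# Highest priority validates first.
ordered = sorted(discovered, key=lambda c: c.priority(), reverse=True)
```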

4. Caching and Performance

  • ConfigFile.configs(): Cached (validated once)
  • ConfigFile.load(): Cached (file read once)
  • DependencySubclass.L: Cached (discovery runs once)
  • Parallel validation: All ConfigFiles validated concurrently
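
Concurrent validation can be sketched with the standard library (how pyrig actually schedules work, and how this interacts with priority ordering, is an implementation detail not shown here):

```python
from concurrent.futures import ThreadPoolExecutor

def validate_all(config_files: list) -> None:
    """Run each file's validate() concurrently on a thread pool."""
    with ThreadPoolExecutor() as pool:
        # list() forces completion and re-raises any worker exception.
        list(pool.map(lambda cf: cf.validate(), config_files))
```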

Next Steps

Config Reference

Detailed documentation on creating custom config files

Tool Reference

Learn about available tools and how to customize them

CLI Guide

Advanced CLI patterns and custom commands

Testing

Understanding autouse fixtures and test infrastructure
