Cadence provides command-line tools for managing database schemas across Cassandra, MySQL, PostgreSQL, and other SQL databases. These tools handle initial setup, version management, and schema migrations.
Overview
The schema tools consist of:

- cadence-cassandra-tool: Schema management for Cassandra
- cadence-sql-tool: Schema management for SQL databases (MySQL, PostgreSQL, etc.)

Both tools support:

- Database and keyspace creation
- Initial schema setup
- Version-controlled schema upgrades
- Dry-run mode for testing migrations
Installation
Via Homebrew
Installing Cadence via Homebrew also installs cadence-cassandra-tool and cadence-sql-tool.
Schema files are located at: /usr/local/etc/cadence/schema/
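A typical Homebrew install might look like the following sketch (the formula name cadence-workflow is an assumption; confirm with `brew search cadence`):

```shell
# Install the Cadence server package, which bundles both schema tools
brew install cadence-workflow

# Verify the tools are on PATH
cadence-cassandra-tool --help
cadence-sql-tool --help
```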
Via Docker
The tools are included in the server image.
Build from Source
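Building the tools from a source checkout might look like this sketch (the repository URL and Makefile target names are assumptions; check the repository's Makefile):

```shell
git clone https://github.com/cadence-workflow/cadence.git
cd cadence

# Build both schema tools (target names are assumptions)
make cadence-cassandra-tool cadence-sql-tool
```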
Cassandra Schema Tool
Initial Database Setup
Create Keyspace
Create keyspaces with appropriate replication. A replication factor of 3 allows quorum consistency for reads and writes.
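The keyspace creation step might look like this (endpoint, keyspace names, and replication factor are example values):

```shell
# Create the main keyspace with replication factor 3
cadence-cassandra-tool --ep 127.0.0.1 create -k cadence --rf 3

# Create the visibility keyspace as well
cadence-cassandra-tool --ep 127.0.0.1 create -k cadence_visibility --rf 3
```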
Initialize Schema
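Initializing version tracking might look like this sketch (endpoint, keyspace, and the base version 0.0 are example values):

```shell
# Create the schema version tables and record the base version
cadence-cassandra-tool --ep 127.0.0.1 -k cadence setup-schema -v 0.0
```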
Set up schema version tracking.
Apply Schema
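Applying all pending schema versions might look like the following sketch (the schema directory path is an assumption based on the repository layout):

```shell
# Apply every versioned schema change up to the latest
cadence-cassandra-tool --ep 127.0.0.1 -k cadence update-schema \
  -d ./schema/cassandra/cadence/versioned
```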
Upgrade to the latest version.
Schema Upgrades
Dry Run
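A dry run adds the --dryrun flag to the update command (endpoint and path are example values):

```shell
# Report what would change without touching the keyspace
cadence-cassandra-tool --ep 127.0.0.1 -k cadence update-schema \
  -d ./schema/cassandra/cadence/versioned --dryrun
```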
Test the upgrade without applying changes.
Apply Specific Version
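Pinning the upgrade to a version might look like this sketch (the target version is a placeholder):

```shell
# Stop at the given schema version instead of the latest
cadence-cassandra-tool --ep 127.0.0.1 -k cadence update-schema \
  -d ./schema/cassandra/cadence/versioned -v <target-version>
```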
Upgrade to a specific schema version.
Version Compatibility
Match schema versions with server versions.
Command Options
| Option | Description |
|---|---|
| --ep, --endpoint | Cassandra seed nodes (comma-separated) |
| -k, --keyspace | Keyspace name |
| -p, --port | Cassandra port (default: 9042) |
| -u, --user | Username for authentication |
| -pw, --password | Password for authentication |
| -d, --schema-dir | Directory containing schema files |
| -v, --version | Target schema version |
| --dryrun | Preview changes without applying |
| --rf, --replication-factor | Replication factor |
SQL Schema Tool
Initial Database Setup
Create Databases
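Creating the databases might look like the following sketch (host, credentials, and database names are example values):

```shell
# Create the main and visibility databases on a local MySQL instance
cadence-sql-tool --ep 127.0.0.1 -p 3306 --plugin mysql \
  -u cadence -pw cadence create --db cadence
cadence-sql-tool --ep 127.0.0.1 -p 3306 --plugin mysql \
  -u cadence -pw cadence create --db cadence_visibility
```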
Create databases for Cadence.
Initialize Schema
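Initializing version tracking for a SQL backend might look like this sketch (connection values and the base version 0.0 are examples):

```shell
# Create the schema version tables and record the base version
cadence-sql-tool --ep 127.0.0.1 -p 3306 --plugin mysql \
  -u cadence -pw cadence --db cadence setup-schema -v 0.0
```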
Set up schema version tracking.
Apply Schema
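Applying all pending versions might look like the following sketch (the schema directory path, including the MySQL version subdirectory, is an assumption based on the repository layout):

```shell
# Apply every versioned schema change up to the latest
cadence-sql-tool --ep 127.0.0.1 -p 3306 --plugin mysql \
  -u cadence -pw cadence --db cadence update-schema \
  -d ./schema/mysql/v57/cadence/versioned
```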
Upgrade to the latest version.
Schema Upgrades
Dry Run
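As with the Cassandra tool, a dry run adds the --dryrun flag (connection values and path are examples):

```shell
# Report what would change without modifying the database
cadence-sql-tool --ep 127.0.0.1 -p 3306 --plugin mysql \
  -u cadence -pw cadence --db cadence update-schema \
  -d ./schema/mysql/v57/cadence/versioned --dryrun
```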
Test the upgrade.
Apply Specific Version
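Upgrading to a specific schema version might look like this sketch (the target version is a placeholder):

```shell
# Stop at the given schema version instead of the latest
cadence-sql-tool --ep 127.0.0.1 -p 3306 --plugin mysql \
  -u cadence -pw cadence --db cadence update-schema \
  -d ./schema/mysql/v57/cadence/versioned -v <target-version>
```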
Command Options
| Option | Description |
|---|---|
| --ep, --endpoint | Database host address |
| -p, --port | Database port (MySQL: 3306, PostgreSQL: 5432) |
| --plugin | Database type (mysql, postgres, etc.) |
| --db, --database | Database name |
| -u, --user | Username (env: SQL_USER) |
| -pw, --password | Password (env: SQL_PASSWORD) |
| -d, --schema-dir | Directory containing schema files |
| -v, --version | Target schema version |
| --dryrun | Preview changes without applying |
Schema Directories
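The layout below is a sketch of how the schema directory is typically organized; exact paths may differ between releases:

```
schema/
├── cassandra/
│   ├── cadence/versioned/        # main keyspace, one subdirectory per version
│   └── visibility/versioned/
├── mysql/
│   └── v57/
│       ├── cadence/versioned/
│       └── visibility/versioned/
└── postgres/
    ├── cadence/versioned/
    └── visibility/versioned/
```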
Schema files are organized by database and version.
Development Workflows
Local Development Setup
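A one-shot local Cassandra setup might look like this sketch (endpoint, keyspace, and paths are example values; RF=1 is fine for a single local node):

```shell
# Create, initialize, and apply schema in one pass
cadence-cassandra-tool --ep 127.0.0.1 create -k cadence --rf 1
cadence-cassandra-tool --ep 127.0.0.1 -k cadence setup-schema -v 0.0
cadence-cassandra-tool --ep 127.0.0.1 -k cadence update-schema \
  -d ./schema/cassandra/cadence/versioned
```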
Quick setup for local development.
Testing Schema Changes
Before applying in production:

- Dry Run: Test the migration
- Backup: Create database backup
- Stage Testing: Apply to staging environment
- Validation: Verify schema version and functionality
- Production: Apply during maintenance window
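The first steps of the checklist above might be scripted roughly as follows (hosts, paths, and the use of nodetool for backup are example choices; backup tooling depends on your database):

```shell
# 1. Dry run the migration against staging
cadence-cassandra-tool --ep staging-host -k cadence update-schema \
  -d ./schema/cassandra/cadence/versioned --dryrun

# 2. Back up before changing anything (Cassandra example)
nodetool snapshot cadence

# 3. Apply on staging, validate, then repeat on production in a window
cadence-cassandra-tool --ep staging-host -k cadence update-schema \
  -d ./schema/cassandra/cadence/versioned
```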
Rollback Strategy
Schema changes are forward-only. For rollback:

- Restore from backup
- Revert to previous Cadence version
- Apply matching schema version
Best Practices
Production Deployments
- Always dry run migrations first
- Backup databases before schema changes
- Schedule maintenance for major upgrades
- Test in staging with production-like data
- Monitor performance during and after migration
- Use NetworkTopologyStrategy for Cassandra (not SimpleStrategy)
- Set appropriate replication factors (RF=3 recommended)
Version Management
- Track schema versions in deployment documentation
- Match server and schema versions during upgrades
- Update visibility separately from main schema
- Version control schema customizations
Security
- Use environment variables for credentials
- Restrict tool access to database administrators
- Audit schema changes in production
- Use read-only users for verification
Troubleshooting
Connection Failures
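When the tool cannot connect, first confirm the database is reachable at all; for example (hosts and ports are examples):

```shell
# Cassandra: basic reachability and CQL access
nc -zv 127.0.0.1 9042
cqlsh 127.0.0.1 9042 -e "DESCRIBE KEYSPACES"

# MySQL / PostgreSQL reachability
nc -zv 127.0.0.1 3306
nc -zv 127.0.0.1 5432
```

If the port is reachable but the tool still fails, check credentials, TLS settings, and firewall rules.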
Version Conflicts
Problem: Schema version mismatch between the server and the database.
Solution: Check the version recorded in the schema version table against the version the server expects, then apply the matching schema version.
Failed Migrations
Problem: Migration failed mid-execution
Solution:

- Check database logs for specific error
- Verify schema version table state
- Manually fix corrupted state if needed
- Restore from backup if necessary
- Retry migration
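Verifying the schema version table state might look like the following; the schema_version table name and columns are assumptions about the tools' bookkeeping schema:

```shell
# Cassandra: inspect the recorded schema version
cqlsh 127.0.0.1 -e \
  "SELECT keyspace_name, curr_version FROM cadence.schema_version;"

# MySQL: same idea for SQL backends
mysql -u cadence -p cadence -e "SELECT * FROM schema_version;"
```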
Next Steps
- Learn about Web UI for workflow monitoring
- Explore Benchmarking for performance testing
- Configure Archival for long-term storage
- Set up Dynamic Configuration for runtime tuning