Overview
Analytics in Chronoverse offer insights into:

- User-level metrics: Total workflows, jobs, logs, and execution time across all workflows
- Workflow-level metrics: Per-workflow job counts, log volumes, and execution durations
- Real-time aggregation: Metrics updated as events occur through Kafka stream processing
- Efficient storage: ClickHouse provides fast aggregation queries even with millions of jobs
Analytics are computed asynchronously by the Analytics Processor, which consumes workflow, job, and log events from Kafka topics.
User Analytics
Get aggregate statistics for all workflows owned by a user.

Endpoint
Get User Analytics
Response Format
User Analytics
Metrics Explained
| Metric | Type | Description |
|---|---|---|
| total_workflows | integer | Count of all workflows ever created (active, terminated, deleted) |
| total_jobs | integer | Total jobs executed across all workflows |
| total_joblogs | integer | Total log entries generated across all jobs |
| total_job_execution_duration | integer | Sum of all job execution times in seconds |
The total_workflows metric counts distinct workflows, including:
- Active workflows (not terminated)
- Terminated workflows
- Deleted workflows (soft-deleted)
ClickHouse Query
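The production query is not reproduced here; the following is a plausible shape for the user-level aggregation, built as a parameterized string. The table name `workflow_analytics` and the aggregation details are assumptions based on the metrics described above.

```python
def build_user_analytics_query(table: str = "workflow_analytics") -> str:
    """Build a user-level aggregation query (illustrative sketch;
    the real table and column names may differ)."""
    return f"""
        SELECT
            countDistinct(workflow_id)        AS total_workflows,
            sum(total_jobs)                   AS total_jobs,
            sum(total_joblogs)                AS total_joblogs,
            sum(total_job_execution_duration) AS total_job_execution_duration
        FROM {table}
        WHERE user_id = %(user_id)s
    """

query = build_user_analytics_query()
```

Using `countDistinct` on `workflow_id` matches the "count of all workflows ever created" semantics, since soft-deleted and terminated workflows keep their analytics rows.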
Use Cases
Resource Planning
Understand total compute usage to plan infrastructure capacity
Cost Tracking
Calculate execution costs based on total runtime and resource usage
Usage Trends
Track growth in workflows and job execution over time
Log Volume
Monitor log storage requirements based on total log entries
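As an illustration of the cost-tracking use case, total execution time converts directly into a cost estimate. The hourly rate here is of course an assumption:

```python
def estimate_cost(total_job_execution_duration: int, rate_per_hour: float) -> float:
    """Estimate execution cost from total runtime in seconds."""
    hours = total_job_execution_duration / 3600
    return round(hours * rate_per_hour, 2)

# 7200 seconds of total execution at $0.50/hour
print(estimate_cost(7200, 0.50))  # -> 1.0
```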
Workflow Analytics
Get detailed metrics for a specific workflow.

Endpoint
Get Workflow Analytics
Response Format
Workflow Analytics
Metrics Explained
| Metric | Type | Description |
|---|---|---|
| workflow_id | string | UUID of the workflow |
| total_jobs | integer | Total jobs executed by this workflow |
| total_joblogs | integer | Total log entries from all jobs |
| total_job_execution_duration | integer | Sum of execution times in seconds |
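From these raw totals, per-job averages and ratios can be derived client-side. A sketch, assuming a record with the fields above:

```python
def derive_metrics(analytics: dict) -> dict:
    """Compute derived metrics from a workflow analytics record."""
    jobs = analytics["total_jobs"]
    if jobs == 0:
        # Avoid division by zero for brand-new workflows.
        return {"avg_execution_seconds": 0.0, "logs_per_job": 0.0}
    return {
        "avg_execution_seconds": analytics["total_job_execution_duration"] / jobs,
        "logs_per_job": analytics["total_joblogs"] / jobs,
    }

stats = derive_metrics({
    "total_jobs": 40,
    "total_joblogs": 200,
    "total_job_execution_duration": 3600,
})
print(stats)  # -> {'avg_execution_seconds': 90.0, 'logs_per_job': 5.0}
```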
Query Implementation
ClickHouse Query
Each workflow has a single analytics record that’s continuously updated as jobs execute and generate logs.
Derived Metrics
You can calculate additional insights from workflow analytics, such as average execution time per job or logs per job.

Analytics Processing Pipeline
Analytics are computed asynchronously through event stream processing.

Architecture
Event Generation
Services publish events to Kafka topics:
- Workflow events (create, update, delete)
- Job events (start, complete, fail)
- Log events (log entries created)
Event Processing
The processor handles workflow events (created, deleted), job events, and log events, updating the corresponding aggregate counters as each event arrives.
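To make the flow concrete, here is a minimal in-memory sketch of how the processor might fold events into per-workflow aggregates. The event type and field names are assumptions; the real processor consumes from Kafka and persists to ClickHouse rather than a dict.

```python
from collections import defaultdict

def make_record():
    return {"total_jobs": 0, "total_joblogs": 0, "total_job_execution_duration": 0}

analytics = defaultdict(make_record)

def handle_event(event: dict) -> None:
    """Fold a single event into the aggregate for its workflow."""
    record = analytics[event["workflow_id"]]
    if event["type"] == "job.completed":
        record["total_jobs"] += 1
        record["total_job_execution_duration"] += event["duration_seconds"]
    elif event["type"] == "log.created":
        record["total_joblogs"] += 1

for event in [
    {"type": "job.completed", "workflow_id": "wf-1", "duration_seconds": 30},
    {"type": "log.created", "workflow_id": "wf-1"},
    {"type": "log.created", "workflow_id": "wf-1"},
]:
    handle_event(event)

print(analytics["wf-1"])
# -> {'total_jobs': 1, 'total_joblogs': 2, 'total_job_execution_duration': 30}
```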
Analytics Table Schema
The ClickHouse analytics table stores aggregated data.

Table Structure
Indexing Strategy
- Primary Key: (user_id, workflow_id) for efficient user and workflow queries
- Order By: Same as primary key for optimal compression
- Engine: MergeTree for fast aggregations and updates
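Combining the indexing notes above, a plausible table definition might look like the following. Only the documented metric columns and the (user_id, workflow_id) key are taken from this page; the column types are assumptions.

```python
# Illustrative DDL sketch held as a string; run it with any ClickHouse client.
ANALYTICS_TABLE_DDL = """
CREATE TABLE IF NOT EXISTS workflow_analytics (
    user_id                      UUID,
    workflow_id                  UUID,
    total_jobs                   UInt64,
    total_joblogs                UInt64,
    total_job_execution_duration UInt64
)
ENGINE = MergeTree
ORDER BY (user_id, workflow_id)
"""
```

In ClickHouse, when no separate PRIMARY KEY clause is given, the ORDER BY key also serves as the primary key, which matches the indexing strategy described above.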
Error Handling
Not Found Errors
If no analytics exist for a user or workflow, a not-found error is returned:

Error Response

This can happen when:
- Workflow was just created (analytics not yet processed)
- Invalid workflow ID
- User doesn’t own the workflow
Analytics records are created asynchronously. New workflows may not have analytics immediately available.
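Because a freshly created workflow may briefly return not-found, clients can retry with a short backoff. A sketch with an injected fetch function (the function names are hypothetical):

```python
import time

def get_analytics_with_retry(fetch, workflow_id, attempts=3, delay=0.5):
    """Retry a not-found analytics lookup a few times before giving up."""
    for attempt in range(attempts):
        result = fetch(workflow_id)
        if result is not None:
            return result
        if attempt < attempts - 1:
            time.sleep(delay)  # give the processor time to catch up
    return None

# Simulated backend: analytics appear on the second call.
calls = {"n": 0}
def fake_fetch(workflow_id):
    calls["n"] += 1
    return {"total_jobs": 3} if calls["n"] >= 2 else None

print(get_analytics_with_retry(fake_fetch, "wf-1", delay=0))  # -> {'total_jobs': 3}
```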
Invalid Request
Error Response

A validation error is returned for:
- Malformed UUID
- Empty user_id or workflow_id
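Client-side, malformed or empty identifiers can be rejected before making the request. A sketch using the standard library:

```python
import uuid

def is_valid_id(value: str) -> bool:
    """Accept only non-empty, well-formed UUID strings."""
    if not value:
        return False
    try:
        uuid.UUID(value)
        return True
    except ValueError:
        return False

print(is_valid_id("not-a-uuid"))                            # False
print(is_valid_id(""))                                      # False
print(is_valid_id("123e4567-e89b-12d3-a456-426614174000"))  # True
```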
Real-Time vs. Eventual Consistency

Analytics are eventually consistent: metrics update asynchronously as events flow through Kafka, so reads may briefly lag the latest job and log activity.

Consistency Examples
Performance Characteristics
Query Performance
| Operation | Typical Latency | Scale |
|---|---|---|
| Get User Analytics | Less than 10ms | Millions of workflows |
| Get Workflow Analytics | Less than 5ms | Billions of jobs |
| Analytics Update | Less than 100ms | High throughput |
Scalability
ClickHouse enables analytics at scale:

- User Analytics: Aggregates across thousands of workflows
- Workflow Analytics: Handles billions of job records
- Log Counting: Efficiently counts trillions of log entries
Best Practices
Cache on Client
Analytics change slowly—cache results for 30-60 seconds to reduce API calls
Calculate Rates
Derive metrics like average execution time, logs per job, and efficiency ratios
Expect Delays
Don’t expect instant analytics updates—allow 1-5 seconds for consistency
Monitor Trends
Track analytics over time to identify performance trends and anomalies
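The client-side caching practice above can be sketched with a small TTL cache. The fetcher and the injected clock are illustrative; in practice the fetch function would call the analytics API.

```python
import time

class TTLCache:
    """Cache analytics responses for a short TTL (e.g. 30-60 seconds)."""
    def __init__(self, fetch, ttl_seconds=30, clock=time.monotonic):
        self.fetch = fetch
        self.ttl = ttl_seconds
        self.clock = clock
        self._store = {}

    def get(self, key):
        now = self.clock()
        entry = self._store.get(key)
        if entry and now - entry[1] < self.ttl:
            return entry[0]          # fresh: serve from cache
        value = self.fetch(key)      # stale or missing: refetch
        self._store[key] = (value, now)
        return value

# Usage with a fake clock to demonstrate expiry without sleeping:
t = [0.0]
hits = {"n": 0}
def fetch(key):
    hits["n"] += 1
    return {"total_jobs": hits["n"]}

cache = TTLCache(fetch, ttl_seconds=30, clock=lambda: t[0])
cache.get("user-1"); cache.get("user-1")  # one real fetch, one cache hit
t[0] = 31.0
cache.get("user-1")                       # TTL expired -> second fetch
print(hits["n"])  # -> 2
```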
Limitations
Current Limitations:
- No time-series breakdowns (daily, weekly, monthly)
- No job status-specific counts (success vs. failure rates)
- No percentile metrics (p50, p95, p99 execution times)
- No workflow kind filtering
- No custom date ranges
Example Use Cases
Dashboard Widgets
User Overview Widget
Workflow Comparison
Compare Workflows
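A comparison widget could rank workflows by a chosen metric. A sketch, assuming each entry follows the workflow analytics response format documented above:

```python
def compare_workflows(records, metric="total_job_execution_duration"):
    """Sort workflow analytics records by a metric, highest first."""
    return sorted(records, key=lambda r: r[metric], reverse=True)

ranked = compare_workflows([
    {"workflow_id": "wf-a", "total_job_execution_duration": 120},
    {"workflow_id": "wf-b", "total_job_execution_duration": 900},
])
print([r["workflow_id"] for r in ranked])  # -> ['wf-b', 'wf-a']
```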
Next Steps
Workflow Types
Learn about HEARTBEAT and CONTAINER workflows
API Reference
Complete API documentation for analytics endpoints