Architecture
The streaming architecture follows a producer-consumer pattern.

Core Components
Producers

AveniECA provides two producer types for different use cases:

Stream Producer - For continuous, time-based streaming:
- Automatically syncs data at a fixed interval (sync_rate)
- Handles timing logic internally
- Best for sensors, telemetry, or periodic updates
- See Stream Producer for details

Event Producer - For on-demand, event-driven publishing:
- Manual control over when data is sent
- Publish individual signals on-demand
- Best for triggered events, user actions, or irregular updates
- See Event Producer for details
Consumer
The Consumer class receives messages from Kafka topics:
- Processes messages using custom handler functions
- Supports both continuous and one-time consumption
- Provides utilities for state data conversion
- See Consumer for details
When to Use Stream vs Event Producers
Use Stream Producer when:
- You need periodic, time-based updates
- You are sampling sensor data at regular intervals
- You want the library to handle timing automatically
- Your data source provides continuous readings
Use Event Producer when:
- Updates are triggered by external events
- You control the timing yourself
- You are publishing individual, discrete events
- You are integrating with event-driven systems
Signal Data Structure
Both producers work with the Signal dataclass:
- state (required): List of float values representing the state vector
- valence (optional): Float value for valence scoring
- score (optional): Integer score value
- emb_inp (optional): Embedding input identifier
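The fields above can be pictured as a plain Python dataclass. This is a stand-alone sketch based only on the field list in this section, not AveniECA's actual definition; see the producer pages for the real class.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Signal:
    state: List[float]               # required state vector
    valence: Optional[float] = None  # optional valence score
    score: Optional[int] = None      # optional integer score
    emb_inp: Optional[str] = None    # optional embedding input identifier


# Only `state` is required; the optional fields default to None.
sig = Signal(state=[0.1, 0.2, 0.3], valence=0.5)
```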
Configuration
All streaming components require a Broker configuration object:
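As an illustration only, a broker configuration for a Kafka-backed setup might carry connection details like the following. The field names here are guesses (the real Broker object's fields may differ; see the Configuration page), though 9092 is Kafka's conventional default port.

```python
from dataclasses import dataclass


@dataclass
class Broker:
    # Illustrative stand-in; the real Broker's fields may differ.
    host: str
    port: int
    topic: str


broker = Broker(host="localhost", port=9092, topic="signals")
```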
Quick Start Examples
Stream continuous data
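The Stream Producer handles this loop internally; the stand-alone sketch below (no AveniECA imports, all names hypothetical) only illustrates the fixed-interval pattern that sync_rate implies: read the current state, publish it, then wait one interval.

```python
import time
from typing import Callable, List


def stream(read_state: Callable[[], List[float]],
           publish: Callable[[List[float]], None],
           sync_rate: float,
           iterations: int) -> None:
    """Publish the current state vector every sync_rate seconds."""
    for _ in range(iterations):
        publish(read_state())
        time.sleep(sync_rate)


sent: list = []
stream(read_state=lambda: [0.1, 0.2, 0.3],  # stands in for a sensor read
       publish=sent.append,
       sync_rate=0.01,
       iterations=3)
```

In the real library you would not write this loop yourself; you would pass sync_rate to the Stream Producer and let it manage the timing.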
Publish a single event
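In contrast to the timed loop, the Event Producer publishes exactly when your code decides to. The sketch below (names hypothetical, no AveniECA imports) shows the on-demand shape: a trigger in your code calls publish once with a single state vector.

```python
from typing import Callable, List


def publish_on_trigger(publish: Callable[[List[float]], None],
                       state: List[float]) -> None:
    # Event-driven: the caller controls exactly when this runs,
    # e.g. from a button press or an incoming webhook.
    publish(state)


sent: list = []
publish_on_trigger(sent.append, [1.0, 0.0, 0.5])
```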
Consume messages
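The Consumer invokes a custom handler per message; the stand-alone sketch below (no AveniECA or Kafka imports) shows what such a handler might look like, assuming a JSON payload with a "state" key. The payload format is an assumption for illustration; see the Consumer page for the library's actual message shape and state-conversion utilities.

```python
import json
from typing import List


def handle(message: bytes) -> List[float]:
    # Custom handler: decode a message payload into a state vector.
    payload = json.loads(message)
    return [float(x) for x in payload["state"]]


# Stand-ins for messages received from a Kafka topic.
messages = [b'{"state": [0.1, 0.2]}', b'{"state": [0.3, 0.4]}']
states = [handle(m) for m in messages]
```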
Next Steps
- Stream Producer - Continuous streaming with sync_rate
- Event Producer - Event-driven publishing
- Consumer - Consuming and processing messages
- Configuration - Broker configuration options