# sync_indicators

The `sync_indicators` module generates visual diagnostic images that help verify and understand video synchronization results. It detects motion regions, annotates frames with bounding boxes, and creates comparison composites.
## generate_sync_indicators
Main entry point for generating all visual sync indicators. Creates annotated images showing detected motion regions and synchronization points.

### Parameters
- Directory containing the original video files
- List of video filenames being synchronized
- Dictionary mapping each filename to its computed synchronization offset in seconds
- Output directory for generated indicator images (typically the project root's `results/` directory)
- Number of peak-motion frames to capture and annotate per video
### Returns
List of file paths for all generated indicator images
## Generated Images
The function creates several types of visualization:

### 1. Per-Video Motion Peaks
For each video, the `num_peaks` frames with the highest motion activity are captured:
- Filename: `{video_name}_motion_peak_{1-5}.png`
- Content: Annotated frame with bounding boxes around detected motion regions
- Annotations:
  - Header bar with video name, frame number, and timestamp
  - Cyan bounding boxes around motion regions
  - Intensity percentage labels
  - Region count badge
### 2. Motion Timeline Strips
Timeline showing all peak motion moments as thumbnails:

- Filename: `{video_name}_motion_timeline.png`
- Content: Horizontal strip with a thumbnail of each peak frame
- Annotations: Timestamp and region count for each peak
### 3. Sync Point Comparisons
Side-by-side comparison of all videos at corresponding sync points:

- Filename: `sync_comparison_{1-5}.png`
- Content: All videos shown at the same sync-adjusted time
- Purpose: Verify that motion events align across cameras
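Sync-adjusted sampling reduces to simple offset arithmetic. A sketch, assuming the sign convention used by the offset summary chart (positive = delayed, negative = trimmed); function and variable names are illustrative:

```python
def sync_adjusted_time(t_ref: float, offset: float) -> float:
    """Map an instant on the reference timeline to the timestamp of the
    same real-world moment inside one video file.

    Assumes a positive offset means the video's content lags the
    reference, so the event occurs `offset` seconds later in that file.
    """
    return t_ref + offset

offsets = {"ref.mp4": 0.0, "delayed.mp4": 1.5, "trimmed.mp4": -0.75}
# Sample every video at the moment 10.0 s on the reference timeline:
times = {name: sync_adjusted_time(10.0, off) for name, off in offsets.items()}
# times == {"ref.mp4": 10.0, "delayed.mp4": 11.5, "trimmed.mp4": 9.25}
```

If motion events line up in the resulting composites, the offsets are correct; a consistent lead or lag in one camera points to a wrong-sign or mis-scaled offset.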
### 4. Offset Summary Chart
Bar chart visualizing computed offsets:

- Filename: `offset_summary.png`
- Content: Horizontal bar chart with color-coded offsets
  - Cyan: Reference video (zero offset)
  - Green: Delayed videos (positive offset)
  - Red: Trimmed videos (negative offset)
## Visual Styling

The indicators use consistent styling matching the project's brand.

## Motion Detection Algorithm
The module uses frame differencing to detect motion:

- Frame Differencing: Computes the absolute difference between consecutive frames
- Gaussian Blur: Reduces noise in difference image
- Thresholding: Binary threshold to isolate motion pixels
- Morphological Operations: Dilates to connect nearby regions
- Contour Detection: Finds bounding boxes around motion blobs
- Filtering: Removes regions smaller than `min_area` (default 500 pixels)
- Intensity Calculation: Computes the mean pixel difference within each region
## Usage in Flask UI

The sync indicators are automatically generated after successful synchronization (see `src/ui.py`).
## Programmatic Usage
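A hedged sketch of calling the function directly. The argument order follows the parameter descriptions above, but the exact signature is not shown in this doc, and all paths, filenames, and offset values here are illustrative:

```python
from sync_indicators import generate_sync_indicators  # project module

# Offsets as produced by the sync step: filename -> offset in seconds
offsets = {"cam_a.mp4": 0.0, "cam_b.mp4": 1.25, "cam_c.mp4": -0.40}

indicator_paths = generate_sync_indicators(
    "footage/",            # directory containing the original videos
    list(offsets.keys()),  # filenames being synchronized
    offsets,               # filename -> computed offset in seconds
    "results/",            # output directory for indicator images
    5,                     # peak-motion frames per video
)
print(indicator_paths)     # list of generated image paths
```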
## Helper Functions

The module includes several internal helper functions that power the main functionality:

- `_find_motion_regions`: Returns `(x, y, w, h, intensity)` tuples sorted by intensity
- `_draw_bounding_boxes`
- `_find_peak_motion_frames`: Returns `frame_idx`, `timestamp`, `frame`, `prev_frame`, `energy`, and `regions` for each peak
- `_create_comparison_composite`
- `_create_motion_timeline`: Writes the timeline strip to the given output path
- `_create_offset_summary`: Saves the chart to `results_dir/offset_summary.png`
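Peak selection can be sketched as a top-N search over per-frame motion energies. `pick_peak_frames` and its `min_gap` spacing rule are assumptions for illustration, not the actual `_find_peak_motion_frames` logic:

```python
def pick_peak_frames(energies, num_peaks=5, min_gap=30):
    """Pick indices of the `num_peaks` highest-energy frames, keeping
    peaks at least `min_gap` frames apart so a single burst of motion
    doesn't fill every slot."""
    # Visit frames from highest to lowest motion energy
    order = sorted(range(len(energies)), key=lambda i: energies[i],
                   reverse=True)
    peaks = []
    for i in order:
        # Reject candidates too close to an already-chosen peak
        if all(abs(i - p) >= min_gap for p in peaks):
            peaks.append(i)
        if len(peaks) == num_peaks:
            break
    return sorted(peaks)  # chronological order for the timeline strip
```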
## Performance

Indicator generation is CPU-intensive but runs only once after sync completes. Typical timing for 3 videos (5 minutes each, 1080p):

- Peak frame detection: ~15-30 seconds
- Frame annotation: ~2-5 seconds
- Composite generation: ~1-2 seconds
- Total: ~20-40 seconds
The bottleneck is video frame reading. The `step` parameter (default 2) controls frame sampling: higher values process faster but may miss brief motion events.

## Output Directory
Indicators are saved to the project-level results directory (default: `PROJECT_ROOT/results/`), which persists across sessions, unlike the temporary sync processing files.
## See Also

- `visual_sync`: Motion-based synchronization that feeds into these indicators
- Interpreting Results: Guide to understanding sync indicator images