Available Metrics
The metrics module (samay.metric) provides two main categories of evaluation metrics:
Forecasting Metrics
Point-based metrics for evaluating deterministic forecasts:
- MSE - Mean Squared Error
- MAE - Mean Absolute Error
- RMSE - Root Mean Squared Error
- MAPE - Mean Absolute Percentage Error
- SMAPE - Symmetric Mean Absolute Percentage Error
- NRMSE - Normalized Root Mean Squared Error
- ND - Normalized Deviation
- MASE - Mean Absolute Scaled Error
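The core point metrics can be sketched in a few lines of NumPy. These are illustrative reference implementations of the definitions above, not samay.metric's own code, which may differ in signatures and edge-case handling (e.g. zero targets in MAPE):

```python
import numpy as np

def mse(y, yhat):
    # Mean Squared Error: squares residuals, so large errors dominate
    return np.mean((y - yhat) ** 2)

def mae(y, yhat):
    # Mean Absolute Error: every unit of error counts equally
    return np.mean(np.abs(y - yhat))

def rmse(y, yhat):
    # Root Mean Squared Error: MSE back on the original data scale
    return np.sqrt(mse(y, yhat))

def smape(y, yhat):
    # Symmetric MAPE in percent: bounded and symmetric in y and yhat
    return 100.0 * np.mean(2.0 * np.abs(yhat - y) / (np.abs(y) + np.abs(yhat)))

y = np.array([10.0, 12.0, 14.0])
yhat = np.array([11.0, 11.0, 15.0])
print(mse(y, yhat), mae(y, yhat), rmse(y, yhat))  # 1.0 1.0 1.0
```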
Probabilistic Metrics
Metrics for evaluating probabilistic forecasts with quantile predictions:
- CRPS - Continuous Ranked Probability Score
- MWSQ - Mean Weighted Squared Quantile Loss
- MSIS - Mean Scaled Interval Score
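To make the quantile-based metrics concrete, here is a standalone sketch of CRPS estimated from a finite set of quantile forecasts, using the standard identity that CRPS equals twice the pinball (quantile) loss averaged over quantile levels. It illustrates the idea behind the metrics above rather than samay.metric's exact API:

```python
import numpy as np

def pinball_loss(y, q_pred, q):
    # Pinball loss at level q: asymmetric absolute error that is
    # minimized by the true q-quantile
    diff = y - q_pred
    return np.mean(np.maximum(q * diff, (q - 1.0) * diff))

def crps_from_quantiles(y, q_preds, q_levels):
    # q_preds: shape (n_obs, n_quantiles), one column per level
    losses = [pinball_loss(y, q_preds[:, i], q) for i, q in enumerate(q_levels)]
    return 2.0 * np.mean(losses)

y = np.array([1.0, 2.0])
q_levels = [0.1, 0.5, 0.9]
# A perfectly sharp forecast concentrated on the true values scores 0
q_preds = np.column_stack([y, y, y])
print(crps_from_quantiles(y, q_preds, q_levels))  # 0.0
```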
When to Use Each Metric Type
Forecasting Metrics
Use these when evaluating point forecasts (single predicted value per timestep):
- MSE/RMSE - When you want to heavily penalize large errors
- MAE - When you want every unit of error weighted equally, without extra penalty for outliers
- MAPE/SMAPE - When you need scale-independent percentage-based metrics
- NRMSE - When you need normalized errors for comparison across datasets
- ND - When you want absolute errors normalized by the magnitude of the target values
- MASE - When you want to compare forecast accuracy against a baseline naive forecast
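The MASE comparison against a naive baseline can be sketched as follows: the forecast's MAE is divided by the in-sample MAE of a one-step naive forecast (predict the previous value), so values below 1 mean the model beats the naive baseline. This is a standalone illustration; the signature is an assumption, not samay.metric's:

```python
import numpy as np

def mase(y_train, y_test, yhat):
    # Scale: MAE of the one-step naive forecast on the training series
    naive_mae = np.mean(np.abs(np.diff(y_train)))
    return np.mean(np.abs(y_test - yhat)) / naive_mae

y_train = np.array([1.0, 2.0, 3.0, 4.0])  # naive MAE on train = 1.0
y_test = np.array([5.0, 6.0])
yhat = np.array([5.5, 6.5])               # forecast MAE = 0.5
print(mase(y_train, y_test, yhat))  # 0.5 -> half the naive error
```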
Probabilistic Metrics
Use these when evaluating probabilistic forecasts (multiple quantile predictions):
- CRPS - Comprehensive evaluation of the full predictive distribution
- MWSQ - Squared quantile loss for probabilistic forecasts
- MSIS - Interval-based scoring that penalizes observations falling outside the prediction interval
Usage Example
Import
All metrics can be imported from the samay.metric module:
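A minimal import sketch; the function names below are assumptions taken from the metric lists on this page, so verify them against your installed version of samay:

```python
# Assumed metric names, derived from the lists above; check your
# installed samay version for the exact exported names.
from samay.metric import mse, mae, rmse, smape, mase, crps
```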