Overview
TimesFM 2.5 is the latest version of Google’s time-series foundation model, featuring enhanced architecture, better zero-shot performance, and improved fine-tuning capabilities.
Repository: `google/timesfm-2.5-200m-pytorch`
Configuration
TimesFM 2.5 configuration is loaded from a JSON file:
```python
from samay.utils import load_args

arg_path = "config/timesfm_2p5.json"
args = load_args(arg_path)
```
Example configuration:
```json
{
  "repo": "google/timesfm-2.5-200m-pytorch",
  "context_len": 512,
  "horizon_len": 96,
  "backend": "gpu",
  "per_core_batch_size": 32
}
```
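`load_args` presumably does little more than read this JSON file into a keyword-argument dict. A minimal stand-alone equivalent (an illustrative sketch, assuming a flat JSON object like the one above — not the library's actual implementation) might look like:

```python
import json

def load_args(arg_path: str) -> dict:
    # Read a flat JSON config file into a dict suitable for **kwargs expansion.
    with open(arg_path) as f:
        return json.load(f)

# args = load_args("config/timesfm_2p5.json")
# model = TimesFM_2p5_Model(**args)
```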
Loading the Model
```python
from samay.model import TimesFM_2p5_Model
from samay.utils import load_args

arg_path = "config/timesfm_2p5.json"
args = load_args(arg_path)
model = TimesFM_2p5_Model(**args)
```
The model automatically compiles on first load for optimized performance.
Loading Dataset
For Training
```python
from samay.dataset import TimesFM_2p5_Dataset

train_dataset = TimesFM_2p5_Dataset(
    name="ETTh1",
    freq="H",  # Frequency: 'H' (hourly), 'D' (daily), etc.
    datetime_col="date",
    path="./data/ETTh1.csv",
    mode="train",
    batch_size=32,
    context_len=512,
    horizon_len=96,
    task_name="finetune",
)
```
For Evaluation
```python
val_dataset = TimesFM_2p5_Dataset(
    name="ETTh1",
    freq="H",
    datetime_col="date",
    path="./data/ETTh1.csv",
    mode="test",
    batch_size=128,
    context_len=512,
    horizon_len=96,
)
```
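Conceptually, a dataset like this slices each series into (context, horizon) pairs governed by `context_len` and `horizon_len`. The following is an illustrative sketch of that windowing with NumPy (the `make_windows` helper is hypothetical, not part of samay):

```python
import numpy as np

def make_windows(series: np.ndarray, context_len: int, horizon_len: int, stride: int = 1):
    # Slice a 1-D series into (context, horizon) pairs -- the shape of
    # sample a TimesFM-style dataset yields per channel.
    contexts, horizons = [], []
    total = context_len + horizon_len
    for start in range(0, len(series) - total + 1, stride):
        contexts.append(series[start : start + context_len])
        horizons.append(series[start + context_len : start + total])
    return np.stack(contexts), np.stack(horizons)

x, y = make_windows(np.arange(700, dtype=float), context_len=512, horizon_len=96)
print(x.shape, y.shape)  # (93, 512) (93, 96)
```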
Zero-Shot Forecasting
```python
metric, trues, preds, histories, quantiles = model.evaluate(val_dataset)
print(metric)
# {'mse': ..., 'mae': ..., 'mase': ..., 'rmse': ..., 'crps': ...}
```
TimesFM 2.5 returns quantile forecasts for uncertainty estimation.
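The point metrics in the dict above follow their standard definitions. As a hedged sketch (the library may aggregate per-series rather than over all elements), with `trues` and `preds` as same-shaped NumPy arrays:

```python
import numpy as np

def point_metrics(trues: np.ndarray, preds: np.ndarray) -> dict:
    # Standard error metrics averaged over all elements; treat as illustrative,
    # since the library's exact aggregation may differ.
    err = preds - trues
    mse = float(np.mean(err ** 2))
    return {"mse": mse, "mae": float(np.mean(np.abs(err))), "rmse": mse ** 0.5}

print(point_metrics(np.array([1.0, 2.0, 3.0]), np.array([1.0, 2.0, 5.0])))
```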
Fine-tuning
```python
model.finetune(train_dataset)

# Evaluate after fine-tuning
metric, trues, preds, histories, quantiles = model.evaluate(val_dataset)
```
Visualization
```python
from samay.utils import visualize

visualize(
    trues=trues,
    preds=preds,
    history=histories,
    context_len=512,
    quantiles=quantiles,
)
```
Improvements over TimesFM 1.0
- Better Zero-Shot Performance: Improved accuracy on diverse datasets
- Enhanced Architecture: More efficient transformer design
- Faster Inference: Optimized for production deployment
- Better Fine-tuning: More stable and effective adaptation
- Improved Quantiles: Better uncertainty quantification
Quantile Forecasting
TimesFM 2.5 provides probabilistic forecasts:
```python
metric, trues, preds, histories, quantiles = model.evaluate(val_dataset)
# quantiles shape: (num_quantiles, batch_size, num_channels, horizon_len)

# Visualize with uncertainty bands
visualize(
    trues=trues,
    preds=preds,
    history=histories,
    quantiles=quantiles,
    context_len=512,
)
```
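Given the quantile array shape above, a prediction band can be pulled out with plain indexing. This sketch assumes evenly spaced quantile levels 0.1–0.9 (check the levels TimesFM 2.5 actually emits against the model config) and uses a toy sorted array in place of real model output:

```python
import numpy as np

quantile_levels = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]  # assumed levels
# Toy stand-in with shape (num_quantiles, batch_size, num_channels, horizon_len);
# sorting along axis 0 makes the quantiles monotone, as real output would be.
quantiles = np.sort(np.random.default_rng(0).normal(size=(9, 32, 7, 96)), axis=0)

lo = quantiles[quantile_levels.index(0.1)]      # lower edge of the 80% band
hi = quantiles[quantile_levels.index(0.9)]      # upper edge of the 80% band
median = quantiles[quantile_levels.index(0.5)]  # point forecast candidate
print(lo.shape)  # (32, 7, 96)
```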
Frequency Support
TimesFM 2.5 supports various frequencies:
- `'S'`: Second
- `'T'` or `'min'`: Minute
- `'H'`: Hour
- `'D'`: Day
- `'W'`: Week
- `'M'`: Month
- `'Q'`: Quarter
- `'Y'`: Year
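These aliases follow the pandas offset-alias convention. A small lookup for normalizing user input before constructing a dataset (illustrative only; `normalize_freq` is a hypothetical helper, not part of samay):

```python
# Map pandas-style frequency aliases (as listed above) to a canonical form.
FREQ_ALIASES = {
    "S": "S", "T": "T", "min": "T", "H": "H", "D": "D",
    "W": "W", "M": "M", "Q": "Q", "Y": "Y",
}

def normalize_freq(freq: str) -> str:
    # Accept either 'T' or 'min' for minutely data, etc.
    try:
        return FREQ_ALIASES[freq]
    except KeyError:
        raise ValueError(f"Unsupported frequency: {freq!r}")

print(normalize_freq("min"))  # 'T'
```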
Model Specifications
- Model Size: 200M parameters
- Context Length: Up to 512 timesteps
- Forecast Horizon: Flexible (commonly 96-720 timesteps)
- Inference Speed: ~2-3x faster than TimesFM 1.0
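Because the context window caps at 512 timesteps, longer histories should be truncated to the most recent points before inference. A minimal sketch (the `clip_context` helper is hypothetical):

```python
import numpy as np

def clip_context(series: np.ndarray, context_len: int = 512) -> np.ndarray:
    # Keep only the most recent `context_len` points (TimesFM 2.5's cap).
    return series[-context_len:] if len(series) > context_len else series

history = np.arange(1000.0)
ctx = clip_context(history)
print(ctx.shape, ctx[0])  # (512,) 488.0
```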
When to Use TimesFM 2.5
- When you need the latest improvements from Google Research
- For production deployments requiring fast inference
- When uncertainty quantification is important
- For diverse time series with different frequencies
Example Notebook
For a complete working example, see: