
Overview

MOIRAI is a universal time series forecasting model that handles multiple frequencies and scales through a multi-patch architecture. It supports both zero-shot forecasting and fine-tuning.

Paper: Unified Training of Universal Time Series Forecasting Transformers

Configuration Parameters

config = {
    "context_len": 128,      # Length of input context
    "horizon_len": 64,       # Forecast horizon
    "num_layers": 100,       # Number of layers (model-dependent)
    "model_type": "moirai2", # "moirai", "moirai-moe", or "moirai2"
    "model_size": "small"    # "small", "base", or "large"
}

Loading the Model

from samay.model import MoiraiTSModel

repo = "Salesforce/moirai-2.0-R-small"
model = MoiraiTSModel(repo=repo, config=config)

Loading Dataset

from samay.dataset import MoiraiDataset

train_dataset = MoiraiDataset(
    name="ett",
    mode="train",
    path="./data/ETTh1.csv",
    datetime_col="date",
    freq="h",  # Frequency: 'h' (hourly), 'D' (daily), etc.
    context_len=128,
    horizon_len=64
)

test_dataset = MoiraiDataset(
    name="ett",
    mode="test",
    path="./data/ETTh1.csv",
    datetime_col="date",
    freq="h",
    context_len=128,
    horizon_len=64
)

MOIRAI requires the frequency (freq) parameter, which tells the model the temporal granularity of your data.
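If you are unsure of your data's frequency, pandas can often infer it from the timestamp column. A minimal sketch (a synthetic hourly series stands in for real data; with ETTh1 you would load the CSV instead):

```python
import pandas as pd

# A small hourly series stands in for real data; with ETTh1 you would
# load it instead: df = pd.read_csv("./data/ETTh1.csv", parse_dates=["date"])
df = pd.DataFrame({"date": pd.date_range("2016-07-01", periods=48, freq="h")})

# pd.infer_freq returns a pandas offset alias such as "h" or "D", or None
freq = pd.infer_freq(df["date"])
print(freq)
```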

Zero-Shot Forecasting

metrics, trues, preds, histories = model.evaluate(
    test_dataset,
    metrics=["MSE", "MASE"]
)
print(metrics)
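As a sanity check, a metric like MSE can be recomputed directly from the returned arrays. A sketch with hypothetical stand-ins for the `evaluate()` outputs (the `(num_windows, horizon_len, num_channels)` shape is an assumption):

```python
import numpy as np

# Hypothetical stand-ins for the evaluate() outputs, with an assumed
# shape of (num_windows, horizon_len, num_channels)
trues = np.zeros((4, 64, 7))
preds = np.full((4, 64, 7), 0.5)

# Mean squared error over every forecasted point
mse = float(np.mean((trues - preds) ** 2))
print(mse)  # 0.25
```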

Fine-tuning

MOIRAI fine-tuning requires additional configuration. See the preparation steps below.

from samay.utils import read_yaml, prep_finetune_config

# Prepare fine-tuning configuration
data_config_path = "path/to/finetune/config.yaml"
ft_kwargs = prep_finetune_config(data_config_path)
ft_kwargs['max_epochs'] = 5

# torch_config supplies the dataloader kwargs used below; it is assumed
# to come from a YAML file read with read_yaml (adjust the path)
torch_config = read_yaml("path/to/torch/config.yaml")

# Get patch size from model
if config["model_type"] == "moirai2":
    patch_size = model.model.module.patch_size
else:
    patch_size = model.model.module.in_proj.in_features_ls[0]

# Create training dataset with patch size
train_dataset = MoiraiDataset(
    name="ett",
    mode="train",
    path="./data/ETTh1.csv",
    datetime_col="date",
    freq="h",
    context_len=128,
    horizon_len=64,
    patch_size=patch_size,
    kwargs=torch_config["train_dataloader"]
)

# Fine-tune
model.finetune(train_dataset, **ft_kwargs)
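After fine-tuning, you can re-run the same `model.evaluate(...)` call and compare against the zero-shot numbers. A minimal sketch of that comparison (the metric values here are hypothetical placeholders, not real results):

```python
# Hypothetical metric dicts standing in for two model.evaluate(...) results
zero_shot = {"MSE": 0.412, "MASE": 0.981}
fine_tuned = {"MSE": 0.376, "MASE": 0.905}

# Report the change in each metric after fine-tuning
for name in zero_shot:
    delta = fine_tuned[name] - zero_shot[name]
    print(f"{name}: {zero_shot[name]:.3f} -> {fine_tuned[name]:.3f} ({delta:+.3f})")
```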

Visualization

from samay.utils import visualize

visualize(
    context_len=config["context_len"],
    trues=trues,
    preds=preds,
    history=histories,
    dataset="ETTh1",
    freq="h"
)

Model Variants

Model        Description                                  Parameters
MOIRAI 2.0   Latest version with improved architecture    Small/Base/Large
MOIRAI MoE   Mixture of Experts variant                   Small/Base/Large
MOIRAI 1.0   Original version                             Small/Base/Large
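The Hugging Face repo id used earlier (Salesforce/moirai-2.0-R-small) suggests a consistent naming pattern across variants. A small helper built on that assumed convention (verify the exact ids against the Salesforce organization on Hugging Face before use, since not every size may be published for every variant):

```python
def moirai_repo(model_type: str, size: str) -> str:
    """Build a Hugging Face repo id from the assumed naming convention."""
    prefix = {
        "moirai": "moirai-1.0-R",
        "moirai-moe": "moirai-moe-1.0-R",
        "moirai2": "moirai-2.0-R",
    }[model_type]
    return f"Salesforce/{prefix}-{size}"

print(moirai_repo("moirai2", "small"))  # Salesforce/moirai-2.0-R-small
```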

Key Features

  • Multi-frequency Support: Handles different temporal granularities
  • Multi-patch Architecture: Adapts to different input/output scales
  • Probabilistic Forecasting: Provides prediction distributions
  • Scalable: Available in multiple model sizes
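Because the model is probabilistic, a forecast can be summarized as a point estimate plus an uncertainty interval. A sketch of that post-processing with random samples standing in for real model output (the `(num_samples, horizon_len)` layout is an assumption):

```python
import numpy as np

# Random draws stand in for a model's probabilistic forecast samples:
# (num_samples, horizon_len)
rng = np.random.default_rng(0)
samples = rng.normal(loc=10.0, scale=2.0, size=(500, 64))

# Point forecast (median) and an 80% interval at each horizon step
median = np.median(samples, axis=0)
lo, hi = np.quantile(samples, [0.1, 0.9], axis=0)
print(median.shape, lo.shape, hi.shape)
```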

Example Notebook

For a complete working example, see:
