Overview
Chronos is a family of pretrained probabilistic time series models that reuse transformer architectures from language modeling. Time series values are scaled and quantized into a fixed vocabulary of tokens, and the model is trained on these token sequences with a standard language modeling objective.
Paper: Chronos: Learning the Language of Time Series
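As a rough sketch, the tokenization step combines mean scaling with uniform binning. The snippet below is a conceptual illustration only; the bin count and value range are illustrative assumptions, not the library's exact settings.

import numpy as np

def tokenize(series, n_bins=4094, low=-15.0, high=15.0):
    # Mean scaling: normalize by the mean absolute value of the context window.
    scale = np.abs(series).mean()
    scale = scale if scale > 0 else 1.0
    scaled = series / scale
    # Uniform binning: map each scaled value to one of n_bins token ids.
    bin_edges = np.linspace(low, high, n_bins - 1)
    tokens = np.digitize(scaled, bin_edges)
    return tokens, scale

tokens, scale = tokenize(np.array([10.0, 12.0, 11.0, 13.0, 12.5]))
print(tokens)  # integer token ids; `scale` is kept to de-quantize forecasts later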
Configuration Parameters
# When loading from a pretrained repository
repo = "amazon/chronos-t5-small"  # or "chronos-t5-base", "chronos-t5-large"

# For a custom configuration (when not using a pretrained model)
config = {
    "context_length": 512,     # number of past time steps fed to the model
    "prediction_length": 64,   # forecast horizon in time steps
    "num_samples": 20,         # sample trajectories drawn per forecast
    "temperature": 1.0,        # sampling temperature for token generation
}
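The num_samples and prediction_length values control how forecasts are formed: Chronos draws num_samples future trajectories and summarizes them empirically. A minimal sketch of that summarization step, using made-up sample data rather than real model output:

import numpy as np

# Pretend the model has drawn 20 sample trajectories over a 64-step horizon.
num_samples, prediction_length = 20, 64
samples = np.random.randn(num_samples, prediction_length)

# Quantile forecasts are computed empirically across the sample dimension.
quantile_forecasts = np.quantile(samples, [0.1, 0.5, 0.9], axis=0)  # shape (3, 64)
mean_forecast = samples.mean(axis=0)                                # shape (64,)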
Loading the Model
From Pretrained

from samay.model import ChronosModel

repo = "amazon/chronos-t5-small"
model = ChronosModel(repo=repo)

From Config

from samay.model import ChronosModel
from samay.models.chronosforecasting.chronos.chronos import ChronosConfig

model = ChronosModel(config=ChronosConfig(**config))
Chronos is available in multiple sizes: chronos-t5-tiny, chronos-t5-mini, chronos-t5-small, chronos-t5-base, and chronos-t5-large.
Loading Dataset
from samay.dataset import ChronosDataset

# Training split
train_dataset = ChronosDataset(
    name="ett",
    datetime_col="date",
    path="./data/ETTh1.csv",
    mode="train",
    batch_size=8
)

# Held-out split for evaluation
val_dataset = ChronosDataset(
    name="ett",
    datetime_col="date",
    path="./data/ETTh1.csv",
    mode="test",
    batch_size=8
)
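The loader expects a CSV with a timestamp column (here "date") followed by the value columns; for ETTh1 these are the six load features plus the OT target. A quick way to inspect the file layout, assuming pandas is installed:

import pandas as pd

df = pd.read_csv("./data/ETTh1.csv", parse_dates=["date"])
print(df.columns.tolist())  # ['date', 'HUFL', 'HULL', 'MUFL', 'MULL', 'LUFL', 'LULL', 'OT']
print(df.head())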
Zero-Shot Forecasting

The pretrained model can be evaluated on the validation set directly, without any fine-tuning:

metrics = model.evaluate(
    val_dataset,
    horizon_len=64,
    quantile_levels=[0.1, 0.5, 0.9],
    metric_only=True
)
Fine-tuning
model.finetune(train_dataset)
# Evaluate the finetuned model
metrics = model.evaluate(
    val_dataset,
    horizon_len=64,
    quantile_levels=[0.1, 0.5, 0.9],
    metric_only=True
)
Fine-tuning trains the underlying T5 model on your specific dataset while preserving the probabilistic forecasting capabilities.
Visualization

Plot forecasts for the validation dataset at the requested quantile levels:

model.plot(
    val_dataset,
    horizon_len=64,
    quantile_levels=[0.1, 0.5, 0.9]
)
Probabilistic Forecasting
Chronos provides probabilistic forecasts through quantiles:
quantile_forecasts, mean_forecast = model.pipeline.predict_quantiles(
context=input_seq,
prediction_length=64,
quantile_levels=[0.1, 0.5, 0.9]
)
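A minimal sketch of plotting the returned quantiles as an uncertainty band. It reuses the model loaded above and assumes the quantile tensor is shaped [batch, prediction_length, num_quantiles], as in the upstream chronos-forecasting API; the toy context values are made up.

import torch
import matplotlib.pyplot as plt

# Toy historical context; in practice this comes from your dataset.
input_seq = torch.tensor([10.0, 12.0, 11.0, 13.0, 12.5, 14.0, 13.5, 15.0])

quantile_forecasts, mean_forecast = model.pipeline.predict_quantiles(
    context=input_seq,
    prediction_length=64,
    quantile_levels=[0.1, 0.5, 0.9]
)

q = quantile_forecasts[0].detach().cpu().numpy()  # (prediction_length, 3)
steps = range(len(input_seq), len(input_seq) + q.shape[0])

plt.plot(range(len(input_seq)), input_seq, label="history")
plt.plot(steps, q[:, 1], label="median forecast")
plt.fill_between(steps, q[:, 0], q[:, 2], alpha=0.3, label="10%-90% interval")
plt.legend()
plt.show()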
Key Features
- Language Model Architecture: Uses T5 transformer architecture
- Tokenization: Converts time series values into tokens
- Probabilistic: Provides quantile forecasts for uncertainty estimation
- Scalable: Available in multiple model sizes
Example Notebook
For a complete working example, see: