Class Signature
The LPTMModel class implements the Large Pre-trained Time Series Model (LPTM), a versatile foundation model that supports multiple tasks, including forecasting, imputation, anomaly detection, and classification.
Initialization Parameters

config: Model configuration dictionary. The model loads from the pre-trained checkpoint kage08/lptm-large2 with the provided configuration.

Configuration Parameters

The config dictionary supports the following parameters:

- Maximum patch length for the model.
Methods
finetune()
Parameters:

- Dataset used for finetuning. Use get_data_loader() to obtain the dataloader.
- Training task to perform. Options:
  - "forecasting": Standard forecasting task
  - "forecasting2": Alternative forecasting mode
  - "imputation": Fill in missing values
  - "detection": Anomaly detection
  - "classification": Time series classification
- Learning rate for training.
- Number of training epochs.
- Maximum norm for gradient clipping.
- Masking ratio for the imputation and detection tasks.
- Whether to apply quantization during training.

Returns: the trained model instance.
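To illustrate what the masking ratio means for the imputation and detection tasks, the sketch below hides a fraction of time steps at random; the model is then trained to reconstruct them. The random_mask helper is hypothetical, not part of LPTM:

```python
import numpy as np

def random_mask(series: np.ndarray, mask_ratio: float, seed: int = 0) -> np.ndarray:
    """Return a boolean mask with ~mask_ratio of entries set to True (masked)."""
    rng = np.random.default_rng(seed)
    n_masked = int(round(mask_ratio * series.size))
    flat = np.zeros(series.size, dtype=bool)
    # Choose n_masked distinct positions to hide.
    flat[rng.choice(series.size, size=n_masked, replace=False)] = True
    return flat.reshape(series.shape)

series = np.arange(100, dtype=float)
mask = random_mask(series, mask_ratio=0.3)
masked = np.where(mask, np.nan, series)  # masked positions become NaN inputs
```

A mask_ratio of 0.3 hides 30% of the points; higher ratios make the reconstruction task harder.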
evaluate()
Parameters:

- Dataset for evaluation. Use get_data_loader() to obtain the dataloader.
- Evaluation task. Options:
  - "forecasting": Standard forecasting
  - "forecasting2": Alternative forecasting mode
  - "imputation": Imputation evaluation
  - "detection": Anomaly detection evaluation
  - "classification": Classification evaluation
- metric_only: If True, return only the computed metrics.

Returns:

For forecasting tasks (when metric_only=True), a dictionary containing metrics:

- mse: Mean Squared Error
- mae: Mean Absolute Error
- mase: Mean Absolute Scaled Error
- mape: Mean Absolute Percentage Error
- rmse: Root Mean Squared Error
- nrmse: Normalized RMSE
- smape: Symmetric Mean Absolute Percentage Error
- msis: Mean Scaled Interval Score
- nd: Normalized Deviation

When metric_only=False (forecasting), a tuple of (metrics, trues, preds, histories):

- metrics: Dictionary of metrics (as above)
- trues: Ground-truth values, shape (num_samples, num_ts, horizon_len)
- preds: Predictions, shape (num_samples, num_ts, horizon_len)
- histories: Historical context, shape (num_samples, num_ts, context_len)
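As a standalone illustration of a few of the returned metrics (not LPTM's own implementation), the sketch below computes them with numpy on arrays following the documented (num_samples, num_ts, horizon_len) shape:

```python
import numpy as np

def forecast_metrics(trues: np.ndarray, preds: np.ndarray) -> dict:
    """Compute a subset of the evaluation metrics on matching-shape arrays."""
    err = preds - trues
    mse = float(np.mean(err ** 2))
    return {
        "mse": mse,
        "mae": float(np.mean(np.abs(err))),
        "rmse": float(np.sqrt(mse)),
        # MAPE assumes no zero ground-truth values.
        "mape": float(np.mean(np.abs(err) / np.abs(trues))),
        "smape": float(np.mean(2 * np.abs(err) / (np.abs(trues) + np.abs(preds)))),
    }

trues = np.array([[[1.0, 2.0, 3.0, 4.0]]])  # (1 sample, 1 series, horizon 4)
preds = np.array([[[1.5, 2.0, 2.5, 5.0]]])
metrics = forecast_metrics(trues, preds)
# metrics["mse"] == 0.375, metrics["mae"] == 0.5
```

The scaled metrics (mase, msis, nd, nrmse) additionally require the historical context or data scale and are omitted here for brevity.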
plot()
Parameters:

- Dataset for plotting.
- Task to visualize (currently only "forecasting" is supported).
- Additional keyword arguments forwarded to the visualization.
quantize()
Parameters:

- Quantization type to apply.
- Device on which to perform quantization.

Returns: the quantized model instance.
Usage Example
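The example body appears to be missing; the following untested sketch is assembled from the methods documented above. Import paths, argument names, and dataset construction are assumptions and should be checked against the library:

```python
# Illustrative sketch only: module paths and argument names are assumptions.
from lptm import LPTMModel, get_data_loader

config = {...}  # model configuration dictionary (see Configuration Parameters)
model = LPTMModel(config)  # loads the kage08/lptm-large2 checkpoint

train_ds = get_data_loader(...)  # dataset for finetuning
model = model.finetune(train_ds, "forecasting")

eval_ds = get_data_loader(...)  # dataset for evaluation
metrics = model.evaluate(eval_ds, "forecasting", metric_only=True)
print(metrics["mse"], metrics["mae"])

model.plot(eval_ds, "forecasting")  # visualize forecasts
```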
Notes
- LPTM supports multiple tasks: forecasting, imputation, detection, and classification
- The model automatically loads from the kage08/lptm-large2 checkpoint
- Mixed precision training is used automatically with gradient scaling
- OneCycleLR scheduler is applied during training