This article covers the most recent findings on generative foundation forecasting models and gives a brief history of deep learning for time series.
Here is the link.
MLP architectures underpin many of the top deep learning time-series models, including NHITS, TiDE, and TSMixer. PatchTST is the main competitive Transformer, though it may not support exogenous (external) variables.
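For context, PatchTST's core idea is to split a time series into overlapping patches and treat each patch as one token for the Transformer. A minimal numpy sketch of that patching step (illustrative only, not the library's implementation; `patch_len` and `stride` values are arbitrary here):

```python
import numpy as np

def make_patches(series, patch_len=4, stride=2):
    # Slide a window of length patch_len over the series with the given
    # stride; each resulting patch plays the role of one "token".
    n = (len(series) - patch_len) // stride + 1
    return np.stack([series[i * stride : i * stride + patch_len]
                     for i in range(n)])

series = np.arange(12, dtype=float)
patches = make_patches(series)
print(patches.shape)  # (5, 4): five patches of length four
```

Each row of `patches` would then be linearly embedded before entering the attention layers.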
@Reggie True. TFT is an excellent Transformer-based approach that also offers interpretability. My personal fave is NHITS.
That said, since training and evaluating these models on toy datasets tells us little, researchers now mainly use Transformers as pretrained foundation models.