SOTA (State of the Art) in Time Series
Time Series Classification
Irregularly Sampled
- GRU-D
- IP-Nets, Interpolation-Prediction Networks for Irregularly Sampled Time Series
- Transformer
- mTAND, Multi-Time Attention Networks for Irregularly Sampled Time Series
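GRU-D's key trick for irregular sampling is a trainable input decay: a missing value is imputed between the last observation and the empirical feature mean, with the mix decaying as the gap since the last observation grows. A minimal numpy sketch of that imputation step (the function name and the fixed `w`, `b` parameters are illustrative; in the real model they are learned jointly with the GRU):

```python
import numpy as np

def grud_impute(x, mask, delta, x_mean, w, b):
    """GRU-D style decayed imputation for one feature over time
    (a sketch of the mechanism only, not the full recurrent cell).
    x      : observed values (zeros where missing), shape (T,)
    mask   : 1 where observed, 0 where missing, shape (T,)
    delta  : time since the last observation, shape (T,)
    x_mean : empirical mean of the feature
    w, b   : decay parameters (learned in the real model; fixed here)
    """
    gamma = np.exp(-np.maximum(0.0, w * delta + b))  # decay in (0, 1]
    x_hat = np.empty_like(x, dtype=float)
    x_last = x_mean  # before the first observation, fall back to the mean
    for t in range(len(x)):
        if mask[t]:
            x_hat[t] = x[t]
            x_last = x[t]
        else:
            # decay from the last observation toward the feature mean
            x_hat[t] = gamma[t] * x_last + (1.0 - gamma[t]) * x_mean
    return x_hat
```

With a short gap the imputed value stays near the last observation; as the gap grows, gamma shrinks and the imputation falls back to the feature mean.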
Regularly Sampled
- MALSTM-FCN, Multivariate Attention LSTM Fully Convolutional Network
- FCN-LS2T, FCN with Low-rank Seq2Tens (LS2T) layers
- GP-based, Gaussian Process (GP-LSTM, GP-GRU, GP-Conv1D)
References of Models
- MALSTM-FCN: Multivariate LSTM-FCNs for Time Series Classification; code: houshd/MLSTM-FCN
- FCN-LS2T: Seq2Tens: An Efficient Representation of Sequences by Low-Rank Tensor Projections; code: tgcsaba/seq2tens
- GP-based: Bayesian Learning from Sequential Data using Gaussian Processes with Signature Covariances; code: tgcsaba/GPSig
Time Series Forecasting
Univariate
- NLinear
- FiLM, Frequency improved Legendre Memory Model
- SCINet
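SCINet's building block downsamples a sequence into its even- and odd-indexed subsequences, lets the two halves exchange information through small convolutional modules, and recurses before recombining for the forecast. A sketch of just the split/recombine skeleton (the interaction convolutions are omitted; names are illustrative):

```python
import numpy as np

def scinet_split(x):
    """SCINet's basic downsampling step (sketch): split a sequence into
    its even- and odd-indexed subsequences, which the full model lets
    interact through small convolutions before recursing."""
    return x[0::2], x[1::2]

def interleave(even, odd):
    """Inverse of the split: put the two halves back in time order."""
    out = np.empty(even.size + odd.size, dtype=even.dtype)
    out[0::2] = even
    out[1::2] = odd
    return out
```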
Multivariate
- DLinear
- Query Selector
- Yformer
- Informer
- NLinear
- FiLM
- SCINet
References of Models
- FiLM: Frequency improved Legendre Memory Model for Long-term Time Series Forecasting; code: tianzhou2011/FiLM
- SCINet: SCINet: Time Series Modeling and Forecasting with Sample Convolution and Interaction (NeurIPS 2022); code: cure-lab/SCINet
- LTSF-Linear (DLinear, NLinear): Are Transformers Effective for Time Series Forecasting? (AAAI 2023); code: cure-lab/LTSF-Linear
- FEDformer
- Autoformer
- Informer
- Yformer
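The LTSF-Linear baselines referenced above are, strikingly, one-layer linear maps. NLinear subtracts the look-back window's last value, applies a linear layer, and adds the value back; DLinear first decomposes the window into a moving-average trend and a seasonal remainder and gives each part its own linear layer. A numpy sketch of both ideas (weights are passed in rather than trained; function names are illustrative):

```python
import numpy as np

def nlinear_forecast(x, W, b):
    """NLinear forward pass (sketch): normalise the look-back window by
    its last value, apply one linear layer, then add the value back.
    x: window of shape (L,); W: (H, L); b: (H,). H is the horizon."""
    last = x[-1]
    return W @ (x - last) + b + last

def dlinear_decompose(x, kernel=3):
    """DLinear's series decomposition (sketch): a moving-average trend
    plus the seasonal remainder; the full model forecasts each part with
    its own linear layer and sums the results."""
    pad = kernel // 2
    xp = np.pad(x, (pad, pad), mode="edge")  # replicate the endpoints
    trend = np.convolve(xp, np.ones(kernel) / kernel, mode="valid")
    return trend, x - trend
```

By construction trend + seasonal reproduces the input exactly, and with zero weights NLinear degenerates to repeating the window's last value.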
Library of Timeseries Classification and Forecasting
- timeseriesAI/tsai: state-of-the-art deep learning library for time series and sequences in PyTorch / fastai; mainly for classification
- RNN (LSTM, GRU)
- FCN
- LSTM-FCN, GRU-FCN, MLSTM-FCN
- microsoft/EdgeML: machine learning algorithms for edge devices, developed at Microsoft Research India; mainly for classification on edge devices
- darts for forecasting and anomaly detection
- Classic (ARIMA, VARIMA, BATS, Kalman, Random Forest)
- RNN (GRU, LSTM)
- TCN
- DLinear, NLinear
- NeuralForecast, mainly for forecasting
- RNN (LSTM)
- TCN
- Autoformer, Informer
SOTA in other Area
- BigScience's BLOOM (open source, 176B params)
- Google's T5 (open source, 11B)
- Riffusion
- VALL-E
- Point-E
- ChatGPT is not all you need. A State of the Art Review of large Generative AI models
Last updated on 3/7/2023