Pyraformer: Low-Complexity Pyramidal Attention for Long-Range Time Series Modeling and Forecasting
Work from ICLR 2022 (oral): https://openreview.net/forum?id=0EXmFzUn5I
Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting, AAAI’21 Best Paper
Attention Is All You Need, NIPS'17