Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting (AAAI'21 Best Paper) - xhqing/Informer
Limitations of the original Transformer · Characteristics of the Informer model · Preliminary: definition of the LSTF problem · Encoder-decoder architecture · Input representation · Methodology: Efficient Self-attention Mechanism · Query Sparsity Measurement · ProbSparse Self-attention · ProbSparse self-attention computation (summary) · Encoder · Decoder · Experi...
Paper: AAAI 2021 | Informer: Beyond efficient transformer for long sequence time-series forecasting [1]. Authors: Zhou H, Zhang S, Peng J, et al. Institutions: Beihang University, UC Berkeley, Rutgers University, and others. Recorded talk: bilibili.com/video/BV1R Code: github.com/zhouhaoyi/In Citations: 162. Informer is the AAAI 2021 best paper, mainly targeting long time-series pre...
This is the original PyTorch implementation of Informer from the following paper: Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting. Special thanks to Jieqi Peng@cookieminions for building this repo. 🚩News (Mar 27, 2023): We will release Informer V2 soon. 🚩News (Feb...
Original title: Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting. Published: 2021-05-18. Venue: Proceedings of the AAAI Conference on Artificial Intelligence. Paper link: https://ojs.aaai.org/index.php/AAAI/article/view/17325 Open source...
2021-06-16 group meeting presentation: Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting.
Informer: a novel Transformer for long-sequence time-series forecasting. Paper title: Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting. Paper link: https://arxiv.org/abs/2012.07436 Code link: https://github.com/zhouhaoyi/Informer2020 Source: AAAI ...
[Notes] Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting. Paper: https://arxiv.org/abs/2012.07436 Code: https://github.com/zhouhaoyi/Informer2020 Category: paper notes, by mumu_JiangZeLin
Paper title: Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting. Paper link: https://arxiv.org/abs/2012.07436 Code link: https://github.com/zhouhaoyi/Informer2020 Source: AAAI 2021. 1. Overview: the long sequence time-series forecasting (LSTF) problem...
The Informer model comes from the AAAI 2021 best paper "Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting". Targeting a series of problems with the Transformer, such as quadratic time complexity, high memory usage, and the structural limitations of the encoder-decoder, Informer proposes a new approach to improving long-sequence forecasting. The article below mainly walks readers through...
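The ProbSparse self-attention mentioned in the snippets above is the paper's answer to the quadratic cost: it scores each query with a sparsity measure (approximated as max score minus mean score over keys) and computes full attention only for the top-u = c·ln(L_Q) "active" queries, while the remaining "lazy" queries receive the mean of the values. The sketch below is a simplified, single-head, NumPy illustration under those assumptions; unlike the paper, it scores against all keys rather than a random subsample, and the function name and `c` parameter are illustrative, not from the official repo.

```python
import numpy as np

def probsparse_attention(Q, K, V, c=5):
    """Simplified ProbSparse self-attention sketch (single head, no key sampling).

    Q: (L_Q, d) queries, K: (L_K, d) keys, V: (L_K, d) values.
    """
    L_Q, d = Q.shape
    scores = Q @ K.T / np.sqrt(d)                      # (L_Q, L_K) scaled dot products

    # Query sparsity measure M(q_i, K) ~ max_j score_ij - mean_j score_ij:
    # large M means the query's attention distribution is far from uniform.
    M = scores.max(axis=1) - scores.mean(axis=1)

    # Keep only the top-u = c * ln(L_Q) most "active" queries.
    u = min(L_Q, int(np.ceil(c * np.log(L_Q))))
    top = np.argsort(-M)[:u]

    # Lazy queries: attention over a near-uniform distribution -> mean of V.
    out = np.tile(V.mean(axis=0), (L_Q, 1))

    # Active queries: ordinary softmax attention.
    w = np.exp(scores[top] - scores[top].max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    out[top] = w @ V
    return out
```

Because only u = O(ln L_Q) queries attend to all L_K keys, the dominant cost of the attention step drops from O(L_Q · L_K) to O(L_K · ln L_Q), which is the efficiency claim the paper makes for long inputs.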