GitHub link: GitHub - BICLab/Attention-SNN: Official implementation of "Attention Spiking Neural Networks" (IEEE T-PAMI 2023). Overview: The performance gap between spiking neural networks (SNNs) and artificial neural networks (ANNs) is a major obstacle to the wider adoption of SNNs, since many real-world platforms are constrained by compute resources and battery life. To unlock the full potential of SNNs, the authors study attention mechanisms and propose applying them in SNNs...
We adopt MS-SNN (https://github.com/Ariande1/MS-ResNet) as the residual spiking neural network backbone. Download the [ImageNet Dataset] and set the downloaded dataset path in utils.py, then run the tasks in /Att_Res_SNN, e.g.:
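As a minimal sketch of the idea behind attention-augmented residual SNNs (assuming a standard squeeze-and-excitation-style channel attention over spike feature maps; the weight shapes and reduction ratio here are illustrative, not the repository's actual code):

```python
import numpy as np

def channel_attention(spikes, w1, w2):
    """Squeeze-and-excitation-style channel attention over spike features.

    spikes: array of shape (T, C, H, W) -- binary spike maps over T timesteps.
    w1, w2: weights of a two-layer bottleneck MLP (hypothetical shapes
            (C // r, C) and (C, C // r) for reduction ratio r).
    """
    # Squeeze: average firing rate per channel over time and space.
    s = spikes.mean(axis=(0, 2, 3))             # (C,)
    # Excite: bottleneck MLP, ReLU then sigmoid gate.
    h = np.maximum(w1 @ s, 0.0)                 # (C // r,)
    gate = 1.0 / (1.0 + np.exp(-(w2 @ h)))      # (C,), each value in (0, 1)
    # Reweight each channel's spike map by its attention gate.
    return spikes * gate[None, :, None, None]

rng = np.random.default_rng(0)
T, C, H, W, r = 4, 8, 5, 5, 2
spikes = (rng.random((T, C, H, W)) < 0.3).astype(np.float32)
w1 = rng.standard_normal((C // r, C)) * 0.1
w2 = rng.standard_normal((C, C // r)) * 0.1
out = channel_attention(spikes, w1, w2)
print(out.shape)  # (4, 8, 5, 5)
```

Because the gate is a sigmoid, the output preserves the input's shape while attenuating less-informative channels, which is the general mechanism the Attention-SNN line of work builds on.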
TCJA-SNN: Temporal-Channel Joint Attention for Spiking Neural Networks [TNNLS 2024]. How to Run: First clone the repository:
git clone https://github.com/ridgerchu/TCJA
cd TCJA
pip install -r requirements.txt
Train DVS128: detailed usage of the script can be found in the source file. python...
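The core idea of temporal-channel joint attention can be sketched as two 1-D convolution branches over the per-timestep, per-channel firing rates, fused into a single gate. This is an illustrative numpy sketch under that reading of the paper; the kernel sizes and fusion are assumptions, not the repository's implementation:

```python
import numpy as np

def conv1d_same(x, kernel):
    """'Same'-padded 1-D correlation along the last axis of x."""
    k = len(kernel)
    pad = k // 2
    xp = np.pad(x, [(0, 0)] * (x.ndim - 1) + [(pad, pad)])
    return np.stack(
        [np.tensordot(xp[..., i:i + k], kernel, axes=([-1], [0]))
         for i in range(x.shape[-1])], axis=-1)

def tcja(spikes, k_t, k_c):
    """Temporal-channel joint attention over (T, C) spike rates.

    spikes: (T, C, H, W) spike tensor; k_t, k_c: hypothetical 1-D kernels
    for the temporal and channel branches.
    """
    rate = spikes.mean(axis=(2, 3))              # (T, C) average firing rate
    # Temporal branch: convolve along T for each channel.
    a_t = conv1d_same(rate.T, k_t).T             # (T, C)
    # Channel branch: convolve along C for each timestep.
    a_c = conv1d_same(rate, k_c)                 # (T, C)
    # Joint gate: sigmoid of the elementwise product of both branches.
    gate = 1.0 / (1.0 + np.exp(-(a_t * a_c)))    # (T, C)
    return spikes * gate[:, :, None, None]

rng = np.random.default_rng(1)
spikes = (rng.random((4, 8, 3, 3)) < 0.5).astype(np.float32)
smooth = np.array([0.25, 0.5, 0.25])
out = tcja(spikes, k_t=smooth, k_c=smooth)
print(out.shape)  # (4, 8, 3, 3)
```

The joint gate lets a timestep-channel pair be emphasized only when both its temporal and its channel context support it, which is the stated motivation for combining the two axes.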
Spiking Transformers, which integrate Spiking Neural Networks (SNNs) with Transformer architectures, have attracted significant attention due to their potential for energy efficiency and high performance. However, existing models in this domain still suffer from suboptimal performance. We introduce several ...
Spiking neural networks (SNNs) offer a bio-plausible and potentially power-efficient alternative to conventional deep learning. Although there has been pro... I Chakraborty, A Agrawal, A Jaiswal, ... - Philosophical Transactions. Cited by: 0. Published: 2020. Source: 2024 IEEE International Confe...
This is the official repository for the paper "Tensor Decomposition Based Attention Module for Spiking Neural Networks". Paper: [pdf]. Datasets: CIFAR10/CIFAR100. How to Run: 1. Prepare environment. A Docker environment is strongly recommended, but you can also use pip to prepare the environment. If you are not familiar ...
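The key property a tensor-decomposition-based attention module exploits is that a full (T, C, H, W) attention map can be factored into a small set of per-axis vectors. A rank-1 CP-style sketch (purely illustrative; the paper's module, factor ranks, and how the factors are learned are not reproduced here):

```python
import numpy as np

def cp_attention(spikes, f_t, f_c, f_h, f_w):
    """Rank-1 CP-factored attention over a (T, C, H, W) spike tensor.

    f_t, f_c, f_h, f_w: hypothetical learned factor vectors, one per axis.
    Their outer product reconstructs a full-size attention map while only
    storing T + C + H + W parameters instead of T*C*H*W.
    """
    att = np.einsum('t,c,h,w->tchw', f_t, f_c, f_h, f_w)
    gate = 1.0 / (1.0 + np.exp(-att))   # squash to (0, 1)
    return spikes * gate

rng = np.random.default_rng(2)
T, C, H, W = 4, 8, 6, 6
spikes = (rng.random((T, C, H, W)) < 0.4).astype(np.float32)
out = cp_attention(spikes,
                   rng.standard_normal(T), rng.standard_normal(C),
                   rng.standard_normal(H), rng.standard_normal(W))
print(out.shape)  # (4, 8, 6, 6)
```

Higher-rank variants sum several such outer products; the parameter savings versus a dense attention map are what make this attractive for resource-constrained SNN deployments.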
Gated Attention Coding for Training High-performance and Efficient Spiking Neural Networks (AAAI24) Xuerui Qiu, Rui-Jie Zhu, Yuhong Chou,Zhaorui Wang, Liang-Jian Deng, Guoqi Li Institute of Automation, Chinese Academy of Sciences University of Electronic Science and Technology of China University of...
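Gated attention coding targets the input-encoding stage: instead of feeding a static image directly into the SNN at every timestep, the charge delivered at each step is modulated by an attention gate. A loose sketch of that idea with an integrate-and-fire encoder (the per-timestep gate here is a free parameter; in the paper the gate is produced by a learned attention branch, which is not reproduced):

```python
import numpy as np

def gated_attention_coding(x, T, w_gate, threshold=1.0):
    """Hypothetical sketch of attention-gated spike encoding.

    x: static input, shape (C, H, W).
    T: number of simulation timesteps.
    w_gate: per-timestep gating logits, shape (T,), an assumed parameter.
    """
    v = np.zeros_like(x)                  # membrane potential of the encoder
    gate = 1.0 / (1.0 + np.exp(-w_gate))  # per-step attention gate in (0, 1)
    spikes = []
    for t in range(T):
        v = v + gate[t] * x               # gated charge accumulation
        s = (v >= threshold).astype(x.dtype)
        v = v - s * threshold             # soft reset after firing
        spikes.append(s)
    return np.stack(spikes)               # (T, C, H, W), binary

rng = np.random.default_rng(3)
x = rng.random((3, 4, 4)) * 2.0
out = gated_attention_coding(x, T=4, w_gate=np.zeros(4))
print(out.shape)  # (4, 3, 4, 4)
```

Strongly gated pixels fire earlier and more often, so the gate effectively controls how much of the input's information is spent per timestep.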
Spike-Transformer: "Spike Transformer: Monocular Depth Estimation for Spiking Camera", ECCV, 2022 (Peking University). [Paper][PyTorch]
GLPanoDepth: "GLPanoDepth: Global-to-Local Panoramic Depth Estimation", arXiv, 2022 (Nanjing University). [Paper]
DepthFormer: "DepthFormer: Exploiting Long-Ran...
Spiking Transformers, which integrate Spiking Neural Networks (SNNs) with Transformer architectures, have attracted significant attention due to their potential for low energy consumption and high performance. However, there remains a substantial gap in performance between SNNs and Artificial Neural Networks...
NAR-Former-V2: "NAR-Former V2: Rethinking Transformer for Universal Neural Network Representation Learning", arXiv, 2023 (Intellifusion, China). [Paper]
AutoST: "AutoST: Training-free Neural Architecture Search for Spiking Transformers", arXiv, 2023 (NC State). [Paper]
TurboViT: "TurboViT: ...