Same threshold for all fan-in layers: all neuron layers fanning into a junction should receive the same amount of threshold balancing, so that the input to the next layer is properly rate-encoded. However, the spiking threshold of a neuron layer on the non-identity path depends on the junction point preceding it. Without this constraint, the maximum activation of each network layer looks as follows: this shows that threshold balancing mainly takes effect in the first few layers...
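The balancing step described above can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: `balance_thresholds` is a hypothetical helper that, for each fully-connected layer, records the maximum pre-activation seen on calibration data as that layer's firing threshold, then normalizes the (ReLU) output by it before moving to the next layer.

```python
import numpy as np

def balance_thresholds(weights, calibration_input):
    """Layer-wise threshold balancing (Spike-Norm-style sketch).

    weights: list of weight matrices, one per fully-connected layer.
    calibration_input: [batch, features] array of calibration samples.
    Returns one firing threshold per layer, set to the maximum
    pre-activation observed at that layer.
    """
    thresholds = []
    x = calibration_input
    for W in weights:
        z = x @ W
        v_th = float(np.max(z))          # largest activation at this layer
        thresholds.append(v_th)
        x = np.maximum(z, 0.0) / v_th    # normalize before the next layer
    return thresholds

# Tiny deterministic example
W1 = np.array([[1.0, 2.0], [3.0, 4.0]])
W2 = np.array([[1.0], [1.0]])
x0 = np.array([[1.0, 1.0]])
print(balance_thresholds([W1, W2], x0))
```

The point of the constraint in the text is that, at a residual junction, both fan-in paths would have to share one such threshold rather than each keeping its own.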
Over the past few years, Spiking Neural Networks (SNNs) have become popular as a possible pathway to enable low-power event-driven neuromorphic hardware. However, their application in machine learning has largely been limited to very shallow neural network architectures for simple problems. In ...
Deep Spiking Residual Network: compared with ResNet, two main changes are made: first, BN is replaced by tdBN; second, the shortcut connection is modified. The figure below compares the basic blocks of ResNet-ANN and this paper's deep spiking residual network. From the figure, the concrete changes are: the ReLU activation is replaced by the LIF model, and the BN layer is replaced by tdBN...
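The two substitutions above can be sketched in NumPy. This is a simplified illustration under stated assumptions: `V_TH` is the firing threshold, `td_batch_norm` takes normalization statistics jointly over the time and batch axes and rescales toward `alpha * V_TH` (the core idea of tdBN, without the learnable affine parameters), and `lif_forward` is an iterative LIF neuron with leak, integrate, fire, and hard reset; the exact hyperparameters are placeholders, not the paper's values.

```python
import numpy as np

V_TH = 1.0  # assumed firing threshold

def td_batch_norm(x, alpha=1.0, eps=1e-5):
    """tdBN sketch: statistics over time AND batch axes (x: [T, B, C]),
    output rescaled toward alpha * V_TH instead of unit variance."""
    mean = x.mean(axis=(0, 1), keepdims=True)
    var = x.var(axis=(0, 1), keepdims=True)
    return alpha * V_TH * (x - mean) / np.sqrt(var + eps)

def lif_forward(inputs, tau=0.5):
    """Iterative LIF dynamics replacing ReLU. inputs: [T, B, C].
    Leaky integration, spike when v >= V_TH, then hard reset."""
    v = np.zeros_like(inputs[0])
    spikes = []
    for x_t in inputs:
        v = tau * v + x_t                   # leak + integrate
        s = (v >= V_TH).astype(x_t.dtype)   # fire
        v = v * (1.0 - s)                   # hard reset after a spike
        spikes.append(s)
    return np.stack(spikes)

# Two time steps, one neuron: sub-threshold first, spike second
inp = np.array([[[0.6]], [[0.8]]])
print(lif_forward(inp))  # spike pattern over time
```

In the paper's block, tdBN is applied to the pre-activations before the LIF layer, so the membrane input is scaled relative to the threshold rather than to unit variance as in ordinary BN.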
Spiking neural networks (SNNs) are promising in a bio-plausible coding for spatio-temporal information and event-driven signal processing, which is very suited for energy-efficient implementation in neuromorphic hardware. However, the unique working mode of SNNs makes them more difficult to train than...
Going Deeper With Directly-Trained Larger Spiking Neural Networks. Guoqi Li, Hanle Zheng, Lei Deng, Yifan Hu, Yujie Wu
Since this is a re-read, it will go faster this time. The content of this paper has already been covered in detail in another author's article; the link is here: weili21: tdBN—《Going Deeper With Directly-Trained Larger Spiking Neural Networks…