In theory, Leaky ReLU should perform better than ReLU, but in practice the two are roughly equivalent, because the dying-neuron problem is not that common and can be addressed by other, more mainstream methods.

The Rectified Linear Unit, abbreviated as ReLU, has shown incredible results when abundantly used in deep neural networks. Perhaps what is shocking about this success is that it...
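
For reference, here is a minimal sketch of the two activations in NumPy; the function names and the default negative slope of 0.01 are illustrative choices, not taken from the original text:

```python
import numpy as np

def relu(x):
    # ReLU: zero for negative inputs, identity for positive inputs
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: a small slope alpha on the negative side keeps a
    # nonzero gradient there, which is what mitigates "dying" neurons
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))        # [0.  0.  0.  1.5]
print(leaky_relu(x))  # [-0.02  -0.005  0.     1.5  ]
```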