Deep neural networks are powerful tools in learning sophisticated but fixed mapping rules between inputs and outputs, thereby limiting their application in more complex and dynamic situations in which the mapping rules are not kept the same but change according to different contexts.
To lift such limits, we developed an approach involving a learning algorithm, called orthogonal weights modification (OWM), with the addition of a context-dependent processing (CDP) module. We demonstrated that with OWM to overcome catastrophic forgetting, and the CDP module to process contextual information, a single network could acquire numerous context-dependent mapping rules in an online and continual manner.
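The abstract names the algorithm but not its mechanics, so here is a minimal NumPy sketch of the core OWM idea: gradient updates for a layer are multiplied by a running projector onto the orthogonal complement of the subspace spanned by previously seen inputs, so new learning barely disturbs old input-output mappings. The class name `OWMLayer`, the constant `alpha`, and the learning rate are illustrative assumptions, not the authors' released implementation (see the linked repository for that).

```python
import numpy as np

class OWMLayer:
    """Sketch of orthogonal weights modification (OWM) for one linear layer.

    P approximates a projector onto the orthogonal complement of the
    subspace spanned by inputs seen so far; projecting each weight
    update through P leaves earlier mappings (almost) untouched.
    """

    def __init__(self, in_dim, out_dim, alpha=1.0, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.W = 0.01 * rng.standard_normal((out_dim, in_dim))
        self.P = np.eye(in_dim) / alpha  # projector estimate (RLS-style)
        self.alpha = alpha               # kept constant here; the paper decays it
        self.lr = lr

    def forward(self, x):
        # x: input column vector, shape (in_dim, 1)
        return self.W @ x

    def train_step(self, x, grad_out):
        # For a linear layer, dL/dW = grad_out @ x.T (grad_out: (out_dim, 1)).
        dW = grad_out @ x.T
        # Key OWM step: project the update so it is (approximately)
        # orthogonal to all previously seen inputs.
        self.W -= self.lr * dW @ self.P
        # Recursively fold the new input into the projector
        # (Sherman-Morrison / recursive-least-squares form).
        Px = self.P @ x
        self.P -= (Px @ Px.T) / (self.alpha + float(x.T @ Px))

# Toy usage: after training on task A inputs, updates for task B
# are projected away from task A's input subspace.
layer = OWMLayer(in_dim=4, out_dim=2)
xA = np.array([[1.0], [0.0], [0.0], [0.0]])
layer.train_step(xA, grad_out=layer.forward(xA) - np.array([[1.0], [0.0]]))
```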
Code for the paper Continual Learning of Context-dependent Processing in Neural Networks. You can also get the free version from https://rdcu.be/bOaa3. There is a new version based on TF2, https://github.com/xuejianyong/OWM-tf2, provided by Dr. Jianyong Xue, and it basically reproduces our work. Thanks for his contribution.
Zeng, G., Chen, Y., Cui, B., & Yu, S. (2019). Continual learning of context-dependent processing in neural networks. Nature Machine Intelligence, 1(8), 364–372.
Zenke, F., Poole, B., & Ganguli, S. (2017). Continual learning through synaptic intelligence. Proceedings of Machine Learning Research, 70, 3987–3995.
In contrast, the brain continually learns, accommodates, and transfers knowledge across tasks by simultaneously leveraging several neurophysiological processes, including neurogenesis, active forgetting, neuromodulation, metaplasticity, experience rehearsal, and context-dependent gating, rarely resulting in catastrophic forgetting (CF).
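Of these mechanisms, context-dependent gating is the one most directly mirrored by the paper's CDP module. Below is a minimal sketch of the general idea (in the spirit of Masse et al.'s XdG, not this paper's exact module): each context activates a fixed, sparse subset of hidden units, so different tasks train largely non-overlapping sub-networks. All names and sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 10 inputs, 100 hidden units, 5 outputs, 3 contexts.
W1 = 0.1 * rng.standard_normal((100, 10))
W2 = 0.1 * rng.standard_normal((5, 100))

# Each context gets a fixed random binary mask keeping ~20% of hidden units.
context_masks = (rng.random((3, 100)) < 0.2).astype(float)

def context_gated_forward(x, context_id):
    """Forward pass with context-dependent gating: hidden activity is
    multiplied by the fixed mask of the active context, so only that
    context's sub-network carries the signal."""
    h = np.maximum(0.0, W1 @ x)        # ReLU hidden layer
    h = h * context_masks[context_id]  # gate units by context
    return W2 @ h
```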
- [OWM] Continual Learning of Context-dependent Processing in Neural Networks (Nature Machine Intelligence 2019) [paper] [code]
- [LUCIR] Learning a Unified Classifier Incrementally via Rebalancing (CVPR 2019) [paper] [code]
- [TFCL] Task-Free Continual Learning (CVPR 2019) [paper]
- [GD-WILD] Overcoming Catastrophic Forgetting with Unlabeled Data in the Wild (ICCV 2019) [paper]
Developing versatile and reliable memristive devices is crucial for advancing future memory and computing architectures. Years of intensive research have still not demonstrated their full range of capabilities, and new concepts are essential.
Zeng, G.; Chen, Y.; Cui, B.; Yu, S. Continual learning of context-dependent processing in neural networks. Nat. Mach. Intell. 2019, 1, 364–372.
Zhao, H.; Fu, Y.; Kang, M.; Tian, Q.; Wu, F.; Li, X. MgSvF: Multi-Grained Slow versus Fast Framework for Few-Shot Class-Incremental Learning.