which provide the conceptual framework for information representation appropriate to machine-based communication. Neural-network systems (biological or artificial) do not store information or process it in the way that conventional digital computers do. Specifically, the basic unit of neural-network operati...
Abstract: In this paper we introduce Neural Network Coding (NNC), a data-driven approach to joint source and network coding. In NNC, the encoders at each source and intermediate node, as well as the decoder at each destination node, are neural networks which are al...
The Bayesian brain hypothesis is one of the most influential ideas in neuroscience. However, unstated differences in how Bayesian ideas are operationalized make it difficult to draw general conclusions about how Bayesian computations map onto neural circuits...
We continue with a discussion of how patterns of activity evolve from one representation to another, forming dynamic representations that unfold on the underlying network. Our goal is to offer a holistic framework for understanding and describing neural information representation and transmission while ...
The Fraunhofer Neural Network Encoder/Decoder Software (NNCodec) is an efficient implementation of NNC (Neural Network Coding / ISO/IEC 15938-17 or MPEG-7 part 17), which is the first international standard on compression of neural networks. NNCodec provides an encoder and decoder with the foll...
We trained a large, deep convolutional neural network to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into the 1000 different classes. On the test data, we achieved top-1 and top-5 error rates of 37.5% and 17.0%, which are considerably better than the ...
Whereas a feedforward network adjusts the weights and biases of each layer individually during gradient descent, an RNN shares the same weights within each layer of the network. The image above is a simple representation of a recurrent neural network. If we are forecasting stock prices using simple data ...
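The weight sharing described in this snippet can be sketched in a few lines of numpy (all names and sizes here are hypothetical, not from the original source): the same matrices `W_xh` and `W_hh` are reused at every time step of the sequence, unlike a feedforward network where each layer carries its own parameters.

```python
import numpy as np

# Minimal sketch of RNN weight sharing (hypothetical shapes and values).
# W_xh, W_hh, W_hy are the SAME objects at every time step.
rng = np.random.default_rng(0)
W_xh = rng.normal(scale=0.1, size=(4, 1))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(4, 4))   # hidden -> hidden, shared across steps
W_hy = rng.normal(scale=0.1, size=(1, 4))   # hidden -> output
b_h = np.zeros((4, 1))
b_y = np.zeros((1, 1))

def rnn_forecast(series):
    """Run the shared-weight recurrence over a 1-D series; predict the next value."""
    h = np.zeros((4, 1))
    for x in series:                         # same W_xh / W_hh applied at each step
        h = np.tanh(W_xh * x + W_hh @ h + b_h)
    return (W_hy @ h + b_y).item()

# e.g. a short (toy) price sequence
pred = rnn_forecast([1.0, 1.1, 1.2, 1.3])
print(pred)
```

During training, the gradient of the loss with respect to `W_hh` accumulates contributions from every time step (backpropagation through time), which is what "sharing weights during gradient descent" amounts to in practice.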
representation and creating an invariance to small shifts and distortions. Two or three stages of convolution, non-linearity and pooling are stacked, followed by more convolutional and fully-connected layers. Backpropagating gradients through a ConvNet is as simple as through a regular deep network,...
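The "convolution, non-linearity, pooling" stage described above can be illustrated with a self-contained numpy sketch (kernel values and image sizes are illustrative, not from the source); stacking two such stages shows how the spatial resolution shrinks while small shifts are absorbed by the pooling.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2-D cross-correlation of a single-channel image with one kernel."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Pointwise non-linearity."""
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Non-overlapping max pooling: invariance to small shifts and distortions."""
    H, W = x.shape
    H, W = H - H % size, W - W % size        # crop to a multiple of the pool size
    x = x[:H, :W].reshape(H // size, size, W // size, size)
    return x.max(axis=(1, 3))

img = np.random.default_rng(1).normal(size=(12, 12))
kernel = np.array([[1., 0., -1.],
                   [1., 0., -1.],
                   [1., 0., -1.]])           # a simple vertical-edge detector

# Two stacked conv -> non-linearity -> pool stages:
stage1 = max_pool(relu(conv2d(img, kernel)))      # (12,12) -> (10,10) -> (5,5)
stage2 = max_pool(relu(conv2d(stage1, kernel)))   # (5,5)  -> (3,3)  -> (1,1)
print(stage1.shape, stage2.shape)
```

A real ConvNet would use many kernels per stage and learn their values by backpropagation; this sketch only makes the stage structure concrete.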
In this work, we incorporate two key concepts from local representation alignment (LRA)12,38: (1) the use of error synapses to directly resolve the weight-transport problem, and (2) the omission of derivatives of activation functions, which yields synapse rules that function like error-Hebbian up...
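The two ideas in this snippet can be made concrete with a small sketch (notation and shapes are my own, not the paper's): a separate matrix of error synapses `B` routes the feedback signal, so the update never needs the transpose of the forward weights, and the weight change is a plain product of error and pre-synaptic activity, with no activation-function derivative.

```python
import numpy as np

# Sketch of an error-Hebbian update with dedicated error synapses
# (hypothetical shapes; this is an illustration of the idea, not the paper's code).
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(3, 5))   # forward weights: 5 inputs -> 3 units
B = rng.normal(scale=0.1, size=(3, 4))   # error synapses: carry feedback from the
                                         # 4-unit layer above; B need not equal the
                                         # transpose of any forward matrix, which
                                         # sidesteps the weight-transport problem

h = rng.normal(size=(5, 1))              # pre-synaptic activity into this layer
delta_above = rng.normal(size=(4, 1))    # error signal arriving from the layer above

e = B @ delta_above                      # local error, routed through error synapses
eta = 0.01
W += eta * e @ h.T                       # Hebbian-like product of error and activity:
                                         # note there is NO f'(pre-activation) factor
```

Contrast with backprop, where the local error would be `(W_above.T @ delta_above) * f'(z)`: both the transported transpose and the derivative term are dropped here.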
Another difficulty with neural networks is that traditional neural networks are unable to explain how they solve tasks. In some application fields, such as medicine, this explanation is more important than the result itself. The internal representation of results is often so complex that it is impossible...