Borst A, Theunissen FE. Information theory and neural coding. Nat Neurosci 2: 947-957, 1999.
Boyle R, Goldberg JM, Highstein SM. Inputs from regularly and irregularly discharging ...
Ernst MO, Banks MS. Humans integrate visual and haptic information in a statistically optimal fashion. Nature 415: 429-433, 2002. https://doi.org/10.1038/415429a.
Partial information decomposition as a unified approach to the specification of neural goal functions. Michael Wibral, Viola ... However, while the classical framework of information theory focuses on the relation between one input and one output (Shannon's mutual information), we ...
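For reference, the quantity being generalized here is Shannon's mutual information between a single input X and a single output Y:

    I(X;Y) = \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)} = H(Y) - H(Y \mid X)

Partial information decomposition extends this to multiple inputs, splitting the information they jointly carry about the output into unique, redundant, and synergistic components.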
Victor, Jonathan D. Biological Theory, 2006. Cited by: 113. © 2001 Kluwer Academic Publishers. Manufactured in The Netherlands. Information-Theoretic Analysis of Neural Coding. We describe an approach to analyzing single- and multiunit (ensemble) discharge patterns based on information-theoretic ...
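As a generic illustration of information-theoretic analysis of discharge patterns (not the specific method of the work cited above), the mutual information between a discrete stimulus and a spike-count response can be estimated with a simple plug-in estimator; all names and parameters below are hypothetical:

    import numpy as np

    def plugin_mutual_information(stimulus, spike_counts):
        # Plug-in estimate of I(stimulus; response) in bits from paired trials.
        stimulus = np.asarray(stimulus)
        spike_counts = np.asarray(spike_counts)
        _, s_idx = np.unique(stimulus, return_inverse=True)
        _, r_idx = np.unique(spike_counts, return_inverse=True)
        joint = np.zeros((s_idx.max() + 1, r_idx.max() + 1))
        np.add.at(joint, (s_idx, r_idx), 1)            # joint histogram of (stimulus, response)
        p_sr = joint / joint.sum()                     # joint probabilities
        p_s = p_sr.sum(axis=1, keepdims=True)          # stimulus marginal
        p_r = p_sr.sum(axis=0, keepdims=True)          # response marginal
        nz = p_sr > 0
        return float((p_sr[nz] * np.log2(p_sr[nz] / (p_s @ p_r)[nz])).sum())

    # Toy usage: 8 stimuli, Poisson spike counts whose mean rate depends on the stimulus.
    rng = np.random.default_rng(0)
    stim = rng.integers(0, 8, size=5000)
    counts = rng.poisson(2 + stim)
    print(plugin_mutual_information(stim, counts))     # > 0 bits: response carries stimulus information

Plug-in estimates of this kind are biased upward when the number of trials is small relative to the number of response bins, which is a standard caveat in this literature.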
Usrey WM, Reid RC. Synchronous activity in the visual system. Annu Rev Physiol 61: 435-456, 1999.
Zemel R, Dayan P, Pouget A. ...
Behavioral and economic theory dictate that we decide between options based on their values. However, humans and animals eagerly seek information about uncertain future rewards, even when this does not provide any objective value. This implies that decisions are made by endowing information with subjective ...
... robotics); Social and economic aspects of machine learning (e.g., fairness, interpretability, human-AI interaction, privacy, safety, strategic behavior); Theory (e.g., control theory, learning theory, algorithmic game theory). Machine learning is a rapidly evolving field, and so we welcome interdisciplinary ...
coding theory and techniques, data compression, sequences, signal processing, detection and estimation, pattern recognition, learning and inference, communications and communication networks, complexity and cryptography, and quantum information theory and coding. Papers published in the IEEE Transactions on ...
Self-supervised learning via maximum entropy coding. Advances in Neural Information Processing Systems 35: 34091-34105, 2022. 2. Introduction, Part 2: However, the existing maximum entropy coding framework does not explicitly distinguish the feature matrices coming from different augmentation branches, which hinders its integration with the alignment loss. To bridge this gap, we introduce matrix information theory. By taking the entropy ...
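A minimal sketch of how a branch-specific matrix entropy could be instantiated, assuming it is taken to be the von Neumann-style entropy of the trace-normalized feature covariance; the loss combination and weighting below are illustrative assumptions, not the paper's exact formulation:

    import torch
    import torch.nn.functional as F

    def matrix_entropy(Z: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
        # Von Neumann-style entropy of the trace-normalized covariance of Z,
        # where Z is an (n, d) batch of embeddings from one augmentation branch.
        cov = Z.T @ Z                                    # (d, d) feature covariance
        cov = cov / cov.trace().clamp_min(eps)           # eigenvalues now sum to 1
        lam = torch.linalg.eigvalsh(cov).clamp_min(eps)  # eigenvalues of the symmetric matrix
        return -(lam * lam.log()).sum()                  # -sum_i lambda_i * log(lambda_i)

    # Illustrative two-branch objective: align the branches while keeping each
    # branch's feature matrix high-entropy (i.e., non-collapsed).
    z1 = F.normalize(torch.randn(256, 128), dim=1)       # branch-1 embeddings
    z2 = F.normalize(torch.randn(256, 128), dim=1)       # branch-2 embeddings
    alignment = (z1 - z2).pow(2).sum(dim=1).mean()
    loss = alignment - 0.5 * (matrix_entropy(z1) + matrix_entropy(z2))

Computing the entropy per branch keeps the two augmented feature matrices distinct, so the entropy term and the alignment term can be combined in one objective, which is the gap the translated passage describes.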
on the one hand by making use of an existing arsenal of methods and techniques from areas such as information theory, mathematical statistics, neural networks, nonlinear dynamics, probability theory, and statistical physics, and on the other hand by deriving new methods and techniques if required....