Learning parity with noise: principles. Learning parity with noise (LPN) is a cryptographic problem that involves recovering a secret vector from noisy parity samples: each sample reveals the inner product of the secret with a known random vector, but the result bit is flipped with some probability. The problem assumes a communication channel that introduces random errors, making it challenging to ...
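The sample distribution described above can be sketched in a few lines. This is a minimal illustrative setup (the names `lpn_samples`, `tau`, and the parameter values are ours, not from any of the cited papers): each sample is a pair (a, b) with b = ⟨a, s⟩ XOR e over GF(2), where e is 1 with probability tau.

```python
import numpy as np

rng = np.random.default_rng(0)

def lpn_samples(s, m, tau, rng):
    """Draw m noisy parity samples for a secret s over GF(2)."""
    n = len(s)
    A = rng.integers(0, 2, size=(m, n))        # m random vectors a
    noise = (rng.random(m) < tau).astype(int)  # Bernoulli(tau) error bits e
    b = (A @ s + noise) % 2                    # b = <a, s> XOR e
    return A, b

n, m, tau = 16, 64, 0.125
s = rng.integers(0, 2, size=n)
A, b = lpn_samples(s, m, tau, rng)
# With tau = 0 the samples are exact linear equations and s is recovered
# by Gaussian elimination; the noise bits are what make recovery hard.
```

The hardness claim is exactly that no efficient algorithm can recover `s` from polynomially many such pairs when tau is a noticeable constant.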
Meaning: learning parity in the presence of noise (learning parity with noise).
Pietrzak K (2012) Cryptography from learning parity with noise. In: Bieliková M, Friedrich G, Gottlob G, Katzenbeisser S & Turán G (eds) SOFSEM 2012: Theory and Practice of Computer Science, volume 7147 of Lecture Notes in Computer Science, pp. 99–114. Springer Berlin Heidelberg....
The Learning Parity with Noise (LPN) problem is well understood in learning theory and cryptography and has been found quite useful in constructing various lightweight cryptographic primitives. There exists non-trivial evidence that the problem is robust on high-entropy secrets (and even given hard-...
Cryptography from Learning Parity with Noise The Learning Parity with Noise (LPN) problem has recently found many applications in cryptography as the hardness assumption underlying the constructions of "provably secure" cryptographic schemes like encryption or authentication protoc... K Pietrzak - ...
If you are on the older side, older than me, say, you may know of a hard problem called Learning Parity with Noise. The problem received some attention in 1980, which is the earliest reference I could find; it may well have been posed even earlier. The cryptography community did some work on it, though not much. In recent years, the problem and its variants have seen a revival. I think the reason is...
Interestingly, Learning with Errors, one of the main candidate foundations for post-quantum cryptography, likewise comes from learning theory (as does Learning...
We present a probabilistic private-key encryption scheme named LPN-C whose security can be reduced to the hardness of the Learning from Parity with Noise (LPN) problem. The proposed protocol involves only basic operations in GF(2) and an error-correcting code. We show that it achieves in...
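The combination of GF(2) operations and an error-correcting code mentioned in the abstract can be illustrated with a toy sketch. This is our own simplification, not the paper's exact scheme: a ciphertext has the form (a, C(x) XOR a·M XOR noise) for a secret key matrix M, with a simple repetition code standing in for the error-correcting code C, and majority voting as the decoder.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, rep, tau = 32, 4, 5, 0.05             # key dim, message bits, repetition factor, noise rate
M = rng.integers(0, 2, size=(n, k * rep))   # secret key matrix over GF(2)

def C(x):
    """Repetition code: encode each message bit as rep copies."""
    return np.repeat(x, rep)

def encrypt(x):
    a = rng.integers(0, 2, size=n)
    noise = (rng.random(k * rep) < tau).astype(int)
    return a, C(x) ^ (a @ M % 2) ^ noise    # (a, C(x) XOR a.M XOR noise)

def decrypt(a, c):
    y = c ^ (a @ M % 2)                     # strips the mask: y = C(x) XOR noise
    # Majority vote within each block of rep copies corrects sparse noise.
    return (y.reshape(k, rep).sum(axis=1) > rep // 2).astype(int)

x = rng.integers(0, 2, size=k)
a, c = encrypt(x)
# Decryption succeeds unless a majority of the rep copies of some bit
# are flipped, which happens with small probability for small tau.
```

Without the secret M, distinguishing c from random bits amounts to an LPN-type problem, which is the intuition behind the security reduction.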
& Lee, S. Noise-tolerant parity learning with one quantum bit. Phys. Rev. A 97, 032327 (2018). Ghobadi, R., Oberoi, J. S. & Zahedinejad, E. The Power of One Qubit in Machine Learning. arXiv preprint arXiv:1905.01390 (2019). Vedaie, S. S., Noori, M., Oberoi, J. S.,...
In Pareto Frontiers in Neural Feature Learning: Data, Compute, Width, and Luck, researchers from Microsoft, Harvard, and the University of Pennsylvania explore these algorithmic intricacies and tradeoffs through the lens of a single synthetic task: the finite-sample sparse pari...
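The synthetic task referred to in that snippet, finite-sample sparse parity, is easy to state concretely. The sketch below is our illustrative setup (the parameter values and variable names are ours): the label of each input is the parity of a hidden size-k subset of its n bits, and a learner sees only m samples.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k, m = 30, 3, 1000
S = rng.choice(n, size=k, replace=False)   # hidden set of k relevant coordinates

X = rng.integers(0, 2, size=(m, n))        # m uniform n-bit inputs
y = X[:, S].sum(axis=1) % 2                # label = XOR of the hidden subset
# The label is statistically independent of any strict subset of the k bits,
# which is what makes the task hard for local, greedy feature learning.
```

In the noiseless case Gaussian elimination over GF(2) recovers S from enough samples; the interest is in how neural learners trade off data, compute, width, and luck on this task.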