PAC-Bayesian theory. The PAC-Bayesian framework is a frequentist approach to machine learning which encodes learner bias as a "prior probability" over hypotheses. This chapter reviews basic PAC-Bayesian theory, including Catoni's basic inequality and Catoni's localization theorem. ...
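As a concrete reference point, one common Catoni-style PAC-Bayes inequality can be sketched as follows (a standard textbook form for a loss bounded in $[0, C]$; the chapter's exact statement and constants may differ). For a prior $\pi$, any fixed $\lambda > 0$, and any $\delta \in (0,1)$, with probability at least $1-\delta$ over an i.i.d. sample of size $n$, simultaneously for all posteriors $\rho$:

```latex
\mathbb{E}_{\theta \sim \rho}\!\left[ R(\theta) \right]
\;\le\;
\mathbb{E}_{\theta \sim \rho}\!\left[ r_n(\theta) \right]
\;+\; \frac{\lambda C^2}{8n}
\;+\; \frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln\frac{1}{\delta}}{\lambda},
```

where $R$ is the population risk, $r_n$ the empirical risk, and $\mathrm{KL}(\rho\,\|\,\pi)$ the Kullback-Leibler divergence between posterior and prior.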
Computer Science - Learning. This tutorial gives a concise overview of existing PAC-Bayesian theory, focusing on three generalization bounds. The first is an Occam bound which handles rules with finite-precision parameters and which states that generalization loss is near training loss when the number of ...
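The Occam bound mentioned in this snippet is commonly stated in roughly the following form (a sketch assuming a countable hypothesis class $H$ with prior $P$ and a loss in $[0,1]$; the tutorial's exact constants may differ). For any $\delta \in (0,1)$, with probability at least $1-\delta$ over an i.i.d. sample of size $m$, simultaneously for all $h \in H$:

```latex
L(h) \;\le\; \hat{L}(h) \;+\; \sqrt{\frac{\ln\frac{1}{P(h)} + \ln\frac{1}{\delta}}{2m}},
```

so hypotheses assigned high prior probability $P(h)$ (e.g. those describable with few finite-precision parameters) pay a small complexity penalty.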
This paper provides a theoretical analysis of domain adaptation based on the PAC-Bayesian theory. We propose an improvement of the previous domain adaptation bound obtained by Germain et al. in two ways. We first give another generalization bound that is tighter and easier to interpret. Moreover, we ...
PAC-Bayes and Domain Adaptation. We provide two main contributions in PAC-Bayesian theory for domain adaptation where the objective is to learn, from a source distribution, a well-perfor... A Habrard, E Morvant, F Laviolette, ... Cited: 0. Published: 2017. ...
The application of PAC-Bayesian theory to Meta-RL poses a challenge due to the existence of dependencies in the training data, which renders the independent and identically distributed (i.i.d.) assumption invalid. To address this challenge, we propose a dependency graph-based offline decomposition...
Conference on Learning Theory. McAllester, D. A. Some PAC-Bayesian theorems. In Proceedings of the Eleventh Annual Conference on Computational Learning Theory (Madison, WI, 1998), ACM, pp. 230-234.
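McAllester's PAC-Bayesian theorem from this line of work is often quoted in roughly the following form (a sketch; the exact constants vary between the 1998 and 1999 versions). For a prior $P$ and any $\delta \in (0,1)$, with probability at least $1-\delta$ over an i.i.d. sample of size $m$, simultaneously for all posteriors $Q$:

```latex
\mathbb{E}_{h \sim Q}\!\left[ L(h) \right]
\;\le\;
\mathbb{E}_{h \sim Q}\!\left[ \hat{L}(h) \right]
\;+\; \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{m}{\delta} + 2}{2m - 1}}.
```

Unlike the Occam bound for single hypotheses, this holds uniformly over all posterior distributions $Q$, with the complexity of $Q$ measured by its divergence from the prior.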
We present a comprehensive study of multilayer neural networks with binary activation, relying on the PAC-Bayesian theory. Our contributions are twofold: (i) we develop an end-to-end framework to train a binary activated deep neural network, (ii) we provide nonvacuous PAC-Bayesian generalization...
Although several methods have recently been proposed to tackle the selection problem (e.g. LEEP, H-score), these methods resort to applying heuristics that are not well motivated by learning theory. In this paper we present PACTran, a theoretically grounded family of metrics for pretrained model...
Our approach relies on tools from machine learning theory: empirical risk minimization and its convex relaxations. We propose an algorithm to compute a variational approximation of the pseudo-posterior. Thanks to the convex relaxation, the corresponding minimization problem is bi-convex, and thus ...
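A bi-convex problem (convex in each block of variables with the other fixed, but not jointly convex) is typically attacked by alternating minimization. The sketch below illustrates that structure on a toy objective of my own choosing, not the paper's actual pseudo-posterior objective:

```python
# Alternating minimization on a toy bi-convex objective (illustrative only;
# the objective is a made-up example, not the paper's variational problem).
# f(x, y) = (x*y - 1)^2 + 0.1*(x^2 + y^2) is quadratic (hence convex) in x
# for fixed y, and in y for fixed x, but not jointly convex.

def f(x, y):
    return (x * y - 1.0) ** 2 + 0.1 * (x ** 2 + y ** 2)

def alternating_minimization(x=2.0, y=0.5, iters=100):
    for _ in range(iters):
        # Exact minimizer in x for fixed y: solve 2y(xy - 1) + 0.2x = 0.
        x = y / (y ** 2 + 0.1)
        # Exact minimizer in y for fixed x (symmetric expression).
        y = x / (x ** 2 + 0.1)
    return x, y

x, y = alternating_minimization()
print(x, y, f(x, y))
```

Each half-step solves a convex subproblem in closed form, so the objective decreases monotonically; as usual for bi-convex problems, this guarantees convergence to a stationary point, not necessarily a global minimum.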