Statistics > Machine Learning
[Submitted on 21 Oct 2021 (v1), last revised 28 Feb 2025 (this version, v6)]
Title: User-friendly introduction to PAC-Bayes bounds
Abstract: Aggregated predictors are obtained by making a set of basic predictors vote according to some weights, that is, according to some probability distribution.
Randomized predictors are obtained by sampling from a set of basic predictors, according to some prescribed probability distribution.
Thus, aggregated and randomized predictors have in common that they are not defined by a minimization problem, but by a probability distribution on the set of predictors. In statistical learning theory, there is a set of tools designed to understand the generalization ability of such procedures: PAC-Bayesian or PAC-Bayes bounds.
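To make the distinction concrete, here is a minimal sketch in Python (all names are hypothetical; the threshold classifiers are purely illustrative): an aggregated predictor averages the basic predictions under a distribution rho, while a randomized predictor samples a single basic predictor from rho.

    import numpy as np

    rng = np.random.default_rng(0)

    # A small set of basic predictors; here, simple threshold classifiers.
    predictors = [lambda x, t=t: (x > t).astype(int) for t in (0.25, 0.5, 0.75)]

    # A probability distribution rho over the predictors (the "weights").
    rho = np.array([0.2, 0.5, 0.3])

    def aggregated_predict(x):
        # Weighted vote: average the basic predictions under rho,
        # then threshold the expected vote at 1/2.
        votes = np.stack([f(x) for f in predictors])  # shape (k, n)
        return (rho @ votes > 0.5).astype(int)

    def randomized_predict(x):
        # Draw one predictor at random according to rho and use it alone.
        f = predictors[rng.choice(len(predictors), p=rho)]
        return f(x)

    x = rng.uniform(size=5)
    print(aggregated_predict(x))   # deterministic once rho is fixed
    print(randomized_predict(x))   # random: depends on the sampled predictor

In both cases the learned object is the distribution rho itself, not a single minimizer, which is exactly the setting PAC-Bayes bounds are designed for.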
Since the original PAC-Bayes bounds of D. McAllester, these tools have been considerably improved in many directions (we will, for example, describe a simplified version of the localization technique of O. Catoni that was missed by the community, and later rediscovered as "mutual information bounds"). Very recently, PAC-Bayes bounds have received considerable attention: for example, there was a workshop on PAC-Bayes at NIPS 2017, "(Almost) 50 Shades of Bayesian Learning: PAC-Bayesian trends and insights", organized by B. Guedj, F. Bach and P. Germain. One of the reasons for this recent success is the successful application of these bounds to neural networks by G. Dziugaite and D. Roy.
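To give the flavor of such a bound, here is one standard statement of McAllester's type, with Maurer's refinement, for a loss bounded in [0, 1] (the exact constants used in the paper may differ): for any prior \pi on the set of predictors and any \delta \in (0, 1), with probability at least 1 - \delta over an i.i.d. sample of size n, simultaneously for all posteriors \rho,

    \mathbb{E}_{\theta \sim \rho}[R(\theta)] \le \mathbb{E}_{\theta \sim \rho}[r(\theta)] + \sqrt{ \frac{ \mathrm{KL}(\rho \| \pi) + \ln( 2\sqrt{n} / \delta ) }{ 2n } },

where R denotes the out-of-sample risk, r the empirical risk, and KL the Kullback-Leibler divergence.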
An elementary introduction to PAC-Bayes theory is still missing. This is an attempt to provide such an introduction.
Submission history
From: Pierre Alquier
[v1] Thu, 21 Oct 2021 15:50:05 UTC (58 KB)
[v2] Wed, 27 Oct 2021 06:52:35 UTC (60 KB)
[v3] Thu, 28 Oct 2021 09:16:14 UTC (60 KB)
[v4] Tue, 9 Nov 2021 02:50:51 UTC (61 KB)
[v5] Mon, 6 Mar 2023 14:49:13 UTC (63 KB)
[v6] Fri, 28 Feb 2025 07:54:28 UTC (141 KB)