Artificial intelligence and machine learning have emerged as driving forces behind transformative advancements in various fields, and have become increasingly pervasive in many industries and in daily life. As these technologies continue to gain momentum, so does the need for a deeper understanding of their underlying principles, capabilities, and limitations. In this monograph, the authors concentrate on machine learning theory and statistical learning theory, with particular emphasis on the generalization capabilities of learning algorithms.
Part I covers the foundations of information-theoretic and PAC-Bayesian generalization bounds for standard supervised learning. Part II explores applications of these generalization bounds, as well as extensions to settings beyond standard supervised learning; important application areas include neural networks, federated learning, and reinforcement learning. The monograph concludes with a broader discussion of information-theoretic and PAC-Bayesian generalization bounds as a whole.
This monograph will be of interest to students and researchers working in generalization and theoretical machine learning. It provides a comprehensive introduction to information-theoretic generalization bounds and their connection to PAC-Bayes, serving as a foundation from which the most recent developments are accessible.