Generalization of Hamiltonian algorithms
07 Nov 2024, 14:30 — Room 322, UniGe DIBRIS/DIMA, Via Dodecaneso 35
Speaker:
Andreas Maurer
Abstract:
The talk presents generalization results for a class of stochastic learning algorithms. The method applies whenever the algorithm generates a distribution that is absolutely continuous relative to some a priori measure and whose Radon-Nikodym derivative has subgaussian concentration. Applications include bounds for the Gibbs algorithm, randomizations of stable deterministic algorithms, and PAC-Bayesian bounds with data-dependent priors.
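To make the setting concrete, here is a minimal toy sketch of the Gibbs algorithm mentioned in the abstract. All specifics (the finite threshold hypothesis class, the uniform prior, the inverse temperature `beta`) are illustrative assumptions, not from the talk: the stochastic algorithm produces a posterior whose density relative to the prior is proportional to exp(-beta x empirical risk), i.e. an absolutely continuous distribution with an explicit Radon-Nikodym derivative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical finite hypothesis class: threshold classifiers on [0, 1].
thresholds = np.linspace(0.0, 1.0, 101)

# Toy sample: points labeled by a "true" threshold at 0.3.
x = rng.uniform(0.0, 1.0, 50)
y = (x > 0.3).astype(int)

# Empirical risk of each hypothesis under 0-1 loss.
preds = x[None, :] > thresholds[:, None]          # shape (101, 50)
emp_risk = (preds != y).mean(axis=1)              # shape (101,)

# Gibbs posterior relative to a uniform prior pi:
# the Radon-Nikodym derivative d(rho)/d(pi) is proportional
# to exp(-beta * empirical risk).
beta = 20.0
prior = np.full(len(thresholds), 1.0 / len(thresholds))
posterior = prior * np.exp(-beta * emp_risk)
posterior /= posterior.sum()

# The stochastic (Gibbs) algorithm samples one hypothesis
# from this posterior rather than picking a deterministic minimizer.
h = rng.choice(thresholds, p=posterior)
```

The posterior concentrates on low-risk thresholds as `beta` grows; the generalization results discussed in the talk control the gap between the risk of such a randomized output and its empirical risk.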
Bio:
Andreas Maurer has worked in machine vision, image processing, and machine learning since 1983. He is an active independent researcher in probability theory, machine learning, and statistics.