
Over-parameterization in (two-layer) neural networks: double descent, function spaces, curse of dimensionality

15 Feb 2024, 15:00 — Room 322 @ DIBRIS/DIMA, Via Dodecaneso 35, Genoa

Speaker:
Fanghui Liu — University of Warwick
Abstract:
The conventional wisdom of preferring simple models in machine learning misses the bigger picture, especially for over-parameterized neural networks (NNs), where the number of parameters is much larger than the number of training samples. Our goal is to explore the mystery behind over-parameterized models from a theoretical side. In this talk, I will discuss the role of over-parameterization in machine learning, from kernel methods to neural networks, to theoretically understand the separation between them from the perspective of function spaces. First, I will discuss random features models from the under-parameterized to the over-parameterized regime in terms of double descent. Second, I will discuss the relationship between kernel methods and neural networks via random features in certain $F_p$ spaces, which provides a refined analysis of learning in Barron spaces beyond the RKHS.
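The double-descent phenomenon mentioned above can be reproduced in a few lines: fitting a random-features regression model with the minimum-norm least-squares solution, the test error typically peaks near the interpolation threshold (number of features ≈ number of training samples) and then decreases again as the model becomes over-parameterized. The sketch below is purely illustrative and not from the talk; the data distribution, ReLU feature map, and all sizes are hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic regression task: linear target plus noise.
n_train, n_test, d = 40, 200, 5
w_true = rng.normal(size=d)
X_train = rng.normal(size=(n_train, d))
X_test = rng.normal(size=(n_test, d))
y_train = X_train @ w_true + 0.5 * rng.normal(size=n_train)
y_test = X_test @ w_true

def random_relu_features(X, W):
    """ReLU random features: phi(x) = max(0, x @ W)."""
    return np.maximum(X @ W, 0.0)

def min_norm_fit(Phi, y):
    """Minimum-norm least-squares solution via the pseudoinverse."""
    return np.linalg.pinv(Phi) @ y

# Sweep the number of random features p across the interpolation
# threshold p = n_train, averaging over random feature draws.
test_errors = {}
for p in [5, 10, 20, 38, 40, 42, 80, 200, 800]:
    errs = []
    for _ in range(20):
        W = rng.normal(size=(d, p)) / np.sqrt(d)
        theta = min_norm_fit(random_relu_features(X_train, W), y_train)
        pred = random_relu_features(X_test, W) @ theta
        errs.append(np.mean((pred - y_test) ** 2))
    test_errors[p] = float(np.mean(errs))

for p, e in test_errors.items():
    print(f"p = {p:4d}  test MSE = {e:.3f}")
```

Printing the test MSE against p shows the characteristic shape: the error spikes near p = n_train, where the interpolating solution is most sensitive to noise, and falls again in the heavily over-parameterized regime, where the minimum-norm solution implicitly regularizes.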
Bio:
Fanghui Liu is currently an assistant professor at the University of Warwick, UK. His research interests focus on kernel methods and learning theory, aiming to build the mathematical foundations of machine learning as well as theoretically oriented applications. For his work on learning theory, he was selected for the AAAI New Faculty Highlights 2024, named a Rising Star in AI (KAUST 2023), and presented tutorials at ICASSP 2023 and CVPR 2023. Prior to his current position, Fanghui worked as a postdoctoral researcher at EPFL, Switzerland, and at KU Leuven, Belgium. He received his PhD from Shanghai Jiao Tong University, China, in 2019.
