
Spectral complexity of deep neural networks

16 Apr 2025, 13:00 — Room 322, DIBRIS/DIMA, Via Dodecaneso 35

Speaker:
Stefano Vigogna — Università di Roma Tor Vergata
Abstract:
It is well-known that randomly initialized neural networks weakly converge to Gaussian processes as the width of all layers goes to infinity. In my talk, I will propose to use the spectrum of the limiting Gaussian process kernel to characterize the complexity of the network architecture. In particular, I will define sequences of random variables associated with the spectrum, and provide a full characterization of the network complexity in terms of the asymptotic distribution of these sequences as the depth diverges. On this basis, I will classify neural networks as low-disorder, sparse, and high-disorder. I will show how this classification highlights a number of distinct features for standard activation functions, and in particular sparsity properties for the ReLU. This is joint work with Simmaco Di Lillo, Domenico Marinucci and Michele Salvi.
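As a rough illustration of the kind of object the abstract refers to (not code from the talk), the sketch below computes the infinite-width limiting Gaussian process kernel of a ReLU network on a small input set via the standard NNGP recursion, then inspects how its eigenvalue spectrum changes with depth. The variance parameters sigma_w2 and sigma_b2 and all function names are hypothetical choices for illustration only.

```python
# Minimal sketch (not from the talk): depth-L limiting NNGP kernel of a ReLU
# network on a small input set, followed by its eigenvalue spectrum.
# Assumptions: standard NNGP recursion, He-style variances sigma_w2=2, sigma_b2=0.
import numpy as np

def relu_nngp_step(K, sigma_w2=2.0, sigma_b2=0.0):
    """One layer of the ReLU NNGP kernel recursion (arc-cosine kernel)."""
    diag = np.sqrt(np.diag(K))
    norm = np.outer(diag, diag)
    cos_theta = np.clip(K / norm, -1.0, 1.0)
    theta = np.arccos(cos_theta)
    # E[ReLU(u) ReLU(v)] for (u, v) jointly Gaussian with covariance given by K
    J = (np.sin(theta) + (np.pi - theta) * cos_theta) / (2 * np.pi)
    return sigma_w2 * norm * J + sigma_b2

def limiting_kernel(X, depth, sigma_w2=2.0, sigma_b2=0.0):
    """Limiting GP kernel after `depth` hidden layers, on the rows of X."""
    K = sigma_w2 * (X @ X.T) / X.shape[1] + sigma_b2  # first-layer kernel
    for _ in range(depth):
        K = relu_nngp_step(K, sigma_w2, sigma_b2)
    return K

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))  # 50 inputs in R^10
for L in (1, 5, 20):
    eigs = np.linalg.eigvalsh(limiting_kernel(X, L))[::-1]  # descending order
    print(f"depth {L:2d}: top eigenvalues {np.round(eigs[:4], 3)}")
```

How the resulting spectra behave as the depth diverges, and what that implies about the disorder or sparsity of the architecture, is the subject of the talk.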
Bio:
Stefano Vigogna is an Associate Professor at the Department of Mathematics of the University of Rome Tor Vergata. He received his PhD in Mathematics at the University of Genova. He was a Visiting Assistant Professor at Duke University, an Assistant Research Professor at Johns Hopkins University, and a Researcher at MaLGa, University of Genova. His research interests focus on mathematical aspects of Machine Learning.
