Thoughts on today's learning theory - Tomaso Poggio #malgaseminar
04 Feb 2022, 14:00 — Room 706, Via Dodecaneso 35, Genoa, IT. The live stream will be available on the 706DIMA YouTube channel.
Speaker:
Tomaso Poggio — MIT
Abstract:
I will describe a personal perspective on the current state of key problems in learning theory. In addition to CNNs, several architectures with good performance have emerged, such as transformers, perceivers, and MLP-mixers. Is there a common reason for their good performance? A natural conjecture is that these modern architectures are good at approximating, learning, and optimizing input-output mappings that can be represented by "sparse" functions, that is, functions that are effectively low-dimensional. In particular, such target functions are typically compositional: their function graph has nodes of dimensionality at most k, with k << d, where d is the dimensionality of the function's domain.
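As a minimal illustration of the compositional structure described in the abstract, here is a short Python sketch (not from the talk; all function names are hypothetical): a function on d = 8 variables built as a binary tree of constituent nodes, each depending on at most k = 2 inputs.

```python
import math

# Minimal sketch: a compositional function with domain dimensionality
# d = 8 whose function graph is a binary tree, so every constituent
# node has dimensionality at most k = 2, with k << d.

def node(a: float, b: float) -> float:
    """A constituent function of dimensionality k = 2 (hypothetical choice)."""
    return math.tanh(a + 2.0 * b)

def f(x: list[float]) -> float:
    """f: R^8 -> R, built entirely by composing 2-dimensional nodes."""
    assert len(x) == 8
    # First layer: four nodes, each seeing only 2 of the 8 inputs.
    h11, h12 = node(x[0], x[1]), node(x[2], x[3])
    h13, h14 = node(x[4], x[5]), node(x[6], x[7])
    # Second layer: two nodes over the first-layer outputs.
    h21, h22 = node(h11, h12), node(h13, h14)
    # Root node: f depends on all d = 8 inputs, yet no single node
    # ever takes more than k = 2 arguments.
    return node(h21, h22)

print(f([0.1 * i for i in range(8)]))
```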