The geometry, combinatorics, and representation theory of neural networks

Friday, October 24, 2025 - 15:30 to 16:30

Thackeray 704

Speaker Information
Elisenda Grigsby
Boston College

Abstract or Additional Information

Deep neural networks are a class of parameterized functions that have proven remarkably successful at making predictions about unseen data from finite labeled data sets. They do so even in settings where classical intuition suggests they ought to be overfitting (that is, memorizing) the data.

I will begin by describing the structure of neural networks and how they learn. I will then advertise one of the theoretical questions animating the field: how does the relationship between the number of parameters and the size of the data set impact the dynamics of learning? Along the way, I will emphasize the many ways in which geometry, combinatorics, and even representation theory play a role in the field.
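The parameter-count-versus-data-set-size tension in the abstract can be made concrete with a toy experiment (this sketch is my own illustration, not material from the talk): a one-hidden-layer ReLU network with 49 parameters is trained by full-batch gradient descent on just 5 labeled points. Classical intuition says such an overparameterized model should simply memorize, yet models of this kind often generalize well in practice. All names and hyperparameters below (hidden width `h`, learning rate, step count) are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny labeled data set: 5 points sampled from a smooth target function.
X = np.linspace(-1.0, 1.0, 5).reshape(-1, 1)
y = np.sin(3.0 * X)

h = 16  # hidden width (arbitrary choice for illustration)
W1 = rng.normal(scale=1.0, size=(h, 1))
b1 = np.zeros(h)
W2 = rng.normal(scale=1.0 / np.sqrt(h), size=(h, 1))
b2 = np.zeros(1)

# Far more parameters (16 + 16 + 16 + 1 = 49) than data points (5).
n_params = W1.size + b1.size + W2.size + b2.size
print(f"{n_params} parameters vs {len(X)} data points")

def forward(X):
    z = X @ W1.T + b1         # pre-activations, shape (n, h)
    a = np.maximum(z, 0.0)    # ReLU nonlinearity
    return z, a, a @ W2 + b2  # predictions, shape (n, 1)

# Full-batch gradient descent on mean squared error, by hand-coded backprop.
lr = 0.05
for step in range(5000):
    z, a, yhat = forward(X)
    g = 2.0 * (yhat - y) / len(X)   # d(loss)/d(yhat) for MSE
    gW2 = a.T @ g
    gb2 = g.sum(axis=0)
    gz = (g @ W2.T) * (z > 0)       # backprop through the ReLU
    gW1 = gz.T @ X
    gb1 = gz.sum(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, _, yhat = forward(X)
final_mse = float(np.mean((yhat - y) ** 2))
print("final training MSE:", final_mse)
```

With roughly ten times as many parameters as data points, gradient descent drives the training loss close to zero: the network interpolates the data. Which interpolant the dynamics select, and why it so often predicts well on unseen inputs, is the kind of question the talk addresses.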