Please note this event occurred in the past.
October 03, 2025 11:00 am - 12:00 pm ET
Seminars, Mathematics of Machine Learning
LGRT 1685
Abstract: Recent experimental breakthroughs have paved the way for collecting “big” neural datasets through the simultaneous recording of the activity of thousands of neurons. However, our understanding of the fundamental principles governing neural activity at the population level remains sparse and requires the establishment of appropriate theoretical frameworks. A parallel issue emerges in the context of artificial neural networks that operate efficiently via the interaction of billions of artificial neurons. Therefore, a natural question in both contexts is: what low-dimensional metrics (a.k.a. order parameters) best capture large-scale neural information processing? I will explore this question at two levels: characterizing the statistical structure of neural representations and the dynamics of learning.
In the first part of the talk (Ref. 1), I will present a distributional version of maximum entropy modeling aimed at extracting informative collective directions in neural populations. This approach provides a principled way to reduce the number of model parameters to a range that matches the number of available samples, while directly identifying the appropriate observables to measure from the data. The resulting model is characterized by low-dimensional interactions and is closely related to latent variable models. 
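For orientation, the generic maximum entropy construction that such approaches build on (a standard textbook formulation, not the specific model of Ref. 1) reads as follows: given observables O_a measured from data, the least-structured distribution consistent with their averages is

P(\mathbf{x}) = \frac{1}{Z} \exp\!\Big( \sum_a \lambda_a \, O_a(\mathbf{x}) \Big), \qquad \langle O_a \rangle_P = \langle O_a \rangle_{\mathrm{data}},

where \mathbf{x} is the population activity, the Lagrange multipliers \lambda_a are fit to match the measured averages, and Z normalizes the distribution. In the distributional version described above, the observables themselves (e.g., informative collective directions in activity space) are identified from the data rather than fixed in advance.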
In the second part of the talk (Ref. 2), I will present a framework that combines statistical physics and control theory to derive optimal training protocols in neural network models. By reducing high-dimensional dynamics to a few order parameters, we cast learning schedules as an optimal control problem, yielding nontrivial yet interpretable strategies.
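Schematically, this problem class (stated generically here, not as the specific derivation of Ref. 2) takes the following form: if the high-dimensional training dynamics reduce to a few order parameters q(t) driven by a control u(t), such as a learning-rate schedule, the optimal protocol solves

\min_{u(t)} \; C\big(q(T)\big) \quad \text{subject to} \quad \frac{dq}{dt} = f\big(q(t), u(t)\big),

where C is a terminal cost such as the generalization error. Problems of this form can be handled with standard tools of optimal control, e.g., Pontryagin's maximum principle.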
 
References:
1. Di Carlo, Luca, et al. "Neural subspaces, minimax entropy, and mean-field theory for networks of neurons." arXiv preprint arXiv:2508.02633 (2025).
2. Mignacco, Francesca, and Francesco Mori. "A statistical physics framework for optimal learning." arXiv preprint arXiv:2507.07907 (2025).
 
Short bio: Francesca Mignacco is a Postdoctoral Research Fellow at the Center for the Physics of Biological Function, a joint effort between Princeton University and City University of New York. Her research lies at the crossroads of statistical physics, machine learning, and computational neuroscience. She develops principled methods and models to uncover low-dimensional structure in neural populations and investigate the mechanisms underlying neural dynamics and meta-learning.
 
Zoom link: https://umass-amherst.zoom.us/j/93564158764