Rey-Bellet, Katsoulakis Will Develop More Trustworthy Machine Learning Algorithms for Predicting Extreme Events
Luc Rey-Bellet and Markos Katsoulakis, both of mathematics and statistics, were recently awarded a three-year, $370,000 grant from the National Science Foundation (NSF) to build new mathematical tools that will lead to more reliable and predictive models for extreme events, in particular machine learning and artificial intelligence algorithms.
They say a central question is how much trust users can put in the predictions of a given model or algorithm, which is crucial when there are significant uncertainties associated with the nature of the model itself. The two plan to develop a systematic mathematical framework based on information theory to address these issues.
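The article does not give the framework's details, but one standard information-theoretic tool of this flavor bounds how far a prediction can drift when the true model lies within some Kullback-Leibler distance of the baseline model. The sketch below is a generic illustration of such a worst-case bound, not necessarily the exact construction in the project; the function name and the sample observable are illustrative.

```python
import numpy as np

def worst_case_bound(samples, eta, c_grid=None):
    """Upper bound on how much the expected value of an observable f can
    shift across all alternative models within KL divergence eta of the
    baseline, via the classical variational (Gibbs) inequality:

        sup_{KL(Q||P) <= eta} E_Q[f] - E_P[f]
            <= inf_{c > 0} (1/c) * ( log E_P[exp(c*(f - E_P[f]))] + eta )
    """
    f = np.asarray(samples, dtype=float)
    f = f - f.mean()                       # center the observable under P
    if c_grid is None:
        c_grid = np.logspace(-2, 1, 200)   # candidate tilting parameters c
    # Monte Carlo estimate of log E_P[exp(c*f)] for each c on the grid
    log_mgf = np.array([np.log(np.mean(np.exp(c * f))) for c in c_grid])
    return np.min((log_mgf + eta) / c_grid)

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)               # baseline samples of f (illustrative)
bound = worst_case_bound(x, eta=0.1)       # ≈ sqrt(2*eta) for a Gaussian observable
print(bound)
```

The bound is "goal-oriented" in the sense the researchers describe: it quantifies the effect of model uncertainty directly on the quantity being predicted, rather than on the model as a whole.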
The researchers explain that phenomena driven by extreme events, such as power grid and energy storage failures or catastrophic storms, are notoriously difficult to understand, let alone predict. Because of their rarity, there is usually insufficient data to build a model that will reliably and accurately predict new rare events, for example, to distinguish between a 50-year and a 100-year event.
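To make the 50-year-versus-100-year difficulty concrete, the standard approach in extreme value statistics fits a distribution to a short record of annual maxima and extrapolates to return levels beyond the record. The sketch below (not from the article; a generic illustration using a Gumbel fit by the method of moments) shows how a few decades of data must be stretched to estimate events rarer than anything observed, which is exactly where model uncertainty dominates.

```python
import numpy as np

EULER_GAMMA = 0.57721566490153286

def gumbel_return_level(annual_maxima, T):
    """T-year return level from a Gumbel fit (method of moments).
    The returned level is exceeded with probability 1/T in any given year."""
    x = np.asarray(annual_maxima, dtype=float)
    beta = np.sqrt(6.0) * x.std(ddof=1) / np.pi    # scale parameter
    mu = x.mean() - EULER_GAMMA * beta             # location parameter
    return mu - beta * np.log(-np.log(1.0 - 1.0 / T))

rng = np.random.default_rng(1)
# 60 years of synthetic annual maxima, drawn from a true Gumbel(mu=10, beta=2)
maxima = 10.0 - 2.0 * np.log(-np.log(rng.uniform(size=60)))

rl50 = gumbel_return_level(maxima, 50)    # extrapolating near the record length
rl100 = gumbel_return_level(maxima, 100)  # extrapolating well beyond it
print(rl50, rl100)
```

With only 60 observations, both estimates inherit substantial sampling and model-form uncertainty, and the gap between them is precisely the kind of prediction the project aims to make trustworthy.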
The researchers would also like to quantify "how uncertainty in building a model or an algorithm affects our predictions," says Rey-Bellet. They plan to build stress tests to assess the effects of uncertainties, as well as bias-control methods for safety-critical problems such as rogue ocean waves or power grid failures. They also plan to provide systematic tools to train new statistical learning models from data and to provide performance guarantees.
Applied mathematics has a key role to play in developing trustworthy algorithms and building bridges between theory, computation, and applications, they point out. Katsoulakis says, "We want to make models more trustworthy. We'll study what makes them do well and what makes them underperform using new uncertainty quantification tools."
He adds, "The emphasis here is on extreme events and how to predict them in the presence of data and model uncertainties. Due to tremendous advances in our computational capabilities, in recent years there's been a lot of emphasis on creating very complex models. We need to be able to judge whether they are actually predicting real extreme events or whether there is some fundamental error in our understanding."
The investigators are affiliated with the campus’s NSF Transdisciplinary Research in Principles of Data Science (TRIPODS) project, a joint effort between their department and computer science.