Siting Liu: In-Context Learning for Differential Equations
Abstract
This talk presents In-Context Operator Networks (ICON), a novel neural-network-based framework designed to learn and apply operators directly from prompted data during inference, without requiring any weight updates. Traditional methods rely on neural networks to approximate solutions for specific equations or operators, necessitating retraining or fine-tuning for each new problem. In contrast, ICON trains a single neural network to serve as a general operator learner, enabling it to adapt to new problems with minimal effort. By leveraging shared structures across operators, ICON requires only a few demonstration examples in the prompt to learn a new operator effectively. Our results demonstrate ICON's effectiveness as a few-shot operator learner across a diverse range of differential equation problems, including forward and inverse tasks for ordinary differential equations (ODEs), partial differential equations (PDEs), and mean-field control (MFC) problems. Furthermore, we highlight ICON's generalization capabilities, showcasing its potential as a powerful tool for solving complex operator learning challenges in scientific computing and beyond. This is joint work with Liu Yang (NUS), Tingwei Meng (UCLA), and Stanley Osher (UCLA).
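To make the prompting idea concrete, the sketch below shows one plausible way an ICON-style prompt could be assembled: a few demonstration (condition, solution) pairs of a discretized operator, followed by a query condition, flattened into a single token sequence for a transformer. This is a hypothetical illustration, not the authors' actual implementation; the function name `build_prompt`, the token layout, and the role encoding are all assumptions made for exposition.

```python
import numpy as np

def build_prompt(demos, query_condition):
    """Flatten demonstration (condition, solution) pairs plus a query
    condition into one token sequence. Each function is sampled at
    discrete points, and each (x, value, role) triple becomes a token.
    This layout is an illustrative assumption, not the paper's API."""
    tokens = []
    for cond, sol in demos:
        for x, u in cond:
            tokens.append([x, u, 0.0])   # role 0: demo condition point
        for x, u in sol:
            tokens.append([x, u, 1.0])   # role 1: demo solution (QoI) point
    for x, u in query_condition:
        tokens.append([x, u, 2.0])       # role 2: query condition point
    return np.array(tokens, dtype=np.float32)

# Toy operator u -> 2*u; the two demos implicitly define the scaling factor,
# which the network would have to infer in-context at inference time.
xs = np.linspace(0.0, 1.0, 5)
demos = [
    (list(zip(xs, np.sin(xs))), list(zip(xs, 2.0 * np.sin(xs)))),
    (list(zip(xs, xs**2)),      list(zip(xs, 2.0 * xs**2))),
]
query = list(zip(xs, np.cos(xs)))

prompt = build_prompt(demos, query)
print(prompt.shape)  # (25, 3): 2 demos x (5 + 5) tokens, plus 5 query tokens
```

In this sketch the network never sees the operator's weights change: a new operator is specified entirely by swapping in different demonstration pairs, which mirrors the "learning from prompted data during inference" behavior described in the abstract.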