September 05, 2025 11:00 am - 12:00 pm ET
Seminars, Mathematics of Machine Learning
LGRT 1685

Abstract

Multi-scale models provide a simple yet powerful framework for approximating and extending functions defined over grid-based or scattered datasets. In this talk, we focus on multi-scale kernel methods, where convolution with Gaussian kernels of progressively decreasing bandwidths yields a multi-scale representation. In statistics, this approach is closely related to the Nadaraya–Watson estimator. The resulting high-order approximation is constructed by iterating until the difference between the function and its approximation falls below a predefined error threshold.
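The iteration described above — smoothing the current residual with a Gaussian kernel, adding the result to the approximation, and shrinking the bandwidth until the error threshold is met — can be sketched in a few lines. This is an illustrative one-dimensional sketch, not the speaker's implementation; the function names, the geometric bandwidth decay, and the default parameters are our assumptions.

```python
import numpy as np

def nw_smooth(x_train, y_train, x_eval, h):
    """Nadaraya-Watson estimate: Gaussian-kernel weighted average of y_train at x_eval."""
    d2 = (x_eval[:, None] - x_train[None, :]) ** 2
    w = np.exp(-d2 / (2.0 * h ** 2))
    return (w @ y_train) / w.sum(axis=1)

def multiscale_approx(x, y, x_eval, h0=1.0, decay=0.5, tol=1e-4, max_levels=12):
    """Accumulate Nadaraya-Watson smooths of the residual at progressively
    decreasing bandwidths until the training residual falls below tol."""
    residual = y.astype(float).copy()
    approx = np.zeros_like(x_eval, dtype=float)
    h = h0
    for _ in range(max_levels):
        approx += nw_smooth(x, residual, x_eval, h)   # refine the approximation
        residual -= nw_smooth(x, residual, x, h)      # update residual on the data
        if np.max(np.abs(residual)) < tol:            # predefined error threshold
            break
        h *= decay                                    # decrease the bandwidth
    return approx
```

Evaluating at points other than the training sites (`x_eval` off the grid) is what gives the extension property mentioned in the abstract.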

We then demonstrate how this method can be extended in several applied directions. First, we describe a modified version of the approximation that uses a locally adaptive scale. Next, we show how the method can enhance the accuracy of coarse-grid computations arising from a high-order finite difference scheme. This is achieved by interpolating and extending the coarse-grid results to the fine grid, while preserving convergence rates and reducing computation time. Finally, we present a pipeline for density-based data augmentation. It leverages the multi-scale construction to detect sparse regions in the data and generate synthetic data points there. Experimental results demonstrate the effectiveness of this augmentation scheme in forecasting tasks, reducing RMSE by up to 18% and yielding statistically significant improvements in MAE and MASE.
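The augmentation step — detect sparse areas via a density estimate, then generate synthetic points there — can only be guessed at from the abstract. As a purely hypothetical single-scale sketch (the kernel density estimate, the inverse-density sampling rule, and all names are our assumptions, not the speaker's pipeline), it could read:

```python
import numpy as np

def kernel_density(x, data, h):
    """Gaussian kernel density estimate (up to a constant) at the points x."""
    d2 = (x[:, None] - data[None, :]) ** 2
    return np.exp(-d2 / (2.0 * h ** 2)).mean(axis=1)

def augment_sparse(data, h=0.5, n_new=50, seed=None):
    """Sample anchor points with probability inversely proportional to the
    estimated density (favoring sparse regions), then jitter them to
    create synthetic samples."""
    rng = np.random.default_rng(seed)
    density = kernel_density(data, data, h)
    p = 1.0 / density
    p /= p.sum()                                  # high weight in sparse regions
    anchors = rng.choice(data, size=n_new, p=p)   # anchors in low-density areas
    return anchors + rng.normal(scale=h / 2, size=n_new)  # jitter around anchors
```

In the multi-scale setting described in the talk, one would presumably repeat this detection across the sequence of bandwidths rather than at a single fixed `h`.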


Bio: Neta Rabin is a professor at the School of Industrial & Intelligence Systems Engineering, Tel-Aviv University. Her research focuses on understanding complex systems using data-driven models. She is interested in manifold learning, spectral methods, multi-scale analysis, and computational harmonic analysis.