From learning differential operators to learning algorithms
Data Science Seminar
Houman Owhadi
Caltech
Abstract
Most scientific and engineering challenges can be organized along a complexity ladder. Right above interpolation lies the learning of differential operators and their solution operators, an area where Gaussian Process/Kernel methods come with rigorous guarantees and achieve state-of-the-art data-efficiency and robustness. This talk then ascends to the ladder’s current frontier: algorithm synthesis. Here we introduce a computational‑language–processing framework that tokenizes low‑level computational actions and uses an ensemble‑based Monte‑Carlo Tree Search combined with reinforcement learning to assemble algorithms tailored to individual problem instances. We conclude by discussing where this ladder is taking us. The first part of this talk is based on joint work with Yasamin Jalalian, Juan Felipe Osorio Ramirez, Alexander Hsu, and Bamdad Hosseini. The second part is based on joint work with Theo Bourdais, Abeynaya Gnanasekaran, and Tuhin Sahai.
Bio
Houman Owhadi is the IBM Professor of Applied and Computational Mathematics and Control and Dynamical Systems at the California Institute of Technology. His expertise includes uncertainty quantification, numerical approximation, statistical inference/learning, data assimilation, stochastic and multiscale analysis, and scientific machine learning. He was a plenary speaker at SIAM CSE 2015, SIAM UQ 2024, and EMI 2025, and a tutorial speaker at SIAM UQ 2016. He received the 2019 SIAM Germund Dahlquist Prize. He is a SIAM Fellow (class of 2022) and a Vannevar Bush Fellow (class of 2024).