IMA Data Science Seminar: Quantile-based Iterative Methods for Corrupted Systems of Linear Equations
Data science seminars hosted by the Institute for Mathematics and Its Applications (IMA) take place on Tuesdays from 1:25 p.m. to 2:25 p.m.
This week, Elizaveta Rebrova (University of California, Los Angeles) will give a lecture titled "Quantile-based Iterative Methods for Corrupted Systems of Linear Equations".
Registration is required to access the Zoom webinar.
One of the most ubiquitous problems arising across the sciences is that of solving large-scale systems of linear equations Ax = b. When it is infeasible to solve the system directly by inversion, lightweight and scalable iterative methods can be used instead, such as the Randomized Kaczmarz (RK) algorithm or Stochastic Gradient Descent (SGD). The classical extensions of RK/SGD to noisy (inconsistent) systems show that the iterates of the method still approach the least-squares solution of the system up to a certain convergence horizon (which depends on the noise size). However, in order to handle large, sparse, potentially adversarial corruptions, one needs to modify the algorithms to avoid corruptions rather than try to tolerate them -- and quantiles of the residual provide a natural way to do so. In this talk, I present QuantileRK and QuantileSGD, versions of the two classical iterative algorithms aimed at linear systems with an adversarially corrupted vector b. Our methods tolerate up to 50% incoherent corruptions, and up to 20% adversarial corruptions (those that consistently create an "alternative" solution of the system). Our theoretical analysis shows that, under some standard assumptions on the measurement model, despite corruptions of any size, both methods converge to the true solution at exactly the same rate as RK on an uncorrupted system, up to an absolute constant. This is joint work with Jamie Haddock, Deanna Needell, and Will Swartworth.
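The residual-quantile idea in the abstract can be illustrated with a short sketch: at each Kaczmarz step, the sampled row's projection is applied only if that row's residual magnitude falls below a chosen quantile of the residuals, so equations whose right-hand-side entries look corrupted are skipped rather than averaged in. This is a minimal, hypothetical illustration (the function name, parameters, and the use of the full residual vector for the quantile are assumptions for clarity, not the paper's exact algorithm, which may subsample residuals):

```python
import numpy as np

def quantile_rk(A, b, q=0.7, n_iters=5000, x0=None, rng=None):
    """Sketch of a quantile-thresholded Randomized Kaczmarz iteration.

    At each step, sample a row i uniformly at random; apply the usual
    Kaczmarz projection onto the hyperplane {x : a_i^T x = b_i} only if
    |a_i^T x - b_i| is at most the q-quantile of all residual magnitudes.
    Rows with unusually large residuals (likely corrupted entries of b)
    are thus avoided instead of tolerated.
    """
    rng = np.random.default_rng(rng)
    m, n = A.shape
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float).copy()
    row_norms_sq = np.einsum('ij,ij->i', A, A)  # squared row norms ||a_i||^2
    for _ in range(n_iters):
        i = rng.integers(m)
        residuals = np.abs(A @ x - b)          # full residual (sketch only)
        threshold = np.quantile(residuals, q)
        r_i = A[i] @ x - b[i]
        if abs(r_i) <= threshold:              # skip suspiciously large residuals
            x -= (r_i / row_norms_sq[i]) * A[i]
    return x
```

On a tall Gaussian system with a small fraction of entries of b replaced by large corruptions, this thresholded iteration typically recovers the true solution, whereas plain RK stalls at a horizon set by the corruption size.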
Liza Rebrova is currently a postdoctoral scholar at the Computational Research Division of Lawrence Berkeley National Laboratory. From 2018 to the end of 2020, she was an Assistant Adjunct Professor in the UCLA Department of Mathematics (Computational and Applied Math Group), mentored by Professors Deanna Needell and Terence Tao. She received a Ph.D. in Mathematics from the University of Michigan in 2018 (advised by Prof. Roman Vershynin) and a Specialist degree from Moscow State University in 2012. Her research involves interactions among high-dimensional probability, random matrix theory, mathematical data science, and numerical linear algebra, with the main goal of studying large high-dimensional data objects in the presence of randomness and developing randomized algorithms that efficiently process complex data. She is a recipient of the Allen Shields Memorial Fellowship (University of Michigan, 2018) and of postdoctoral sponsorship by Capital Fund Management (UCLA, 2018-2020).