Abstract
We develop theoretically guaranteed stochastic methods for outlier-robust PCA, which seeks an underlying low-dimensional linear subspace from a dataset corrupted with outliers. Through a novel convergence analysis, we show that our methods, which are variants of stochastic geodesic gradient descent over the Grassmannian manifold, converge and recover an underlying subspace in various regimes. The main application of this work is an effective differentially private algorithm for outlier-robust PCA that uses a Gaussian noise mechanism within the stochastic gradient method. Our results emphasize the advantages of these nonconvex methods over a competing convex approach to outlier-robust PCA in the differentially private setting. Experiments on synthetic and stylized data verify these results.
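As a rough illustration of the approach the abstract describes, the sketch below combines a subsampled least-absolute-deviations gradient on the Grassmannian with per-sample clipping and Gaussian noise. It is assembled from standard formulas for Grassmannian geodesics, not from the paper itself; the function name `dp_stochastic_ggd` and the parameters `step`, `clip`, and `sigma` are hypothetical, and the noise calibration here does not track a formal privacy budget.

```python
import numpy as np

def dp_stochastic_ggd(X, d, n_iters=200, batch=32, step=0.1,
                      clip=1.0, sigma=0.5, seed=0):
    """Sketch of DP stochastic geodesic gradient descent for
    outlier-robust PCA. X: (n, D) data; d: subspace dimension.
    Returns a (D, d) orthonormal basis estimate."""
    rng = np.random.default_rng(seed)
    n, D = X.shape
    V, _ = np.linalg.qr(rng.standard_normal((D, d)))  # random orthonormal start

    for t in range(n_iters):
        idx = rng.choice(n, size=min(batch, n), replace=False)
        G = np.zeros((D, d))
        for x in X[idx]:
            r = x - V @ (V.T @ x)                 # residual off the subspace
            nr = np.linalg.norm(r)
            if nr < 1e-12:
                continue                          # point lies on the subspace
            g = -np.outer(r / nr, x @ V)          # Riemannian grad of ||r|| at V
            G += g * min(1.0, clip / np.linalg.norm(g))  # per-sample clipping
        # Gaussian mechanism: noise calibrated to the clipping threshold,
        # then projected back to the tangent (horizontal) space at V
        G += sigma * clip * rng.standard_normal((D, d))
        G -= V @ (V.T @ G)
        # move along the geodesic in the descent direction -G
        U, S, Wt = np.linalg.svd(-G / len(idx), full_matrices=False)
        eta = step / np.sqrt(t + 1)               # decaying step size
        V = (V @ Wt.T) @ np.diag(np.cos(eta * S)) @ Wt \
            + U @ np.diag(np.sin(eta * S)) @ Wt
        V, _ = np.linalg.qr(V)                    # re-orthonormalize for stability
    return V
```

The geodesic update follows the usual closed form on the Grassmannian: for a tangent direction with thin SVD UΣWᵀ, the step is V W cos(ηΣ) Wᵀ + U sin(ηΣ) Wᵀ. The decaying step size is one common choice for stochastic geodesic methods, not necessarily the schedule analyzed in the paper.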
| Field | Value |
|---|---|
| Original language | English (US) |
| Pages (from-to) | 173-188 |
| Number of pages | 16 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 190 |
| State | Published - 2022 |
| Event | 3rd Annual Conference on Mathematical and Scientific Machine Learning, MSML 2022, Beijing, China (Aug 15, 2022 to Aug 17, 2022) |
Bibliographical note
Publisher Copyright: © 2022 T. Maunu, C. Yu & G. Lerman.