Stochastic and Private Nonconvex Outlier-Robust PCA

Tyler Maunu, Chenyu Yu, Gilad Lerman

Research output: Contribution to journal › Conference article › peer-review

Abstract

We develop theoretically guaranteed stochastic methods for outlier-robust PCA, which seeks an underlying low-dimensional linear subspace in a dataset corrupted with outliers. Through a novel convergence analysis, we show that our methods, variants of stochastic geodesic gradient descent over the Grassmannian manifold, converge and recover an underlying subspace in various regimes. The main application of these methods is an effective differentially private algorithm for outlier-robust PCA that applies a Gaussian noise mechanism within the stochastic gradient method. Our results highlight the advantages of these nonconvex methods over a convex approach to outlier-robust PCA in the differentially private setting. Experiments on synthetic and stylized data verify these results.
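To make the abstract's two ingredients concrete, the sketch below illustrates stochastic gradient descent on the Grassmannian with a Gaussian noise mechanism. It is an assumption-laden illustration, not the paper's algorithm: it minimizes the least-absolute-deviations energy F(V) = Σᵢ ‖xᵢ − V Vᵀ xᵢ‖, uses a QR retraction in place of an exact geodesic step, and the function name and all parameter values (eta, batch_size, clip, sigma) are hypothetical.

import numpy as np

def private_stochastic_grassmann_gd(X, d, eta=0.1, n_iters=200,
                                    batch_size=32, clip=1.0, sigma=0.0,
                                    seed=None):
    """Illustrative sketch (hypothetical names/values, not the paper's method).
    X: (n, D) data matrix; d: target subspace dimension.
    Returns an orthonormal basis V of shape (D, d)."""
    rng = np.random.default_rng(seed)
    n, D = X.shape
    V, _ = np.linalg.qr(rng.standard_normal((D, d)))  # random orthonormal start
    for _ in range(n_iters):
        batch = X[rng.choice(n, size=min(batch_size, n), replace=False)]
        grad = np.zeros((D, d))
        for x in batch:
            r = x - V @ (V.T @ x)                 # residual off the subspace
            dist = np.linalg.norm(r)
            if dist > 1e-12:
                g = -np.outer(r, V.T @ x) / dist  # per-sample Riemannian gradient
                g_norm = np.linalg.norm(g)
                if g_norm > clip:                 # clip to bound per-sample sensitivity
                    g *= clip / g_norm
                grad += g
        if sigma > 0:                             # Gaussian noise mechanism
            grad += sigma * clip * rng.standard_normal((D, d))
        grad /= len(batch)
        grad -= V @ (V.T @ grad)                  # project onto the tangent space
        V, _ = np.linalg.qr(V - eta * grad)       # QR retraction back to the manifold
    return V

Per-sample clipping followed by noise scaled to the clip norm is the standard DP-SGD recipe for bounding sensitivity; the tangent-space projection and QR retraction keep the iterates on the Grassmannian, approximating a geodesic step.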

Original language: English (US)
Pages (from-to): 173-188
Number of pages: 16
Journal: Proceedings of Machine Learning Research
Volume: 190
State: Published - 2022
Event: 3rd Annual Conference on Mathematical and Scientific Machine Learning, MSML 2022 - Beijing, China
Duration: Aug 15, 2022 - Aug 17, 2022

Bibliographical note

Publisher Copyright:
© 2022 T. Maunu, C. Yu & G. Lerman.

