# MEGA Data Lab

## Model Estimation, Graphical Algorithms, Data Analysis Lab

EECS, University of California, Irvine, CA 92697.

**Non-convex Robust PCA**

**“Non-convex Robust PCA”** by P. Netrapalli, U. N. Niranjan, S. Sanghavi, A. Anandkumar, P. Jain. *To appear in NIPS 2014.*

Paper. Talk. Slides. Code.

We propose a new provable method for robust PCA, where the task is to recover a
low-rank matrix corrupted by sparse perturbations. Our method consists of simple
alternating projections onto the sets of low-rank and sparse matrices, with
intermediate denoising steps. We prove correct recovery of the low-rank and
sparse components under tight recovery conditions that match those of
state-of-the-art convex relaxation techniques. Our method is extremely simple to implement
and has low computational complexity. For an $m \times n$ input matrix (say
$m \leq n$), our method runs in $O(r^2 m n \log(1/\epsilon))$ time, where $r$ is the rank
of the low-rank component and $\epsilon$ is the desired accuracy. In contrast, convex relaxation
methods have a running time of $O(m^2 n / \epsilon)$, which does not scale to large problem
instances. Our algorithm thus provably solves the robust PCA problem in running
time that is only a factor of $r$ away from the $O(rmn \log(1/\epsilon))$ running time of
ordinary (non-robust) PCA. Our analysis represents one of the few instances of global
convergence guarantees for non-convex methods.
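The two projections above are cheap to compute: projecting onto sparse matrices is entrywise hard thresholding, and projecting onto rank-$r$ matrices is a truncated SVD. A minimal sketch of the alternating scheme, in NumPy, is shown below; the function names, the single-stage rank schedule, and the exact form of the decaying threshold `zeta` are simplifying assumptions for illustration (the paper's algorithm increases the target rank stagewise from 1 to $r$ and uses a specific threshold schedule for its guarantees):

```python
import numpy as np

def hard_threshold(M, zeta):
    # Projection onto sparse matrices: keep entries with magnitude above
    # the threshold zeta, zero out the rest.
    return M * (np.abs(M) > zeta)

def rank_r_projection(M, r):
    # Projection onto rank-r matrices: best rank-r approximation via
    # truncated SVD (Eckart-Young).
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

def altproj_robust_pca(M, r, n_iters=50, beta=None):
    # Simplified alternating-projections sketch for decomposing M ~ L + S:
    # alternate between thresholding the residual (sparse step) and a
    # truncated SVD of the residual (low-rank step), with a threshold that
    # decays over iterations as a denoising step. This is an illustrative
    # assumption, not the paper's exact stagewise schedule.
    m, n = M.shape
    beta = beta if beta is not None else 1.0 / np.sqrt(max(m, n))
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    for t in range(n_iters):
        sigma = np.linalg.svd(M - L, compute_uv=False)
        # Threshold tied to the residual spectrum, shrinking geometrically.
        zeta = beta * ((sigma[r] if len(sigma) > r else 0.0)
                       + sigma[0] * (0.5 ** t))
        S = hard_threshold(M - L, zeta)        # sparse projection
        L = rank_r_projection(M - S, r)        # low-rank projection
    return L, S
```

Each iteration costs one truncated SVD of an $m \times n$ matrix, which is where the $O(r m n)$ per-iteration factor in the running time comes from.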

Sample results on the task of foreground-background separation in the restaurant video dataset:

From left to right: Original video, non-convex low-rank, non-convex sparse

From left to right: Vanilla PCA, convex low-rank, convex sparse

Note that our non-convex method yields a higher-quality foreground-background separation for a given accuracy (for example, the artifacts visible in the convex method's output are absent in ours) and has much faster running times (refer to the paper for more details).