Speaker: Xin Liu, Chinese Academy of Sciences and CIMS
Location: Warren Weaver Hall 1302
Date: Oct. 28, 2016, 10 a.m.
Recently, identifying dominant eigenvalues or singular values of a sequence of closely related matrices has become an indispensable algorithmic component of many first-order optimization methods for convex optimization problems, such as semidefinite programming, low-rank matrix completion, robust principal component analysis, sparse inverse covariance matrix estimation, nearest correlation matrix estimation, and so on. More often than not, the computation of the dominant eigenspace forms a major bottleneck in the overall efficiency of the solution process. The well-known Krylov-subspace methods have several limitations, including a lack of scalability. Since the dominant eigenspace computation can be formulated as a special orthogonality-constrained optimization problem, we propose several optimization-based approaches that perform robustly and efficiently in a wide range of scenarios. Moreover, we study general orthogonality-constrained optimization problems and propose a new algorithmic framework in which calculations involving the Stiefel manifold or its tangent space are avoided entirely. Numerical results illustrate the great potential of the new algorithms based on this framework.
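The formulation mentioned above can be made concrete: computing the dominant k-dimensional eigenspace of a symmetric matrix A amounts to maximizing tr(XᵀAX) subject to XᵀX = I, i.e., over the Stiefel manifold. The sketch below solves this with classical orthogonal (subspace) iteration; it is a baseline illustration, not the speaker's proposed algorithms, and the function name and parameters are illustrative.

```python
import numpy as np

def dominant_eigenspace(A, k, iters=500, seed=0):
    """Approximate the dominant k-dimensional eigenspace of symmetric A.

    Classical orthogonal (subspace) iteration: repeatedly apply A to an
    orthonormal basis and re-orthonormalize via QR. The columns of the
    result converge to the invariant subspace of the k eigenvalues of
    largest magnitude, maximizing tr(X^T A X) subject to X^T X = I.
    """
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    # Random orthonormal starting basis
    X, _ = np.linalg.qr(rng.standard_normal((n, k)))
    for _ in range(iters):
        X, _ = np.linalg.qr(A @ X)  # apply A, then retract onto the Stiefel manifold
    return X

# Example: a positive semidefinite test matrix, so the dominant
# eigenvalues by magnitude coincide with the largest eigenvalues.
rng = np.random.default_rng(1)
M = rng.standard_normal((50, 10))
A = M @ M.T
X = dominant_eigenspace(A, k=3)
```

The QR re-orthonormalization here plays the role of a retraction; one theme of the talk is avoiding such Stiefel-manifold calculations altogether.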