Scientific Computation and Applied & Industrial Mathematics Seminar: Eran Treister

  • Date: 12/02/2014
  • Time: 12:30
Eran Treister, EOAS, University of British Columbia


Large-scale sparse inverse covariance estimation


The sparse inverse covariance estimation problem arises in many statistical applications in machine learning and signal processing. In this problem, the inverse of the covariance matrix of a multivariate normal distribution is estimated, under the assumption that it is sparse. Such matrices are typically approximated by solving an l1-regularized log-determinant optimization problem. Because of memory limitations, most existing algorithms are unable to handle large-scale instances of this problem.
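For concreteness, the l1-regularized log-determinant objective mentioned above can be written down directly. The following is a minimal NumPy sketch (the function name is ours, and for simplicity it penalizes all entries of the matrix, although in practice the diagonal is often left unpenalized):

```python
import numpy as np

def l1_logdet_objective(Theta, S, lam):
    """Objective of sparse inverse covariance estimation:
        f(Theta) = -log det(Theta) + trace(S @ Theta) + lam * ||Theta||_1
    where S is the empirical covariance and Theta the candidate
    (positive definite) inverse covariance estimate.
    Note: penalizing the diagonal is a simplification for this sketch."""
    sign, logdet = np.linalg.slogdet(Theta)
    assert sign > 0, "Theta must be positive definite"
    return -logdet + np.trace(S @ Theta) + lam * np.abs(Theta).sum()

# Example: with S = Theta = I (3x3) and lam = 0.1, the log-det term is 0,
# the trace term is 3, and the penalty is 0.3.
S = np.eye(3)
Theta = np.eye(3)
print(l1_logdet_objective(Theta, S, 0.1))  # ≈ 3.3
```

Minimizing this objective over positive definite Theta yields a sparse estimate of the inverse covariance; the l1 penalty drives small entries exactly to zero.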


In this talk we present two contributions. First, we present a new block-coordinate descent (BCD) approach for solving the problem for large-scale data sets. Our method treats the sought matrix block by block using quadratic approximations, and we show that this approach has advantages over existing methods in several respects. Next, we present an iterative multilevel framework for accelerating the solution of general convex optimization problems with sparsity-promoting l1 regularization. Taking advantage of the typical sparsity of the solution, we create a multilevel hierarchy of similar problems, which are traversed back and forth in order to accelerate the optimization process. We demonstrate this framework on the sparse inverse covariance estimation problem. Numerical experiments on both synthetic and real gene expression data sets demonstrate our BCD and multilevel approaches on medium- and large-scale instances of this problem.
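A standard ingredient behind coordinate-descent solvers for l1-regularized problems of this kind is the soft-thresholding (shrinkage) operator, which solves the one-dimensional l1-penalized subproblem in closed form. A minimal sketch of this generic building block (not the speaker's BCD algorithm itself):

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of lam * ||x||_1: shrinks each entry of x
    toward zero by lam, setting entries with |x_i| <= lam exactly
    to zero. This is what produces sparsity in coordinate-descent
    updates for l1-regularized objectives."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# Entries smaller in magnitude than lam are zeroed out exactly:
print(soft_threshold(np.array([3.0, -0.5, 1.0]), 1.0))  # [2. 0. 0.]
```

Because the operator returns exact zeros, iterates become sparse early on, which is precisely what a multilevel hierarchy restricted to the (small) active set can exploit.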


Joint work with Javier Turek and Irad Yavneh, Computer Science Department, Technion - Israel Institute of Technology.

Other Information: 

Location: ESB 4133