CORE Seminar: James Renegar

Date: 01/13/2015
Lecturer(s):
James Renegar, Cornell
Location: 

University of Washington

Topic: 

Extending the Applicability of Efficient First-Order Methods for Convex Optimization

Description: 

The study of first-order methods has largely dominated research in continuous optimization for the last decade. Even so, the range of problems for which efficient and easily applicable first-order methods are available remains surprisingly limited, despite notable successes in high-profile areas such as compressed sensing.


We present a simple transformation of any linear or semidefinite (or hyperbolic) program into an equivalent convex optimization problem whose only constraints are linear equations. The objective function is defined on the whole space, making virtually all subgradient methods immediately applicable. We observe, moreover, that the objective function is naturally smoothed, thereby allowing most first-order methods to be applied.
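To make the setting concrete, here is a minimal sketch of the kind of method that becomes applicable once the only constraints are linear equations and the objective is finite everywhere: a projected subgradient (here, supergradient) scheme that alternates a subgradient step with projection onto the affine set {x : Ax = b}. The concave objective used below, the minimum coordinate of x, is our own illustrative stand-in for the LP case, not the speaker's exact construction.

```python
import numpy as np

def affine_projector(A, b):
    """Return a function projecting any point onto {x : A x = b}.

    The pseudoinverse is formed once, so each projection is a matrix-vector
    multiply plus an offset.
    """
    A_pinv = np.linalg.pinv(A)
    x0 = A_pinv @ b                      # a particular solution of A x = b
    P = np.eye(A.shape[1]) - A_pinv @ A  # projector onto the nullspace of A
    return lambda x: x0 + P @ (x - x0)

def projected_supergradient_ascent(A, b, x_init, n_iters=500, step=1e-2):
    """Maximize f(x) = min_j x_j over {x : A x = b} by projected supergradient ascent.

    f is concave and finite everywhere, and a supergradient is just the indicator
    vector of a minimizing coordinate -- the kind of cheap oracle the abstract
    alludes to for the LP case (an illustrative choice, not the talk's construction).
    """
    project = affine_projector(A, b)
    x = project(x_init)
    best_x, best_val = x, np.min(x)
    for _ in range(n_iters):
        g = np.zeros_like(x)
        g[np.argmin(x)] = 1.0            # supergradient of min_j x_j
        x = project(x + step * g)        # ascent step, then back to the affine set
        if np.min(x) > best_val:
            best_x, best_val = x, np.min(x)
    return best_x, best_val

# Tiny usage example with random, consistent data.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 8))
b = A @ rng.random(8)                    # guarantees A x = b is solvable
x, val = projected_supergradient_ascent(A, b, x_init=np.zeros(8))
print(val)
```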


We develop complexity bounds in the unsmoothed case for a particular subgradient method, and in the smoothed case for Nesterov's original optimal first-order method for smooth functions. In both cases we obtain the desired bounds on the number of iterations.
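For the smoothed case, the following is a minimal sketch of Nesterov's accelerated scheme for smooth convex minimization, run over an affine set by composing each gradient step with a projection. The softmax-style smoothing of the negative minimum coordinate used as the objective is our illustrative assumption, not the smoothing discussed in the talk.

```python
import numpy as np

def smoothed_neg_min(x, mu):
    """Smooth convex surrogate for -min_j x_j: mu * log(sum_j exp(-x_j / mu)).

    Its gradient is (1/mu)-Lipschitz, so L = 1/mu below. This softmax-style
    smoothing is an illustrative assumption, not the talk's construction.
    """
    z = -x / mu
    z_max = z.max()
    w = np.exp(z - z_max)                # numerically stable softmax weights
    val = mu * (z_max + np.log(w.sum()))
    grad = -w / w.sum()
    return val, grad

def accelerated_gradient_affine(A, b, x_init, mu=0.1, n_iters=300):
    """Nesterov-style accelerated gradient method restricted to {x : A x = b}.

    Each gradient step is followed by a projection onto the affine set, which
    keeps the iterates feasible while retaining the O(1/k^2) rate for smooth
    convex objectives.
    """
    A_pinv = np.linalg.pinv(A)
    project = lambda v: v - A_pinv @ (A @ v - b)
    L = 1.0 / mu                          # Lipschitz constant of the smoothed gradient
    x = y = project(x_init)
    t = 1.0
    for _ in range(n_iters):
        _, g = smoothed_neg_min(y, mu)
        x_next = project(y - g / L)                        # gradient step + projection
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)   # momentum extrapolation
        x, t = x_next, t_next
    return x

# Tiny usage example with random, consistent data.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 8))
b = A @ rng.random(8)
x = accelerated_gradient_affine(A, b, x_init=np.zeros(8))
print(np.min(x))
```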


Perhaps most surprising is that both the transformation and the basic theory are simple, and yet the approach has been overlooked until now: a blind spot.