CORE Seminar: Michael Overton

  • Date: 10/13/2015
  • Time: 16:00
Lecturer(s):
Michael Overton, New York University
Location: 

University of Washington

Topic: 

Nonsmooth, Nonconvex Optimization: Algorithms and Examples

Description: 

In many applications one wishes to minimize an objective function that is not convex and is not differentiable at its minimizers.

We discuss two algorithms for minimization of nonsmooth, nonconvex functions. Gradient Sampling is a simple method that, although computationally intensive, has a nice convergence theory. The method is robust, and the convergence theory has recently been extended to constrained problems. BFGS is a well-known method, developed for smooth problems, but it is remarkably effective for nonsmooth problems too. Although our theoretical results in the nonsmooth case are quite limited, we have made some remarkable empirical observations and have had broad success in applications. Limited Memory BFGS is a popular extension for large problems, and it is also applicable to the nonsmooth case, although our experience with it is more mixed.
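To make the Gradient Sampling idea concrete, here is a minimal sketch in Python (NumPy). It is not the speaker's implementation: the names gradient_sampling and _min_norm_hull, the Frank-Wolfe solver used for the minimum-norm subproblem, and every parameter default are illustrative assumptions. The outline follows the method as described above: sample gradients at random points near the current iterate, take the negative of the minimum-norm element of the convex hull of those gradients as a search direction, backtrack until sufficient decrease holds, and shrink the sampling radius when the minimum norm is small.

import numpy as np

def _min_norm_hull(G, iters=200):
    # Minimum-norm point in the convex hull of the rows of G, computed
    # with plain Frank-Wolfe on the probability simplex (an illustrative
    # choice; any small QP solver would do).
    m = G.shape[0]
    lam = np.full(m, 1.0 / m)
    for k in range(iters):
        g = G.T @ lam                  # current point in the hull
        i = np.argmin(G @ g)           # vertex minimizing the linearization
        step = 2.0 / (k + 2.0)
        lam *= 1.0 - step
        lam[i] += step
    return G.T @ lam

def gradient_sampling(f, grad, x0, eps=0.1, m=None, max_iter=200,
                      tol=1e-6, beta=1e-4, seed=0):
    # Minimal Gradient Sampling sketch; all defaults are illustrative.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = x.size
    m = m if m is not None else 2 * n  # sampled gradients per iteration
    for _ in range(max_iter):
        # Gradients at x and at m random points within distance ~eps;
        # the random points avoid the nondifferentiable set w.p. 1.
        pts = x + eps * rng.standard_normal((m, n))
        G = np.vstack([grad(x)] + [grad(p) for p in pts])
        g = _min_norm_hull(G)
        if np.linalg.norm(g) <= tol:
            eps *= 0.5                 # near-stationary: shrink radius
            continue
        d = -g
        # Backtracking (Armijo) line search along the sampled direction.
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx - beta * t * (g @ g) and t > 1e-12:
            t *= 0.5
        x = x + t * d
    return x

if __name__ == "__main__":
    # A simple nonsmooth test objective, nondifferentiable at its
    # minimizer (the origin), as in the abstract's setting.
    f = lambda x: abs(x[0]) + 2.0 * abs(x[1])
    grad = lambda x: np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])
    print(gradient_sampling(f, grad, x0=[1.3, -0.7]))

The test objective is chosen so that, as in the abstract, it is nondifferentiable exactly at its minimizer; the randomly sampled points avoid the nondifferentiable set with probability one, which is the assumption the method relies on.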

Throughout the talk we illustrate the ideas through examples, some very easy and some very challenging. This is joint work with Jim Burke (U. Washington) and Adrian Lewis (Cornell).

Other Information: 

Location: Raitt Hall (RAI), Room 121