Nonsmooth, Nonconvex Optimization
Date: 10/15/2007
Michael Overton (New York University)
Simon Fraser University
There are many algorithms for minimization when the objective function
is differentiable or convex, but few options when it is neither. We
describe two simple algorithmic approaches for minimization of
nonsmooth, nonconvex objectives: BFGS (a new look at an old method),
and Gradient Sampling (a method that, although computationally
intensive, has a nice convergence theory, which we will describe). Both
methods require the user to provide a routine that computes the
function value and, if it exists, the gradient at a given point, the
idea being that the function is virtually never evaluated at a point
where it is not differentiable, even though it is typically not
differentiable at local optimizers. These methods are combined in a new
publicly available code, HANSO (Hybrid Algorithm for NonSmooth
Optimization).
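
To illustrate the flavor of the second approach, below is a minimal Python sketch of one simplified Gradient Sampling iteration, not the HANSO implementation: the objective f_grad, the sampling radius eps, and the sample count m are all made-up illustrative choices, and a full implementation would also shrink eps over the course of the run. Note how the user-supplied routine returns the function value and gradient, and how, with probability one, the random sampling never lands exactly on a point of nondifferentiability.

import numpy as np
from scipy.optimize import minimize

def f_grad(x):
    """Illustrative nonsmooth, nonconvex objective and its gradient.
    f is nondifferentiable on the parabola x[0]**2 == x[1], which
    passes through the minimizer at the origin."""
    s = np.sign(x[0] ** 2 - x[1])  # only queried off the kink (prob. 1)
    f = abs(x[0] ** 2 - x[1]) + 0.1 * (x[0] ** 2 + x[1] ** 2)
    g = np.array([2 * s * x[0] + 0.2 * x[0], -s + 0.2 * x[1]])
    return f, g

def gradient_sampling_step(x, eps=0.1, m=10, rng=np.random.default_rng(0)):
    """One simplified Gradient Sampling iteration at x."""
    n = x.size
    pts = [x]  # gradients sampled at x and at m points in the eps-ball
    for _ in range(m):
        u = rng.standard_normal(n)
        r = rng.uniform() ** (1.0 / n)          # uniform radius in the ball
        pts.append(x + eps * r * u / np.linalg.norm(u))
    G = np.array([f_grad(p)[1] for p in pts])   # (m+1) x n stacked gradients
    # Minimum-norm element of the convex hull of the sampled gradients:
    #   min_w  ||G^T w||^2   s.t.   w >= 0,  sum(w) = 1   (a small QP)
    k, H = G.shape[0], G @ G.T
    res = minimize(lambda w: w @ H @ w, np.full(k, 1.0 / k),
                   jac=lambda w: 2 * H @ w,
                   bounds=[(0.0, None)] * k,
                   constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0},
                   method="SLSQP")
    g = G.T @ res.x                       # approximate shortest (sub)gradient
    if np.linalg.norm(g) < 1e-8:          # approximate eps-stationarity
        return x, True                    # (full GS would now shrink eps)
    d = -g / np.linalg.norm(g)
    f0, t = f_grad(x)[0], 1.0             # crude backtracking line search
    while f_grad(x + t * d)[0] >= f0 and t > 1e-12:
        t *= 0.5
    return x + t * d, False

x = np.array([2.0, 1.0])
for _ in range(200):
    x, done = gradient_sampling_step(x)
    if done:
        break
print(x, f_grad(x)[0])  # iterates approach the nonsmooth minimizer at the origin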
Applications abound in engineering, particularly control. Of particular
interest to me are those involving eigenvalues and singular values of
nonsymmetric matrices. Sometimes even easily stated problems in a few
variables are hard. Another publicly available code, HIFOO (H-Infinity
Fixed-Order Optimization), is intended for use by practising control
engineers and has solved some open theoretical and practical problems
in control.
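
For a concrete flavor of such eigenvalue objectives, the short sketch below evaluates the spectral abscissa (the largest real part of the eigenvalues) of a nonsymmetric matrix depending affinely on the variables; the matrix family here is made up for illustration, not a problem from HIFOO. Driving the spectral abscissa below zero stabilizes the corresponding linear dynamical system, which is one motivation for such objectives in fixed-order controller design.

import numpy as np

def spectral_abscissa(x, A0, A1, A2):
    """Largest real part of the eigenvalues of the nonsymmetric matrix
    A(x) = A0 + x[0]*A1 + x[1]*A2.  As a function of x this is nonsmooth
    and nonconvex, and typically nondifferentiable where eigenvalues
    with maximal real part coalesce."""
    A = A0 + x[0] * A1 + x[1] * A2
    return np.max(np.linalg.eigvals(A).real)

rng = np.random.default_rng(1)
A0, A1, A2 = rng.standard_normal((3, 4, 4))  # made-up 4x4 problem data
print(spectral_abscissa(np.array([0.5, -0.3]), A0, A1, A2))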
Part of the IAM Seminar Series