## Scientific Computing and Applied & Industrial Mathematics Seminar: Julie Nutini

- Date: 04/05/2016

University of British Columbia

Is Greedy Coordinate Descent a Terrible Algorithm?

There has been significant recent work on the theory and application of randomized coordinate descent algorithms, beginning with the work of Nesterov, who showed that a random-coordinate selection rule achieves the same convergence rate as the Gauss-Southwell selection rule. This result suggests that we should never use the Gauss-Southwell rule, as it is typically much more expensive than random selection. However, the empirical behaviour of these algorithms contradicts this theoretical result: in applications where the computational costs of the selection rules are comparable, the Gauss-Southwell selection rule tends to perform substantially better than random coordinate selection. We give a simple analysis of the Gauss-Southwell rule showing that, except in extreme cases, its convergence rate is faster than choosing random coordinates. Further, we (i) show that exact coordinate optimization improves the convergence rate for certain sparse problems, (ii) propose a Gauss-Southwell-Lipschitz rule that gives an even faster convergence rate given knowledge of the Lipschitz constants of the partial derivatives, and (iii) analyze proximal-gradient variants of the Gauss-Southwell rule.
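To make the two selection rules concrete, here is a minimal sketch of coordinate descent on a small convex quadratic, comparing random coordinate selection with the Gauss-Southwell (greedy) rule, which updates the coordinate with the largest partial-derivative magnitude. The quadratic below is a hypothetical example for illustration, not a problem from the talk, and the coordinate update uses exact minimization along each coordinate (valid for quadratics):

```python
import random

def grad(A, b, x):
    """Gradient of f(x) = 0.5 * x^T A x - b^T x, i.e. A x - b."""
    n = len(x)
    return [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(n)]

def coord_descent(A, b, rule, iters=200, seed=0):
    """Coordinate descent with either random or Gauss-Southwell selection."""
    rng = random.Random(seed)
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        g = grad(A, b, x)
        if rule == "gauss-southwell":
            # Greedy: pick the coordinate with the largest |partial derivative|.
            i = max(range(n), key=lambda j: abs(g[j]))
        else:
            # Random: pick a coordinate uniformly at random.
            i = rng.randrange(n)
        # Exact coordinate optimization for a quadratic: step g_i / A_ii.
        x[i] -= g[i] / A[i][i]
    return x

# Hypothetical positive-definite test problem.
A = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]
b = [1.0, 2.0, 3.0]

x_gs = coord_descent(A, b, "gauss-southwell")
x_rand = coord_descent(A, b, "random")
```

Both rules converge on this small problem; the talk's point is that, per iteration, the greedy rule typically makes faster progress than the random rule whenever its selection cost is comparable.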

Joint work with Mark Schmidt, Issam Laradji, Michael Friedlander and Hoyt Koepke.

Location: ESB 4133 (PIMS Lounge)