Mark Schmidt
University of British Columbia
SCAIM Seminar: Mark Schmidt
We propose the stochastic average gradient (SAG) method for optimizing the sum of a finite number of smooth convex functions. Like stochastic gradient (SG) methods, the SAG method's iteration cost is independent of the number of terms in the sum...
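To illustrate the idea behind the abstract, here is a minimal sketch of a SAG-style update: a stored gradient per term is kept in memory, and each iteration refreshes one of them and steps along the running average. All names, step-size choices, and the least-squares demo are illustrative assumptions, not details from the talk.

```python
import numpy as np

def sag(grad_i, n, x0, step, iters, rng=None):
    """SAG-style loop (sketch). grad_i(i, x) returns the gradient of the
    i-th term f_i at x; names here are illustrative, not from the talk."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    mem = np.zeros((n,) + x.shape)   # last stored gradient for each term
    d = np.zeros_like(x)             # running sum of the stored gradients
    for _ in range(iters):
        i = rng.integers(n)          # sample one term, like SG methods
        g = grad_i(i, x)             # cost per iteration is independent of n
        d += g - mem[i]              # swap the old gradient out of the sum
        mem[i] = g
        x -= step * d / n            # step along the average of stored gradients
    return x

# Hypothetical demo: least squares, f_i(x) = 0.5 * (A[i] @ x - b[i])**2
rng = np.random.default_rng(42)
A = rng.standard_normal((50, 3))
b = rng.standard_normal(50)
grad = lambda i, x: (A[i] @ x - b[i]) * A[i]
L = np.max(np.sum(A * A, axis=1))          # per-term gradient Lipschitz constant
x_sag = sag(grad, len(b), np.zeros(3), step=1.0 / (2 * L), iters=5000, rng=rng)
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
```

On this small problem `x_sag` should approach the least-squares solution `x_star`; the point of the sketch is that each iteration touches only one term's gradient, yet the update uses information from all terms.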