Speaker: Stephen J. Wright
Location: Warren Weaver Hall 1302
Date: Dec. 7, 2015, 3:45 p.m.
There is renewed interest in elementary methods for optimization, sparked in large part by applications in data analysis. Prominent among these methods is coordinate descent, in which functions are minimized by searching successively along coordinate directions or hyperplanes. Several practical extensions of this approach have been proposed, including efficient implementations of Nesterov acceleration, randomized versions, asynchronous parallel execution, and extensions to nonsmooth objectives. New theory has improved our understanding of the convergence behavior of these methods, though numerous aspects remain unexplained. This talk will survey these recent developments, outline problem classes in which the approach is useful, and describe several open questions.
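The coordinate-search idea described above can be sketched in a few lines. The following is a minimal illustration (not the speaker's material) of randomized coordinate descent on a convex quadratic f(x) = ½xᵀAx − bᵀx with A symmetric positive definite; the function, matrix, and iteration count are all illustrative assumptions. Each step picks one coordinate at random and minimizes f exactly along that direction.

```python
import numpy as np

def coordinate_descent(A, b, iters=2000, seed=0):
    """Randomized coordinate descent for f(x) = 0.5 x^T A x - b^T x,
    with A symmetric positive definite. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    n = len(b)
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.integers(n)          # pick one coordinate at random
        # Partial derivative along coordinate i: (A x - b)[i];
        # the curvature along that coordinate is A[i, i], so this
        # update is the exact one-dimensional minimizer.
        g_i = A[i] @ x - b[i]
        x[i] -= g_i / A[i, i]
    return x

# Usage: on a small SPD system the iterates approach the solution A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = coordinate_descent(A, b)
print(np.allclose(x, np.linalg.solve(A, b)))
```

Variants mentioned in the abstract modify the coordinate-selection rule (cyclic vs. randomized), add Nesterov-style acceleration, or run the per-coordinate updates asynchronously in parallel.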