Speaker: Andreas Griewank, Yachay Tech, Ecuador and Humboldt University, Berlin
Location: Warren Weaver Hall 101
Date: Dec. 15, 2015, 2 p.m.
The fact that even minimizing a convex, and thus Lipschitz continuous, objective can be NP-hard suggests that we need some structure to deal effectively with nonsmooth functions. Coming from algorithmic/automatic differentiation (AD), we assume that the nonsmooth problem functions of interest are defined as compositions of smooth elementary functions and the absolute value function abs(x) = |x|, or equivalently max(x, y) and min(x, y). Rather than computing derivative vectors or matrices as in conventional AD, we can then automatically generate piecewise linear approximations that represent generalized local Taylor expansions with second-order truncation error. On the basis of these models we formulate constructive necessary and sufficient conditions for local optimality. Moreover, the piecewise linear local models, appended with a suitable proximal term, can be used to compute downhill steps that may converge superlinearly to nondegenerate local minima. We provide numerical examples from standard test sets and machine learning applications.
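The propagation idea behind the abstract can be sketched in a few lines: alongside each intermediate value we carry a piecewise linear increment model, using ordinary linearization for smooth operations and the exact rule Δ|u| = |u + Δu| − |u| at the abs kernel, with max(x, y) = (x + y + |x − y|)/2. This is a minimal illustrative sketch, not the speaker's implementation; the class and function names below are invented for the example. The final loop checks numerically that the truncation error shrinks like the square of the increment, even when a kink of the test function is crossed.

```python
# Hedged sketch of piecewise linearization (names are illustrative only).
# Each quantity is a pair (val, delta): the value at the base point and a
# piecewise linear model of the increment as a function of dx.

class PL:
    def __init__(self, val, delta):
        self.val, self.delta = val, delta

def const(c):
    return PL(c, 0.0)

def add(a, b):
    return PL(a.val + b.val, a.delta + b.delta)

def sub(a, b):
    return PL(a.val - b.val, a.delta - b.delta)

def mul(a, b):
    # Smooth elemental: linearize by the product rule.
    return PL(a.val * b.val, a.val * b.delta + b.val * a.delta)

def absv(a):
    # Nonsmooth kernel: keep the kink exactly in the increment model.
    return PL(abs(a.val), abs(a.val + a.delta) - abs(a.val))

def maxv(a, b):
    # max(x, y) = (x + y + |x - y|) / 2
    s, d = add(a, b), absv(sub(a, b))
    return PL(0.5 * (s.val + d.val), 0.5 * (s.delta + d.delta))

def f(x):
    # Example objective f(x) = max(x*x - 1, 0.5*x), written with the kernels.
    return maxv(sub(mul(x, x), const(1.0)), mul(const(0.5), x))

def f_plain(x):
    return max(x * x - 1.0, 0.5 * x)

def model(x0, dx):
    # Evaluate the piecewise linear local model at base point x0, increment dx.
    out = f(PL(x0, dx))
    return out.val + out.delta

# Second-order truncation error: halving dx shrinks the error by about 4x,
# even though the increments below step across the kink of f near x = 1.28.
for dx in (0.4, 0.2, 0.1):
    print(dx, abs(f_plain(1.2 + dx) - model(1.2, dx)))
```

The key point of the sketch is that the abs rule is not a linearization: the kink survives in the increment model, which is why the approximation stays second-order accurate across switching surfaces where a plain tangent model would only be first-order.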