Convex optimization problems have many important properties, including a powerful duality theory and the guarantee that any local minimum is also a global minimum. Nonsmooth optimization refers to the minimization of functions that are generally not convex, usually locally Lipschitz, and typically not differentiable at their minimizers. Topics in convex optimization that will be covered include duality, self-concordance and global Newton methods, primal-dual interior-point methods for conic programs (including linear programs, quadratic cone programs, and semidefinite programs), and the use of CVX to solve convex programs in practice. Topics in nonsmooth optimization that will be covered include variational analysis, subgradients and subdifferentials, Clarke regularity, and algorithms for nonsmooth, nonconvex optimization, including gradient sampling and BFGS. Both mathematical and computational homework will be assigned. Students may either submit a final project or take an oral final exam.
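As a small taste of the nonsmooth topics above, here is a minimal sketch of the subgradient method, which replaces the gradient with any element of the subdifferential. This is illustrative only, not course material: the function f(x) = |x - 3|, the step sizes, and the name `subgradient_descent` are all assumptions made for this example. Note that f is convex but not differentiable at its minimizer x = 3, exactly the situation described above.

```python
# Subgradient-method sketch for f(x) = |x - 3|, a convex function
# that is not differentiable at its minimizer x = 3.
# Illustrative example only; not taken from the course notes.

def subgradient_descent(x0, iters=1000):
    x = x0
    for k in range(1, iters + 1):
        # Pick any element of the subdifferential of |x - 3|:
        # sign(x - 3) away from the kink; 0 is a valid choice at it.
        g = 0.0 if x == 3 else (1.0 if x > 3 else -1.0)
        x -= g / k  # diminishing step size t_k = 1/k
    return x

print(subgradient_descent(0.0))  # approaches the minimizer x = 3
```

The diminishing step sizes matter: with a fixed step size, the iterates keep jumping back and forth across the kink at x = 3 with constant amplitude, while steps t_k = 1/k shrink the oscillation and sum to infinity, so the iterates can still travel any finite distance to the minimizer.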
Prerequisites: undergraduate linear algebra and multivariable calculus
Required textbook for the first half of the course