Convex and Nonsmooth Optimization

G22.2945.002/ G63.2011.002 (Selected Topics in Numerical Analysis)

Mondays 5:10-7:00, WWH 1013

Instructor: Michael L. Overton

Course Description

Convex optimization problems have many important properties, including a powerful duality theory and the property that any local minimum is also a global minimum. Nonsmooth optimization refers to the more general problem of minimizing functions that are typically not differentiable at their minimizers. This course discusses a variety of such problems, including semidefinite programming, quadratic cone programming, eigenvalue optimization, and optimization of functions of the roots of polynomials, as well as numerical methods for solving them. Theoretical concepts to be discussed include self-concordance, subdifferential analysis and Clarke regularity. Numerical methods to be discussed include global Newton methods and primal-dual interior-point methods.
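The simplest example of the phenomenon described above is f(x) = |x|: it is convex, and its unique minimizer x = 0 is exactly the point where it fails to be differentiable. A minimal Python sketch of the subgradient method on this function (the diminishing step sizes 1/k are a standard textbook choice, not something specified here):

```python
# Subgradient method on f(x) = |x|, a convex function whose
# minimizer x = 0 is its only point of nondifferentiability.

def subgradient(x):
    # sign(x) is the gradient away from 0; at x = 0 any value
    # in [-1, 1] is a valid subgradient, and we pick 0.
    return 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)

x = 5.0
for k in range(1, 200):
    x -= (1.0 / k) * subgradient(x)  # diminishing step sizes 1/k

print(abs(x))  # the iterates approach the minimizer at 0
```

Note that ordinary gradient descent with a fixed step size would oscillate around 0 forever; the diminishing steps are what make the iterates converge, which is one reason nonsmooth problems need their own algorithms.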


Prerequisites

Undergraduate linear algebra and multivariable calculus

Textbook for First Half of Course

Other Recommended Books (not required for the course)


Lecture Schedule
  1. Jan 26: Overview of convex and nonsmooth optimization, intro to convex sets
  2. Feb 2: Convex sets (Ch 2, BV) and convex functions (Ch 3, BV)
  3. Feb 9: Convex functions, continued (Ch 3, BV) (no lecture Feb 16)
  4. Feb 23: Duality (Ch 5, BV)
  5. Mar 2: Strong Duality (Ch 5, BV)
  6. Mar 9: Gradient and Newton Methods (Ch 9, BV) (no lecture Mar 16)
  7. Mar 23: ***CANDES LECTURE 4-5, class starts 5:45 pm*** Newton Method for Self-Concordant Functions (Ch 9, BV)
  8. Mar 30: The Primal Barrier Method (Ch 11, BV)
  9. Apr 6: Primal-Dual Methods for Linear and Semidefinite Programming (no lecture Apr 13)
  10. Apr 20: Subgradients and subdifferentials: convex and nonconvex functions
  11. Apr 27: Regularity, Clarke subgradients, sharp minimizers, variational analysis of spectral functions
  12. May 4: Algorithms for smooth nonconvex optimization: Newton's method, BFGS, conjugate gradient
  13. May 11: Algorithms for nonsmooth nonconvex optimization: BFGS, bundle methods, gradient sampling


Special Lectures Mar 23,24,25

Emmanuel Candes will give the prestigious Courant Lectures on convex optimization, on Mon Mar 23 at 4 p.m. (WWH 109) and Tue Mar 24 at 11:30 a.m. (WWH 102). In addition Ben Recht, a faculty candidate in computer science, will give a lecture on his work on solving challenging data analysis problems using convex optimization on Wed Mar 25 at 11:30 (WWH 1302). Please come to all these lectures if possible. On Mar 23 I will start class at 5:45 p.m. so that we can take a break after Candes' talk at the reception on the 13th floor.

Requirements

Attend class and submit all homeworks. There is no final exam, although you may request an oral exam if you feel your homeworks do not fairly reflect what you learned in the class.

Class Mailing List
Important: you must join the class mailing list, whether or not you are taking the class for credit. Joining takes two steps: first, follow the instructions on the web page (including picking a password); second, REPLY TO the confirmation message the system sends you. This list will be used for important announcements. You may also send questions or comments to the list yourself (contact me if you have questions about when this is appropriate). If you prefer not to use an NYU email address, notify me of your preferred address in person or by email from an NYU address, so that I can exempt it from my spam filter.