Optimal Parallel Optimization Algorithms for Machine Learning
Speaker: Blake Woodworth, Inria
Location: 60 Fifth Avenue, Room 150
Date: April 4, 2022, 2 p.m.
Host: Joan Bruna
A key factor in the recent success of machine learning has been the use
of very large models and huge quantities of data. This scale raises
computational challenges and makes distributed optimization algorithms
essential for training. But which parallel algorithms should we use? In
this talk, I will identify optimal
distributed optimization algorithms in several natural settings, and I
will propose directions that may allow us to find new, "better than
optimal" methods. I will also highlight some of my other research
interests at the intersection of optimization and machine learning,
including efforts to understand how highly overparametrized models like
deep neural networks manage to generalize so well.
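To make the abstract's subject concrete: a standard baseline in distributed optimization is parallel (minibatch) SGD, where each worker computes a gradient on its local data shard and a server averages them once per step. The sketch below is a generic, simulated illustration of that pattern on a least-squares problem, not a method from the talk itself; the data, worker count, and step size are all illustrative assumptions.

```python
import numpy as np

def worker_gradient(w, X, y):
    # Gradient of the local least-squares loss 0.5 * ||X w - y||^2 / n
    return X.T @ (X @ w - y) / len(y)

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])  # hypothetical ground-truth parameters

# Synthetic data split across 4 simulated workers (one shard each)
shards = []
for _ in range(4):
    X = rng.normal(size=(50, 2))
    y = X @ w_true + 0.01 * rng.normal(size=50)
    shards.append((X, y))

w = np.zeros(2)
lr = 0.1
for step in range(200):
    # Each worker computes a local gradient; the "server" averages them.
    # This costs one communication round per optimization step.
    grads = [worker_gradient(w, X, y) for X, y in shards]
    w -= lr * np.mean(grads, axis=0)

print(w)  # approaches w_true
```

Much of the research the abstract alludes to concerns exactly this trade-off: averaging gradients every step is communication-heavy, while alternatives such as local-update methods communicate less often, and the question of which is optimal depends on the setting.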
Blake Woodworth is currently a postdoctoral researcher at Inria, working
with Francis Bach. Until September 2021, he was a Ph.D. student in
computer science at TTIC advised by Nathan Srebro. He is interested in
optimization and learning theory with the goal of developing algorithms
that make machine learning easier and more successful. Specific topics
of interest include distributed optimization, non-convex optimization,
implicit regularization, fairness in machine learning, and adaptive data
analysis. His research has been recognized with the Best Student Paper
award at COLT 2019 and the Best Paper award at COLT 2021, and his
graduate studies were supported by an NSF GRFP award and a Google
Research Ph.D. Fellowship.
In-person attendance only available to those with active NYU ID cards.