Colloquium Details
Collaborative, Communal, & Continual Machine Learning
Speaker: Colin Raffel
Location: 60 Fifth Avenue Room C15
Date: March 16, 2023, 2 p.m.
Host: Kyunghyun Cho
Synopsis:
Pre-trained models have become a cornerstone of machine learning because they provide improved performance with less labeled data on downstream tasks. However, these models are typically created by resource-rich research groups that unilaterally decide how a given model should be built, trained, and released, after which it is never updated. In contrast, open-source development has demonstrated that a community of contributors can work together to iteratively build complex and widely used software. This kind of large-scale distributed collaboration is made possible by a mature set of tools, including version control and package management. In this talk, I will discuss a research focus in my group that aims to make it possible to build machine learning models the way open-source software is developed. Specifically, I will discuss our preliminary work on merging multiple models while retaining their individual capabilities, patching models with cheaply communicable updates, designing modular model architectures, and tracking changes through a version control system for model parameters. I will conclude with an outlook on how the field will change once truly collaborative, communal, and continual machine learning is possible.
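As background for the "merging" thread mentioned in the synopsis, the sketch below illustrates one simple form of model merging: weighted parameter averaging across models that share an architecture. This is a minimal illustration under that assumption, not code from the speaker's work; the function and toy models are hypothetical.

```python
# Minimal sketch: merge models that share an architecture by averaging
# their parameters. Illustrative only; not the method from the talk.
import torch
import torch.nn as nn


def merge_state_dicts(state_dicts, weights=None):
    """Return a weighted average of matching parameter tensors.

    state_dicts: state dicts from models with identical architecture.
    weights: optional per-model mixing weights (defaults to uniform).
    """
    if weights is None:
        weights = [1.0 / len(state_dicts)] * len(state_dicts)
    merged = {}
    for key in state_dicts[0]:
        # Combine the corresponding tensor from each model.
        merged[key] = sum(w * sd[key] for w, sd in zip(weights, state_dicts))
    return merged


if __name__ == "__main__":
    # Two toy "fine-tuned" models (here just randomly initialized).
    model_a = nn.Linear(4, 2)
    model_b = nn.Linear(4, 2)

    merged = merge_state_dicts([model_a.state_dict(), model_b.state_dict()])
    model_merged = nn.Linear(4, 2)
    model_merged.load_state_dict(merged)
    print(model_merged.weight)
```

In practice, published merging methods go beyond uniform averaging (for example, weighting parameters by estimates of their importance), but the basic operation of combining per-parameter values across models is the same.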
Speaker Bio:
Colin Raffel is an Assistant Professor at UNC Chapel Hill and a Faculty Researcher at Hugging Face. His work aims to make it easy to get computers to do new things. Consequently, he works mainly on machine learning (enabling computers to learn from examples) and natural language processing (enabling computers to communicate in natural language). He received his Ph.D. from Columbia University in 2016 and spent five years as a research scientist at Google Brain.
Notes:
In-person attendance is available only to those with active NYU ID cards.