Colloquium Details
Deep Learning and Language Structure
Speaker: Yoon Kim, Harvard University
Location: 60 Fifth Avenue 150
Date: February 7, 2020, 11 a.m.
Host: Michael Overton
Synopsis:
Natural language has inherent structure. Words compose with
one another into hierarchical structures that convey meaning. These
compositional structures are ubiquitous at all levels of language.
Despite the recent, enormous success of deep neural networks in NLP,
capturing such discrete, combinatorial structure remains challenging. In
this talk, I will present two directions towards an integration of deep
learning and language structure. First, we will see how language
structure can be used as a rich source of prior knowledge to improve
language modeling and representation learning. Second, we will explore
how advances in model parameterization and inference, in particular deep
learning, can be used as a computational tool to discover linguistic
structure from raw text.
Speaker Bio:
Yoon Kim is a fifth-year PhD student at Harvard University, advised
by Alexander Rush. His research is at the intersection of natural
language processing and machine learning. He is the recipient of a
Google AI PhD Fellowship.
Notes:
In-person attendance is only available to those with active NYU ID cards.