Representations in Robot Manipulation: Learning to Manipulate Cables, Fabrics, Bags, Liquids, and Plants
Speaker: Daniel Seita
Location: 60 Fifth Avenue Room 150
Date: March 13, 2023, 2 p.m.
Host: Lerrel Pinto
The robotics community has seen significant progress in applying machine learning to robot manipulation. However, much manipulation research focuses on rigid objects rather than highly deformable objects such as cables, fabrics, bags, liquids, and plants, which pose challenges due to their complex configuration spaces, dynamics, and self-occlusions. To achieve greater progress in robot manipulation of such diverse deformable objects, I advocate an increased focus on learning and developing appropriate representations for robot manipulation. In this talk, I will show how novel action-centric representations can lead to better imitation learning for manipulating diverse deformable objects, and how such representations can be learned from color images, depth images, or point clouds. My research demonstrates how novel representations can lead to an exciting new era for robot manipulation of complex objects.
Daniel Seita is a postdoctoral researcher at Carnegie Mellon University's Robotics Institute, advised by David Held. His research interests are in computer vision and machine learning for robot manipulation, with a focus on using and developing novel observation and action representations to improve manipulation of challenging deformable objects. Daniel holds a PhD in computer science from the University of California, Berkeley, advised by John Canny and Ken Goldberg. He received undergraduate degrees in math and computer science from Williams College. Daniel's research has been supported by a six-year Graduate Fellowship for STEM Diversity and by a two-year Berkeley Fellowship. He received the Honorable Mention for Best Paper award at UAI 2017, was an RSS 2022 Pioneer, and has presented his work at premier robotics conferences such as ICRA, IROS, RSS, and CoRL.
In-person attendance only available to those with active NYU ID cards.