Rethinking the relationship between data and robotics
Speaker: Lerrel Pinto, Carnegie Mellon University
Location: 60 Fifth Avenue 150
Date: February 27, 2019, 2 p.m.
Host: Subhash Khot
While robotics has made tremendous progress over the last few decades, most success stories are still limited to carefully engineered and precisely modeled environments. Interestingly, one of the most significant successes of the last decade of AI has been the use of Machine Learning (ML) to generalize and robustly handle diverse situations. So why don't we just apply current learning algorithms to robots? The biggest reason is the complicated relationship between data and robotics. In other fields of AI, such as computer vision, we were able to collect real-world data that was large-scale, diverse, and richly supervised. These three key ingredients, which fueled the success of deep learning in other fields, are the key bottlenecks in robotics: we do not have millions of training examples for robots; it is unclear how to supervise robots; and, most importantly, most simulation and lab data is neither real-world nor diverse. My research has focused on rethinking the relationship between data and robotics to fuel the success of robot learning. Specifically, in this talk, I will discuss three aspects of data that will bring us closer to generalizable robotics: (a) the size of the data we can collect, (b) the amount of supervisory signal we can extract, and (c) the diversity of the data we can get from robots.
Lerrel Pinto is a PhD candidate at The Robotics Institute at Carnegie Mellon University. His research interests focus on machine learning and computer vision for robots. He received an MS degree from CMU in 2016, and prior to that a B.Tech in Mechanical Engineering from IIT Guwahati. His work on large-scale learning for grasping received the Best Student Paper award at ICRA 2016. Several of his works have been featured in popular media outlets such as TechCrunch, MIT Technology Review, and BuzzFeed, among others.
Refreshments will be offered starting 15 minutes prior to the scheduled start of the talk.