The AI Podcast

Teaching Bots to Learn by Watching Human Behavior - Ep. 67

08.22.2018 - By NVIDIA

Robots following coded instructions to complete a task? Old school. Robots learning to do things by watching how humans do them? That’s the future. Earlier this year, Stanford’s Animesh Garg and Marynel Vázquez shared their research in a talk on “Generalizable Autonomy for Robotic Mobility and Manipulation” at the GPU Technology Conference. We caught up with them to learn more about generalizable autonomy: the idea that a robot should be able to observe human behavior and learn to imitate it in a way that’s applicable to a variety of tasks and situations. Like learning to cook by watching YouTube videos, for one, or figuring out how to cross a crowded room, for another.