
Teaching Bots Learn by Watching Human Behavior - Ep. 67

Aug 22, 2018 · 37 min
Listen in podcast apps:
Metacast
Spotify
YouTube
RSS

Episode description

Robots following coded instructions to complete a task? Old school. Robots learning to do things by watching how humans do them? That’s the future. Earlier this year, Stanford’s Animesh Garg and Marynel Vázquez shared their research in a talk on “Generalizable Autonomy for Robotic Mobility and Manipulation” at the GPU Technology Conference. We caught up with them to learn more about generalizable autonomy - the idea that a robot should be able to observe human behavior and learn to imitate it in a way that applies across a variety of tasks and situations. Think learning to cook by watching YouTube videos, or figuring out how to cross a crowded room.
An episode of the NVIDIA AI Podcast.