
Stanford’s Karen Liu is a computer scientist who works in robotics. She hopes that someday machines might take on caregiving roles, like helping medical patients get dressed and undressed each day. That quest has given her special insight into just how monumental a challenge such seemingly simple tasks pose. After all, she points out, it takes a human child several years to learn to dress themselves — imagine what it takes to teach a robot to help a person who is frail or physically compromised.
Liu is among a growing coterie of scientists using “physics-based simulations” to speed up the learning process for robots. Rather than building physical robots and refining them through trial and error, she uses computer simulations to improve how robots sense the physical world around them and make intelligent decisions amid real-world changes and perturbations, such as those involved in getting dressed for the day.
To do that, a robot must understand the physical characteristics of human flesh and bone, as well as a person’s movements and underlying intentions, so it can recognize when a garment is or is not going on as expected.
The stakes are high: a misstep could physically harm the patient, as Liu tells Stanford Engineering’s The Future of Everything podcast, hosted by bioengineer Russ Altman. Listen and subscribe here.
Connect With Us:
Episode Transcripts >>> The Future of Everything Website
Connect with Russ >>> Threads / Bluesky / Mastodon
Connect with School of Engineering >>> Twitter/X / Instagram / LinkedIn / Facebook