
Stanford’s Karen Liu is a computer scientist who works in robotics. She hopes that someday machines might take on caregiving roles, like helping medical patients get dressed and undressed each day. That quest has given her special insight into just how monumental a challenge such seemingly simple tasks can be. After all, she points out, it takes a human child several years to learn to dress themselves; imagine what it takes to teach a robot to help a person who is frail or physically compromised.
Liu is among a growing coterie of scientists promoting “physics-based simulations” to speed up the learning process for robots. Rather than building actual robots and refining them as they go, she uses computer simulations to improve how robots sense the physical world around them and make intelligent decisions amid real-world changes and perturbations, such as those involved in getting dressed for the day.
To do that, a robot must understand the physical characteristics of human flesh and bone, as well as a person’s movements and underlying intentions, so that it can recognize when a garment is or is not going on as expected.
The stakes are high: a mistake could physically harm the patient, as Liu tells Stanford Engineering’s The Future of Everything podcast, hosted by bioengineer Russ Altman. Listen and subscribe here.
Connect With Us:
Episode Transcripts >>> The Future of Everything Website
Connect with Russ >>> Threads / Bluesky / Mastodon
Connect with School of Engineering >>> Twitter/X / Instagram / LinkedIn / Facebook
4.8 (127 ratings)