


In his 1942 short story, “Runaround,” Isaac Asimov introduced his Three Laws of Robotics:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
In this episode of the Plutopia podcast, Cindy Grimm and Bill Smart argue that Asimov’s laws miss the mark as a real-world framework for robot ethics.
Bill Smart:
Asimov’s three laws, everyone brings them up. They were a plot device by a guy – who had never seen a robot – to drive story. If you were to think about it for five seconds, they are a reasonable three things to come up with. If you think about it for ten seconds, they fall apart, of course. We do need structures, and we do need a way of thinking about the ethics and morality of the social impact of robots. But I think it’s way more nuanced – way, way, way more nuanced – than you could compile into a paragraph of text.
Cindy Grimm:
I mean something even simple like, a robot should always do what you tell them to do, they should always get out of your way, that might not actually make sense. If the robots are all trying to get out of your way and so on, but there’s a needed delivery or they actually need to get somewhere, then maybe the robot does actually need to take priority over the humans. You can’t really have those conversations when you start from the place of, you know, all robots should kowtow to humans and stuff.
Cindy Grimm is an American computer scientist, roboticist, and mechanical engineer. Bill Smart researches the areas of robotics and machine learning. Both are professors in the College of Engineering at Oregon State University.
The post Cindy Grimm and Bill Smart: Robotics first appeared on Plutopia News Network.
By Plutopia News Network
