In 2018, a prototype autonomous vehicle owned by Uber killed a bicyclist. If a human being had done such a thing, we would have had no problem assigning moral responsibility for that action. We would ask: was the driver negligent, speeding, or drunk? Or did the bicyclist dart out in front of the car, giving the driver insufficient time to stop? The answers to these questions would help us decide whether the driver was responsible for the bicyclist’s death. But guilt, morality, and ethics don’t really apply to machines. Alan Rubel is an Associate Professor at the University of Wisconsin who works both in the School of Information Technology and the Law School. Alan joined Monday Buzz host Brian Standing on November 18, 2019.