


When machines can choose to kill, what happens to the rules of war? Lethal Autonomous Weapons Systems—known as LAWS—are no longer the stuff of science fiction. These AI-powered systems can identify, target, and engage enemies without direct human input, raising critical questions about accountability, ethics, and the future of warfare. On this episode of WMD, Dr. Tamara Schwartz hands off the mic to York College Cybersecurity Management majors Charlie Malone and Nate Rugh, who unpack how LAWS are reshaping global defense strategies, from autonomous drones to AI-guided missile systems. Are these weapons making war faster, cleaner, and more precise, or are we entering a dangerous era where algorithms decide who lives and who dies? As militaries race to integrate autonomy into the battlefield, they ask: where should the human decision-maker fit in, and what happens when they're left out entirely?
By Dr. Tamara Schwartz