
“I think there’s a moral question that one has to ask in general about whether it’s appropriate for a machine to make a decision as to whether or not a human ought to live or die”
[Editor’s Note: As observed in TRADOC Pamphlet 525-92, The Operational Environment 2024-2034: Large-Scale Combat Operations:
“The increase in the production, employment, and success of uncrewed systems means the Army can expect to encounter these systems across the breadth and depth of LSCO.”
Contemporary conflicts in Ukraine and the Middle East have witnessed the burgeoning use of autonomous weapons — empowering lesser states (e.g., Ukraine) and non-state actors (e.g., the Houthi Movement in Yemen) to conduct asymmetric strikes against nations with more robust military capabilities (i.e., Russia and Israel, respectively). These capabilities are transforming warfighting in both the air/land and land/sea littorals, eroding and possibly negating traditional concepts of air and naval superiority. The battlefield successes achieved with these autonomous technologies have led to their rapid proliferation around the globe, with Transnational Criminal Organizations (TCOs) like the Jalisco New Generation Cartel (CJNG) effectively employing armed Unmanned Aerial Vehicles (UAVs) against their criminal competitors and the Mexican authorities alike.
In the ongoing race to develop more effective (read: lethal) combat systems capable of overcoming kinetic and electromagnetic countermeasures, some nations are integrating Artificial Intelligence (AI) and Machine Vision (MV) with Lethal Autonomous Weapons Systems (LAWS) — in essence removing human operators from being in or on the OODA loop. U.S. policy on LAWS is documented in DoD Directive 3000.09, Autonomy in Weapon Systems, which includes the following statement:
“Autonomous and semi-autonomous weapon systems will be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force.”
Per the U.S. Congress’s Defense Primer: U.S. Policy on Lethal Autonomous Weapon Systems:
“U.S. policy does not prohibit the development or employment of LAWS. Although the United States is not known to currently have LAWS in its inventory, some senior military and defense leaders have stated that the United States may be compelled to develop LAWS if U.S. competitors choose to do so. At the same time, a growing number of states and nongovernmental organizations are appealing to the international community for regulation of or a ban on LAWS due to ethical concerns.”
Today’s episode of The Convergence podcast features Dr. Mark Bailey, Department Chair, Cyber Intelligence and Data Science, National Intelligence University, exploring the tension that exists between the rapid convergence of AI and battlefield autonomy and our national values requiring transparency and oversight in our use of lethal force. With this tension, there is also an associated asymmetry in ethics — our adversaries are racing ahead with their plans to harness the power of AI on the battlefield. Military thinkers within the People’s Liberation Army (PLA) embrace its prospects as a leapfrog technology that could allow China to skip technological development stages and rapidly overmatch U.S. capabilities. Russia’s Vladimir Putin proclaimed “Artificial intelligence is the future not only of Russia but of all of mankind… Whoever becomes the leader in this sphere will become the ruler of the world.” Read on to learn more about the implications of LAWS in the Operational Environment!]
Dr. Mark Bailey writes about the intersection between artificial intelligence, complexity, and national security. He is an associate professor at the National Intelligence University, where he is the Department Chair for Cyber Intelligence and Data Science, as well as the Director of the Biological and Computational Intelligence Center. His work has appeared in publications such as the journal Futures, Nautilus, and Homeland Security Today, and he was named to Homeland’s 50 Trailblazers of 2023. Previously, he worked as a data scientist on several AI programs in the U.S. Department of Defense and the Intelligence Community. He is also an Officer in the U.S. Army Reserve.
In our latest episode of The Convergence podcast, Army Mad Scientist sat down with Dr. Bailey to discuss his thoughts on AI and autonomous weapons, how their rise is impacting the U.S. Army, and how our adversaries may be poised to use them against us. The following bullet points highlight key insights from our conversation:
Stay tuned to the Mad Scientist Laboratory for our next insightful episode of The Convergence on 11 September 2025, when we sit down with Luke Miller, Director of the College of William and Mary’s Wargaming Lab, to discuss the university’s on-going wargaming projects with the DoD, his thoughts on wargame design and education in the military, and the future of wargaming.
If you enjoyed this post, check out the TRADOC Pamphlet 525-92, The Operational Environment 2024-2034: Large-Scale Combat Operations
Explore the TRADOC G-2’s Operational Environment Enterprise web page, brimming with authoritative information on the Operational Environment and how our adversaries fight, including:
Our China Landing Zone, full of information regarding our pacing challenge, including ATP 7-100.3, Chinese Tactics, How China Fights in Large-Scale Combat Operations, BiteSize China weekly topics, and the People’s Liberation Army Ground Forces Quick Reference Guide.
Our Russia Landing Zone, including the BiteSize Russia weekly topics. If you have a CAC, you’ll be especially interested in reviewing our weekly RUS-UKR Conflict Running Estimates and associated Narratives, capturing what we learned about the contemporary Russian way of war in Ukraine over the past two years and the ramifications for U.S. Army modernization across DOTMLPF-P.
Our Iran Landing Zone, including the Iran Quick Reference Guide and the Iran Passive Defense Manual (both require a CAC to access).
Our North Korea Landing Zone, including Resources for Studying North Korea, Instruments of Chinese Military Influence in North Korea, and Instruments of Russian Military Influence in North Korea.
Our Irregular Threats Landing Zone, including TC 7-100.3, Irregular Opposing Forces, and ATP 3-37.2, Antiterrorism (requires a CAC to access).
Our Running Estimates SharePoint site (also requires a CAC to access) — documenting what we’re learning about the evolving OE. Contains our monthly OE Running Estimates, associated Narratives, and the quarterly OE Assessment TRADOC Intelligence Posts (TIPs).
Then review the following related TRADOC G-2 and Mad Scientist Laboratory content:
Adaptation… Ukraine Conflict’s UAV Evolution, by Colin Christopher
Thoughts on AI and Ethics… from the Chaplain Corps, by Dr. Nathan White
On the Ground and In the Air in Ukraine, and associated podcast with Wolfgang Hagarty
Insights from Ukraine on the Operational Environment and the Changing Character of Warfare
Learning from LSCO: Applying Lessons to Irregular Conflict, by Ian Sullivan and Kate Kilgore
Asymmetric Warfare across Multiple Domains, by Ethan Sah
Integrating Artificial Intelligence into Military Operations, by Dr. James Mancillas
“Own the Night,” as well as Former Deputy Secretary of Defense and proclaimed Mad Scientist Mr. Bob Work’s presentation from the Disruption and the Future Operational Environment Conference on AI and Future Warfare: The Rise of the Robots (and Army Futures Command), and his Modern War Institute podcast assessing the future battlefield.
Unmanned Capabilities in Today’s Battlespace
Revolutionizing 21st Century Warfighting: UAVs and C-UAS
Death From Above! The Evolution of sUAS Technology and associated podcast, with COL Bill Edwards (USA-Ret.)
The Operational Environment’s Increased Lethality
Top Attack: Lessons Learned from the Second Nagorno-Karabakh War and associated podcast, with proclaimed Mad Scientist COL John Antal (USA-Ret.)
Jomini’s Revenge: Mass Strikes Back! by proclaimed Mad Scientist Zachery Tyson Brown
Insights from the Robotics and Autonomy Series of Virtual Events, as well as all of the associated webinar content (presenter biographies, slide decks, and notes) and associated videos
Through Soldiers’ Eyes: The Future of Ground Combat and its associated podcast
“Intelligentization” and a Chinese Vision of Future War
The PLA and UAVs – Automating the Battlefield and Enhancing Training
A Chinese Perspective on Future Urban Unmanned Operations
China: “New Concepts” in Unmanned Combat and Cyber and Electronic Warfare
The PLA: Close Combat in the Information Age and the “Blade of Victory”
“Once More unto The Breach Dear Friends”: From English Longbows to Azerbaijani Drones, Army Modernization STILL Means More than Materiel, by Ian Sullivan
Rapid Adaptation
Turkey and the TB-2: A Rising Drone Superpower and its associated podcast, with Karen Kaya
Disclaimer: The views expressed in this blog post do not necessarily reflect those of the U.S. Department of Defense, Department of the Army, Army Futures Command (AFC), or Training and Doctrine Command (TRADOC).