


There is a tendency, particularly in militaries, to view machines as less fallible than humans. The rapid and enthusiastic adoption of AI tools in military headquarters is a striking manifestation of machine/automation bias: operating at machine speed is seen by many in uniform as a panacea that, according to doctrine, confers a preordained right to victory. The critical lessons identified in the IDF's use of 'Lavender' and 'Where's Daddy?' in Gaza from 2024, apparently now built into the US military's MAVEN tool, have been ignored. According to Dr Elke Schwarz, militaries urgently need to understand and embrace human agency in decision-making: something present for millennia that is now actively being forgotten as AI tools and systems replace people. The loss of friction, debate, argument, dissent, and human discussion over targets and targeting should concern us all. As humans feel increasingly inferior to the AI tools they create, the old idea of Promethean Shame raises its head again: Elke advocates taking back control of technology rather than simply adapting ourselves to it. As Christopher Coker put it in Warrior Geeks (Hurst, 2013): "We must choose our tools carefully, not because they are inhumane (all weapons are) but because the more we come to rely on them, the more they shape our view of the world."
By Peter Roberts
