By John Rodsted
In this episode of Stay in Command we showcase the ICRC perspective on autonomous weapons and the need for limits. Our guest is Neil Davison, a weapons and disarmament expert from the ICRC's Department of Law and Policy at its Geneva headquarters, who brings both his own insight and the ICRC's view.
Content in this episode:
ICRC mandate and approach to weapons issues [1:14]
Humanitarian concerns of autonomous weapons [2:27]
Legal Issues [5:40]
Human Control as a core concept [8:14]
Ethical perspective [10:36]
Limits on autonomy [12:28]
International talks and 'crunch time' [16:59]
If you have questions or concerns please contact us via [email protected]
If you want to know more, look for Australia Campaign to Stop Killer Robots on Facebook, Twitter and Instagram, or use the hashtag #AusBanKillerRobots.
Become part of the movement so we Stay in Command!
For access to this and other episodes along with the full transcription and relevant links and information head to safeground.org.au/podcasts.
Our podcasts come to you from all around Australia and we would like to acknowledge the Traditional Owners throughout and their continuing connection to country, land, waters and culture.
Stock audio provided by Videvo, downloaded from www.videvo.net
In this episode of Stay in Command we discuss the diplomatic process and the progress being made regarding lethal autonomous weapons systems. The episode features Elizabeth Minor from the UK-based NGO Article 36, which works to prevent harm from weapons through stronger international standards and sits on the steering committee of the Campaign to Stop Killer Robots. We look at how this issue has progressed on the diplomatic stage, unpack key themes in the current debate, and consider where the process must go.
Content in this episode:
Overview of this issue on the diplomatic stage to date [00:01:56]
Human Control Unpacked [00:09:05]
Human control in international debates [00:12:57]
Notion of the so-called 'entire life cycle' of the weapon [00:14:50]
False solution of proposed techno-fixes [00:18:35]
Limitations of Article 36 Weapons Reviews [00:20:50]
Debates around definition [00:25:58]
Going beyond the guiding principles [00:29:28]
Arriving at 'consensus recommendations' [00:30:31]
How a Legally Binding Instrument might look [00:33:26]
How we get there: diplomatic avenues [00:37:06]
The need for decisive action and leadership [00:38:42]
If you have questions or concerns please contact us via [email protected]
If you want to know more, look for Australia Campaign to Stop Killer Robots on Facebook, Twitter and Instagram, or use the hashtag #AusBanKillerRobots.
Become part of the movement so we Stay in Command!
For access to this and other episodes along with the full transcription and relevant links and information head to safeground.org.au/podcasts.
Our podcasts come to you from all around Australia and we would like to acknowledge the Traditional Owners throughout and their continuing connection to country, land, waters and culture.
Stock audio provided by Videvo, downloaded from www.videvo.net
The Tech Perspective with Lizzie Silver: Technological Aspects and the Tech Industry
This episode of Stay in Command focuses on the technological dimensions and concerns of lethal autonomous weapons, as well as their implications for the tech industry. Our guest, Dr Lizzie Silver, is a Senior Data Scientist at Melbourne-based AI company Silverpond.
Content in this episode:
Troubling reality of these weapons [1:49]
Problems with fully autonomous weapons: explainability [3:49]
Facial recognition and bias [7:11]
Military benefits from technical point of view [11:36]
Machines and the kill decision [15:01]
Hacking [16:30]
Positive uses of AI and funding battle [17:10]
Challenge of Dual Use [20:45]
Regulation: Treaty, Company Policy, Individual Actions [22:16]
If you have questions or concerns please contact us via [email protected]
If you want to know more, look for Australia Campaign to Stop Killer Robots on Facebook, Twitter and Instagram, or use the hashtag #AusBanKillerRobots.
Become part of the movement so we Stay in Command!
For access to this and other episodes along with the full transcription and relevant links and information head to safeground.org.au/podcasts.
Transcript:
Welcome to SafeGround, the small organisation with big ideas working in disarmament, human security, climate change and refugees. I’m Matilda Byrne.
Thank you for tuning in to our series Stay in Command where we talk about lethal autonomous weapons, the Australian context and why we mustn’t delegate decision making from humans to machines.
This episode we’re looking at the “Tech Perspective”. We are going to discuss the technological concerns of lethal autonomous weapons and their implications on the tech industry.
And so I have a great guest with me today in Dr Lizzie Silver. Lizzie is a Senior Data Scientist at Silverpond, an AI company based in Melbourne, which is also where I am coming to you from. So welcome Lizzie, thanks so much for joining us today.
Lizzie Silver [00:00:52]: Thanks for having me.
Matilda Byrne: Before we jump in, I'm just going to talk a bit about the definition of killer robots, in case any of our listeners are unfamiliar with exactly what it is we're talking about.
So killer robots, or fully autonomous weapons, are weapons that have no human control over the decision making. So when they select and engage a target, that is, decide to deploy lethal force on that target, there is no human involved in that process; it is based purely on AI and algorithms. So with these fully autonomous weapons there are lots of concerns that span a number of different areas. Today we are going to go into the technological concerns in particular, because we have Lizzie and her expertise, but there are also moral, ethical, legal and global security concerns - a whole host of concerns really.
What is the most concerning thing about killer robots? [00:01:49]
Matilda Byrne: And what I’m interested in Lizzie, is, just to start off with if you could tell us what is it about fully autonomous weapons that you find the most worrying, so what about them makes you driven to oppose their development.
Lizzie Silver: It's really a fundamental issue with these weapons: you can't give a guarantee on how they're going to behave. With humans we can't give a guarantee on how they're going to behave either, but that's why we have all these mechanisms for holding a human accountable. Now, you can't hold an algorithm accountable in any meaningful way. So what you would like to do is find a way to characterise how...
Yennie Sayle is completing a Bachelor of International Studies at RMIT University and is the Youth Engagement intern with SafeGround for the Campaign to Stop Killer Robots Australia.
She sits down with three other students from different areas of study and with different experiences to talk killer robots: the level of exposure to the topic in their degrees, their views and concerns, their thoughts on university involvement in the development of these weapons, raising awareness among students and more.