UN Treaty on Nuclear Weapons
00:00:03
Speaker
from the FLI audio files. I'm Ariel Conn with the Future of Life Institute. This year I've had the tremendous privilege to attend discussions at the United Nations in New York, where well over a hundred delegates from around the world have been hard at work drafting the text for a treaty that will finally ban nuclear weapons. In fact, the final text of the treaty is expected to be completed by July 7th.
00:00:32
Speaker
That means that as of this summer, it would be officially illegal on an international stage for any country participating in the ban to have anything to do with nuclear weapons. However, while the UN may be in the final stages of negotiating a treaty, this was by no means a quick process, and no one expects nuclear weapons to just disappear overnight once it's signed. There will still be a lot more work before we can achieve the ultimate goal of a world free of nuclear weapons. And
UN's Role in Weapon Bans
00:00:59
Speaker
nuclear weapons aren't the only weapons the UN considers.
00:01:02
Speaker
They've banned chemical weapons, biological weapons, landmines and cluster munitions, and now they're also considering lethal autonomous weapons systems, which are of particular interest to FLI. So how does a weapon go from being one of the most feared to being banned? And what happens once the weapon is finally banned? To discuss
Peace Campaigns and Humanitarian Disarmament
00:01:21
Speaker
these questions, Miriam Struyk and Richard Moyes have both kindly joined the podcast today. Miriam is currently programs director at PAX,
00:01:29
Speaker
which is a Dutch peace organization operating in 15 conflict areas. Miriam started her work as a peace activist during the war in Bosnia and continued to support peace processes and activists in numerous conflict-affected states. She played a leading role in the campaign banning cluster munitions, developed global campaigns to prohibit financial investments in producers of cluster munitions and nuclear weapons, and led a team working on humanitarian disarmament at PAX.
00:01:56
Speaker
Richard is the managing director of Article 36, a UK-based NGO working to prevent harm caused by certain weapons. Richard has worked closely with the International Campaign to Abolish Nuclear Weapons, he helped found the Campaign to Stop Killer Robots, and he coined the phrase "meaningful human control" regarding autonomous weapons. Before establishing Article 36, Richard was influential in the development of the 2008 Convention on Cluster Munitions,
00:02:22
Speaker
and his previous work has involved landmine clearance and explosive ordnance disposal operations. Richard and Miriam, thank you both for joining me today. Thanks for having us. Yeah, thanks, Ariel. So I would like to start, I think, with the nuclear negotiations that are happening now. The UN is currently working on drafting text for a treaty that will officially ban nuclear weapons.
00:02:44
Speaker
That's been a very long process, and I know you both have been involved. And I was curious if you could talk a little bit about what you're hoping will come of the treaty. We know that the United States and Russia and China probably aren't going to sign on, along with the other nuclear states. So what is the goal of the treaty? Richard, you were actually at the negotiations. Why don't we start with you?
Power Dynamics in Nuclear Weapon Politics
00:03:06
Speaker
Sure. Thanks, Ariel. I think what we're fundamentally seeing here is a process of bringing international law into alignment with sort of straightforward moral and humanitarian standards.
00:03:19
Speaker
The other weapons of mass destruction, chemical weapons and biological weapons, have already been prohibited. But nuclear weapons have somehow persisted in something of a legal limbo over recent decades. And in many ways, I think this is because of political power dynamics internationally, in that some of the most powerful countries in the world possess nuclear weapons. Now, the relationship between that power and the possession of nuclear weapons is perhaps complicated and contentious. But somehow that configuration has
00:03:48
Speaker
essentially stopped the international legal regime from treating nuclear weapons in a straightforward manner. Yet this process really came out of rigorous consideration of the humanitarian impact that would come from the use of nuclear weapons, from the use of a single nuclear weapon that would potentially kill hundreds of thousands of people in an urban area or a populated area.
00:04:10
Speaker
up to the use of multiple nuclear weapons, which could have devastating impacts for, well, for human society and for the environment as a whole. And so looking at that from the starting point of that humanitarian impact and of those catastrophic humanitarian consequences, it becomes pretty straightforward, I think, that by all the legal measures that have been used in the past, these weapons should be considered illegal, because their effects simply cannot be
00:04:35
Speaker
contained or managed in a way that avoids massive humanitarian suffering, massive impact on civilians and the like. And so in a way, this process is an assertion of the validity and the primacy of the legal regime and of the legal principles that have developed over the last hundred years in international law governing weapons and conflict,
00:04:54
Speaker
and saying, okay, these political dynamics may be what they are, but these legal principles need to be maintained and need to be asserted regardless of that. And if some states, those states that have nuclear weapons, are not prepared to
00:05:08
Speaker
sign up to that for now, we have accepted that from the beginning of this process, I think. But at the same time, it's a process that's changing the landscape against which those states continue to maintain and assert the validity of their maintenance of nuclear weapons. And by changing that legal background, and in a way the political background that comes with that,
00:05:27
Speaker
I think we're potentially in a position to put much more pressure on those states to move towards disarmament. As a long-term agenda, I mean, I think we shouldn't have any illusions that this is going to suddenly transform into disarmament in one or more states. But progressively over time, this legal shift will make, I think, a fundamental difference to how arguments and perceptions of nuclear weapons play out, and ultimately about how political decisions get made about nuclear weapons, which is the critical issue here.
00:05:56
Speaker
I very much agree with what Richard said. And at a time when you see an erosion of international norms, as we do nowadays, I think it's quite astonishing that in, well, less than two weeks' time, we'll see an international treaty banning nuclear weapons.
Redefining Nuclear Weapons and Collective Security
00:06:12
Speaker
And on your question of what this treaty will bring: to me, it will first and foremost bring back the notion that we're speaking about weapons here. For too long, nuclear weapons were some kind of mythical, symbolic weapons
00:06:24
Speaker
that were spoken about at the NPT and many other fora, but we never spoke about what these weapons actually do and whether we think that's legal, and also, in moral terms, acceptable or not. So this treaty actually brings back the notion of: what does this weapon do, and do we want that? So I think that's first. And second, what this treaty also brings is a sense of collective security. For too long, the real debate about nuclear weapons was held by only a few states.
00:06:54
Speaker
And the others basically had to wait. So what we see now is that more than 100 states actually take the power back, get together, and make a statement, and even make it into a legal treaty: this is what we, the majority of states, want, and this is how we see collective security. And then, perhaps as the last thing, it also brings a kind of democratization of security policy.
00:07:22
Speaker
So what you actually see now in New York is a process that was brought about by several states, but also by NGOs, by the ICRC and other actors. And especially when it comes to speaking about nuclear weapons, I think it's so important that it's actually citizens and citizens' groups again speaking about nukes and whether we think that's acceptable or not. So I think it's a huge, huge achievement on various levels.
00:07:49
Speaker
Miriam, I think I want to follow up with you on this one. I know PAX especially has done a lot of work on the financial impact that a ban could have on creating nuclear weapons. Is that anything you can follow up on? Yeah, sure. It's also perhaps a bit of an answer to your question of what a treaty will change. We've been involved, Richard and me as well, in the international treaty to ban cluster munitions. And that's basically how I started to work also on disinvestment and weapons.
00:08:20
Speaker
So we started to investigate: who are the producers of cluster munitions? Who is investing in cluster munitions? And then we started, year by year, documenting this and also campaigning on this issue. And our work was supported by the Cluster Munitions Convention, because it has article 1(c), which says states should not assist in any way with the development or production or use of cluster munitions. Now, over time, NGOs like PAX, but also others,
00:08:49
Speaker
but also states, have repeatedly said: well, we interpret the word assistance also as financial investment. So the norm grew that financial institutions, but also states, should not
Financial Aspects of Nuclear Weapons
00:09:03
Speaker
invest in producers of cluster munitions. And we had some success. We saw the capital shrinking, and in the end we even had, for example, a defense company saying: well, we will stop
00:09:16
Speaker
producing cluster munitions because we're getting into trouble with financial institutions and financial investments. So as soon as we started to get involved in ICAN, the International Campaign to Abolish Nuclear Weapons, we thought we should also do the same for nuclear weapons. So we started yearly research called Don't Bank on the Bomb, a yearly global report on the financing of nuclear weapons producers. For example, for the period 2013 to 2016,
00:09:46
Speaker
we documented that there was more than half a trillion US dollars invested in 27 companies producing nuclear weapons. And we looked at more than 500 financial institutions, so banks, pension funds, insurance companies. And we listed 319 financial institutions investing in nuclear weapon producers. So there is huge capital involved. And it's also capital that is needed by these nuclear weapon producers and contractors.
00:10:16
Speaker
So it's a direct way to influence. And especially now, when you see a lot of modernization going on with nuclear weapon arsenals and plans thereof, we think this can be a powerful tool. And we also think it's powerful because it's something that citizens, that clients, can actually influence. It's the "not with my money, not in my name" argument that citizens and clients can make. And speaking about the treaty which is currently being negotiated in New York,
00:10:44
Speaker
in the treaty text so far, it says again, as in the Cluster Munitions Convention, that states should not assist. Unfortunately, there is no explicit mention of a prohibition of financial assistance, but we have heard a lot of states echoing that assistance in their view also means financial investment. So hopefully this will change not only the discourse a bit, but also the actual capital that goes into nuclear weapon producers.
00:11:15
Speaker
Okay, thank you. So as you mentioned, you've both worked on the cluster munitions ban and you're both now working on the nuclear weapons ban, which is hopefully about to be completed. So I'm going to want to compare how autonomous weapons are similar and different to both of those cases. But first, Richard, I was hoping you could talk a little bit about what autonomous weapons systems are.
00:11:37
Speaker
what you mean by meaningful human control. Just give us a little bit of a background and maybe also why we're worried about these types of weapons. Yeah. Thanks,
Dangers of Autonomous Weapons
00:11:47
Speaker
Ariel. I mean, I wonder if I might just backtrack a little bit first, just because your introduction situated this conversation about autonomous weapons in the context of cluster munitions and nuclear weapons. And I think an important thing to recognize in all of these contexts is that these weapons don't prohibit themselves.
00:12:05
Speaker
And on all of these issues, on anti-personnel mines before that, weapons have been prohibited because a diverse range of actors from civil society and from international organizations and from states have worked together to document problems, to frame problems, identify problems, and to develop the sort of legal responses. So I'm just saying that by way of introduction to where we are on autonomous weapons, which is still at a relatively
00:12:32
Speaker
I think early stage of the debate, I mean, obviously, by comparison with nuclear weapons, which have been understood to be problematic for a very long period of time now, autonomous weapons is really an issue of new and emerging technologies and the challenges that new and emerging technologies present to society, particularly when they're emerging in a military sphere and in a sphere which is essentially about how we're allowed to kill each other or how we're allowed to use technologies to kill each other.
00:13:01
Speaker
This particular issue has certain distinct challenges because of that future orientated basis to the work. But essentially autonomous weapons is a movement in technology. It's an idea in the movement of technology to a point where we will see essentially computers and machines making decisions about where to apply force, about who to kill when we're talking about people or what objects to destroy when we're talking about material.
00:13:32
Speaker
Now, choosing is in itself a slightly loaded, almost human-sounding word, but essentially it's that trajectory and that movement that we're concerned about here: that we're going to enter a phase of technology where decision-making over life-or-death matters, and about where force is applied, is essentially in the hands of computers and sensors, and the algorithms within those computers that are designed to make these kinds of targeting decisions.
00:14:01
Speaker
And in that movement, we're in many respects seeing a stepping back of human decision making or the sort of proximity of human decision making to the application of force. Now, already in the use of force, of course, there's distance between people applying force in certain contexts and the actual effects that are being experienced. But I think we see in the movement towards greater autonomy in weapon systems, a very dangerous sort of threshold that's being approached, whereby a significant amount of
00:14:31
Speaker
essentially moral power is potentially being invested in technology where it should rightly sit with humans. What is the extent of us using autonomous weapons today, versus what we're anticipating will be designed in the future? Well, I don't want to sound dull, but it depends a lot on your definition, of course. But to go back a little bit: if you look back and you see how much effort
00:14:59
Speaker
well, how much effort we have had to make to prohibit nuclear weapons, it would have been so much easier if they had never been developed, let alone used at all. So in that sense, I see a comparison: if you look at nukes as being, as the Future of Life Institute quoted earlier, the second generation of warfare, so the first generation being gunpowder, the second nuclear weapons, and then the third generation is going to be lethal autonomous weapons systems, or killer robots,
00:15:29
Speaker
then I'm still in a way a little bit of an optimist, by saying that perhaps we can prevent the emergence of lethal autonomous weapons systems. At least looking at nukes, I just hope we are able to prevent it. But I also see some other similarities, in the sense that lethal autonomous weapons systems, like nuclear weapons a few decades ago, can lead to an arms race and can lead to more global insecurity,
00:15:55
Speaker
and can also lead to warfare that's way out of control. So in that sense, I see some similarities between our work on nuclear weapons and our work on killer robots. For me as a person, I sometimes find it difficult, to be honest, to work on the issue of lethal autonomous weapon systems. Because as a campaigner, I was always extra motivated by going to places where
00:16:22
Speaker
horrible weapons were used; you speak to people, you see the impact, and together with victims and survivors and others, you do the best you can to regulate or sometimes prohibit certain weapons. And as you said, with killer robots, of course depending on the definition, let's say we don't see them yet being used on the battlefield, which makes it more difficult, to be honest, as a campaigner. On the other hand, the stakes are so high,
00:16:51
Speaker
and warfare can get so out of control if we don't try to at least maintain a certain level of human control over certain parts of the use of weapon systems, that it's worth doing. This actually brings up one of my questions. The way we're approaching lethal autonomous weapon systems is to try to ban them before we see these horrible humanitarian consequences.
Challenges in Banning Autonomous Weapons
00:17:17
Speaker
So how does that change your approach?
00:17:19
Speaker
Are there other precedents we've seen, where something was banned before any negative effects could occur? It definitely has an impact on the approach. And I completely agree with what Miriam was just saying about the experience of harm on the ground often being a driver for work on these issues,
00:17:38
Speaker
both in terms of the evidence that supports action, but also I think in terms of individual people's motivation regarding why they're doing what they're doing. And I think the fact that this is a more future orientated debate definitely creates some different sort of
00:17:52
Speaker
dynamics in relation to that issue of to what extent there's an immediate humanitarian problem that's being addressed. But other weapon systems have been prohibited. Blinding laser weapons, under the Convention on Certain Conventional Weapons, were prohibited at a point when there was a concern that the development of laser systems designed to blind people was going to become a feature of the battlefield.
00:18:14
Speaker
In terms of autonomous weapons, I think we already see significant levels of autonomy in certain weapons systems today. And again, I agree with Miriam in recognizing that certain definitional issues are very important, and these definitional issues are also politically very significant. So I don't see the lack of definitions at this point in time as a problem. Rather, they create a bit more openness in the space for discussion and debate amongst different actors. So that's not necessarily a shortcoming.
00:18:44
Speaker
But already we have weapon technologies that rely substantially on sensors and then computers to identify targets and to respond to them. At present, they're relatively constrained in their scope of operations. Significantly, we see this in missile defense systems on ships, but also elsewhere. And those systems may have substantial autonomy within the time period within which they're being activated by a human, but they're also
00:19:13
Speaker
sort of operating within a fairly clearly defined operational space. And so the implications of that autonomy are being contained by the way in which they work and the way in which human control can still be exerted over them.
00:19:26
Speaker
Coming back, perhaps, to the last part of your question: one of the ways we've sought to orientate to this is by thinking about the concept of meaningful human control, as a way of pushing the debate onto what are the elements of human control and human authority over weapons systems, decisions to use force, and decisions about specifically when and where force will be applied. What are the human elements there that we feel it's important to retain, because
00:19:56
Speaker
I think inevitably I'm wary of conceding any inevitability in these things, because inevitability itself is something that's contested in these discussions. But I think we are going to see more and more autonomy within military operations, in all sorts of different functions, in logistical functions, and in support functions, and in the operation of certain systems.
00:20:19
Speaker
But in certain critical functions, around how targets are identified and how force is applied and over what period of time, I think in those areas we start to see potentially an erosion of a level of essentially moral human engagement that is fundamentally important to retaining some potential for humanity in the context of conflict. So for us, I think some focus on defining that positive human space is perhaps useful in a context where
00:20:48
Speaker
different technologies may take many forms in the future. And autonomy may present itself in many technological manifestations, technological forms. And so having at least some shared understanding of what is the key human element that we want to retain, I think, is probably a useful starting point. I very much agree with Richard. And that's also what makes this campaign slightly different from other disarmament campaigns.
00:21:17
Speaker
This is not so much about a weapon system; this is about how we control warfare, and how we actually maintain human control in the sense that it's actually a human deciding who is a legitimate target and who isn't. Autonomy, of course, is a continuum, but it's human control over the critical functions of selecting and attacking targets that we would like to maintain. And this campaign,
00:21:45
Speaker
as is, of course, also the case with the nuclear weapons campaign, is for PAX very much a campaign driven by ethical motivation. So we have a lot of legal arguments against lethal autonomous weapons, we have security arguments against lethal autonomous weapons, but it's mainly an ethical notion: what do we as mankind want to preserve? And we believe that these kinds of systems, where it's actually no longer a human
00:22:13
Speaker
having the final say about who's a target and who isn't, that goes against, well, it goes against human dignity, and it outsources human control to systems, which we think is not only extremely dangerous but also highly, highly unethical. So as a campaign, we constantly try to speak about what we actually want to maintain, and how to have a constructive discussion on what meaningful human control is in space and time.
00:22:43
Speaker
And if states say, well, we're not that worried about that, then have a look at your current systems, and the current, as we sometimes say, precursors, and figure out why you think there is still meaningful human control within those systems, and then build up a discussion on where you want to draw the line and how we can actually draw a line, be it by a legal treaty or in other ways. You mentioned autonomous systems are on a continuum, and you mentioned drawing this line.
Debate on Autonomous Weapons and Civilian Safety
00:23:13
Speaker
I know one of the arguments that I've heard in favor of autonomous weapons is that they ideally make decisions better than humans and potentially reduce civilian casualties. I'm wondering, how do you address that argument? And is there a point on that spectrum where autonomy is beneficial? And if so, how do you know when it's crossed the line into not being beneficial? Or do you prefer no autonomy in weapon systems at all? Well, I've always been wholly critical of states, industries and other actors
00:23:43
Speaker
with a high belief in technological fixes. And I think you see that in this debate as well. So suddenly these, let's say, hypothetical weapons are able to produce clean warfare or to prevent civilian casualties. Well, we've had that debate with other weapons systems as well, where the technological possibilities were not what they were promised to be as soon as they were used. And I also think it's a bit of an unfair debate, in a sense, because it's mainly states
00:24:13
Speaker
with highly developed industries who are most likely the ones who will be using some form of lethal autonomous weapons systems first. And as soon as you flip the question and you say: well, what if these kinds of systems were used against your soldiers or in your country, and it's defended by the argument that it's actually more precise, et cetera, et cetera, then suddenly you enter a whole different debate. And then suddenly other arguments, for example,
00:24:42
Speaker
moral arguments, come into play. So I'm really highly skeptical of people who say, well, it could actually be beneficial. But I'm not sure if you agree, Richard. Yeah, I do agree. I think some skepticism is justifiable in this area as well. I mean, when we look back at other debates on weapons, the users of weapons who have
00:25:06
Speaker
asserted that these weapons should not be prohibited, or that they could be used perfectly acceptably, have not generally taken the lead in gathering, in a transparent form, the data on who was actually being killed and injured by the weapons that they've used. So in the past, we've seen a bit of a deficit of accountability
00:25:26
Speaker
and of responsibility-taking about civilian harm, if that's the issue that we're concerned about here. I would say in certain areas that's improved over recent years, and there has been a bit more of a focus on recognizing the need for casualty recording and casualty documentation, but there have definitely been some deficits there. But in a way it comes back to, well, a number of issues. I mean, there's a sort of dynamic of how we should orientate to this in general:
00:25:52
Speaker
whether we should take that sort of hypothetical assertion of the moral superiority of these systems strongly, or whether we should adopt a more cautionary, precautionary orientation to the development of such systems. And then I think there's a sort of moral issue in all of this, where I don't know if I can always articulate quite what I'm
00:26:15
Speaker
feeling on this point. But I feel like the sort of assertions of goodies and baddies, and our ability to label one from the other and to categorize people and things in society in such an accurate way, is also itself somewhat illusory, and something of a misunderstanding of the reality of conflict in society, perhaps. So I feel like any claims that we can somehow
00:26:42
Speaker
perfect violence, in a way where it can be just distributed by machinery to those who deserve to receive it, and that there's no tension or moral hazard in that, I think is extremely dangerous as an underpinning concept. Because in the end we're talking about embedding the categorization of people and things within a sort of micro-bureaucracy of algorithms and labels. And in society as a whole, when we've seen
00:27:11
Speaker
that sort of bureaucratization of violence in the past, over the last hundred years or so, it's generally represented an extremely negative relationship of the state to a wider population. And there's a fundamental arrogance, I think, in assuming that we can somehow code, at one period of time, a structure of identification of people that can reduce them to target and non-target labels quite that simplistically.
00:27:39
Speaker
I mean, all of this within a context where, you know, this isn't to say that some degrees of computer engagement in these functions can't be managed reasonably. But as a whole, the sort of assertion that autonomous weapons will be better than humans, I think, is a very dangerous notion. And I think it perhaps also misses the point that violence in society is a human problem, and it needs to continue to be messy to some extent if we're going to recognize it as a problem, and
00:28:10
Speaker
not get into such an ultimately authoritarian structure that we forget what violence really means, which is a messy moral failing, I think. Okay. I'd like to bring this back to the United Nations discussion. What
UN Process for Autonomous Weapons Ban
00:28:26
Speaker
is the process right now for getting lethal autonomous weapons systems banned? Where are we? What is the UN considering? Are talks still going on? I think the last time I checked the UN website, it had most recently been updated in 2015.
00:28:40
Speaker
Yeah, I'm not sure about the website, but I think it's fair to say that when we started the international Campaign to Stop Killer Robots, it was officially launched in London in April 2013, if I'm not mistaken. And of course we built on the knowledge and the campaigning and the work that was done by others, like the International Committee for Robot Arms Control. But when we launched the campaign in 2013,
00:29:09
Speaker
it immediately gave a push to the international discussion, including the one in the Human Rights Council and within the Convention on Conventional Weapons in Geneva. And so we saw a lot of debates there in 2013 already, and in 2014 and 2015, where apparently the UN website stopped. But there have been some more discussions in the CCW, and the last one was in April.
00:29:35
Speaker
So a lot of discussions took place, intense discussions, sometimes a whole week of lengthy debates on definitions, but also on legal problems and ethical concerns. But it still did not lead to the start of a negotiation process. At the last CCW meeting, though, it was decided that a group of governmental experts (GGE) should start within the CCW to look at these types of weapons,
00:30:00
Speaker
which was applauded by many states and also by our campaign, because within the CCW framework, as we saw before, the GGE is kind of the portal to regulation or a ban, or, let's say, an additional protocol. But unfortunately, the GGE that was planned to be held in August this year in Geneva has been canceled, because not every UN state or CCW high contracting party paid their dues.
00:30:28
Speaker
So due to financial issues, that GGE has been canceled in August, and we're in a bit of a silent mode right now. Well, if you look at the technological side, there is no silent mode there; it's just continuing. So it's high time that we actually have either a GGE within the CCW to discuss practical measures to be taken,
00:30:56
Speaker
or that we have more debate in the Human Rights Council, or perhaps a debate within the UN General Assembly. So yes, it's a bit of slow timing at the moment when it comes to killer robots in the UN. Yeah, and I mean, Miriam and I have both worked around the CCW for quite a long period of time. And I think it's probably also important to recognise that the CCW works on an interpretation of consensus, which means that
00:31:24
Speaker
even if it is actually having the meetings that it has agreed to have, which at present it is not, within those meetings more or less any state can block the adoption of an agenda for more developed future work. So you're always rather at the mercy of those states that wish to progress at the slowest pace. So, I mean, it feels in this situation, being frank, that discussions will, I think, continue in the CCW, but
00:31:50
Speaker
really progressive work would benefit from the convening of some meetings amongst states who are interested in thinking constructively about responses to the concerns that have been raised about autonomous weapons: perhaps a mixture of states that are interested in taking a progressive position, international organizations who've been expressing concerns about this, and civil society. And if some meetings could be brought together of those actors,
00:32:18
Speaker
under some aegis or other, it doesn't need to be within a formal process. But at least that sort of meeting would allow for an exchange of views, working towards a sort of constructive way forward that could start to build alliances and build relationships and partnerships that in all of our previous work on weapons prohibitions have been fundamental. So in a way, I think the development of that kind of forum for discussion, not to say it has to be a set of meetings that develop their own
00:32:46
Speaker
treaty outside of the CCW, which of course has happened on other issues, but to start with just some freestanding meetings that convene progressively minded actors and start to develop a more constructive and forward-looking mode of work. I think that would at least help us, when we come back into more detailed CCW discussions, start to get a bit of momentum amongst a community in that context that could shape the political landscape.
00:33:14
Speaker
I guess, from my side, that would seem to be the most productive next step that could be taken: in parallel to the CCW, to have a sort of progressive meeting of states, international organizations and civil society, and start to think in real practical terms about how responses to this issue could be formulated. And maybe my impatience on this issue made me sound a bit negative, but that doesn't mean that there's no progress. I mean, we have
00:33:43
Speaker
by now 19 states who have called for a ban. We have more than 70 states within the CCW framework discussing this issue, and some of them came up with national policy. So certain things have happened. And we know also from other treaties that you need these kinds of building blocks. I mean, you need an international coalition, as we have right now with more and more campaigns worldwide. You need actors like the ICRC who took this issue on board.
00:34:12
Speaker
You need a couple of states who actually want to take the issue forward. We're not there yet, but we see some progress there. But you also need, and I think that's very welcome when it comes to the killer robots debate, active citizens and active academics. So I'm really pleased with the work of the Future of Life Institute, for example, and also the work of other academics on this issue. And I think that's also perhaps
00:34:41
Speaker
a little bit different than with some of the other disarmament campaigns: we need the academics, and we need people from various businesses and companies to be involved as well. And all of these building blocks somehow, and hopefully sooner rather than later, need to end up in a treaty. And a treaty which is also relevant, or perhaps even several treaties, because it should be relevant both for human rights law as well as for international humanitarian law,
00:35:09
Speaker
because we're also speaking about systems that can be used in policing. So it's a bit of a gap that we feel right now, or at least we're disappointed that the August Group of Governmental Experts meeting was cancelled, but it doesn't mean that there is no progress. I just wanted to jump in on Miriam's comments, where she name-checked the Future of Life Institute in relation to this,
00:35:34
Speaker
just thinking about this sort of community-building function of engaging scientists and roboticists and AI practitioners around these themes. One of the challenges is that the issues around weapons and conflict can sometimes be
00:35:47
Speaker
treated as very separated off from other parts of society. But I think it is significant that the sorts of decisions that get made about the limits, essentially, of AI-driven decision-making about life and death in the context of weapons could well have implications in the future regarding how expectations and discussions get set elsewhere.
00:36:07
Speaker
It seems like the sorts of moral challenges that are being debated in that context around weapons and violence do need the engagement of people who may be concerned about these questions as they will present themselves in other parts of society in the future, because some of the same sorts of moral questions will no doubt come to the fore in other contexts, around, I don't know, health care or elsewhere. And as a society, we probably need to keep that overview of how these debates are playing out in different sections. And
00:36:34
Speaker
one of the things that I think we've seen from that engagement of a wider AI practitioner and scientific community in this issue around weapon systems is also a sort of useful bridge-building exercise across these different silos (I only use that word because it's the word that gets used), across these different social and practitioner demarcations. We are certainly trying to do what we can to get more people involved and active.
00:37:01
Speaker
And along those lines, regarding both autonomous weapons systems and nuclear weapons, what is it that you think people don't fully appreciate that would be helpful for them to understand? I guess in general, what do you think is most important for people to understand about nuclear weapons and about autonomous weapons systems? So for me, what is important is that people start to realize, and I think we're slowly getting there actually, that we're speaking about collective security and about global security.
00:37:30
Speaker
And so it's no longer a question, if it has ever been a question, whether a state that has, let's say, a nuclear weapon makes its citizens more safe. It's the contrary. I mean, these weapons, be it nuclear weapons or be it autonomous weapon systems, will have a global impact. And my security is related to your security. So weapon systems that are used in one part of the world will have an impact not only on other parts of the world, but also
00:38:00
Speaker
on me. So for me, both systems in that sense go way beyond a discussion about weapon systems; it's about what kind of world and society we want to live in. And none of these, not killer robots, not nuclear weapons, are an answer to any of the threats that we face right now, be it climate change or be it, if you want to use that frame, terrorism. So it's not an answer. It's only adding
00:38:28
Speaker
more fuel to an already dangerous world. Yeah, I think I feel very similarly, really. I mean, on nuclear weapons, it seems like over the last couple of decades, they've somehow become a very abstract, rather distant issue. And I think on that side, simple recognition of the real scale of humanitarian harm that the use of a nuclear weapon would cause is the most substantial thing. The potential for hundreds of thousands killed and injured, a sort of overwhelming effect that
00:38:58
Speaker
can't really adequately be responded to in any humanitarian terms, at a minimum. I mean, that's simply from one nuclear weapon, and obviously there's a long-term effect in terms of contamination. And that is what's really being talked about when, in our country, the UK, our prime minister asserts confidently that they would be confident about pushing the button to fire nuclear weapons: they're essentially talking about incinerating hundreds of thousands of normal people,
00:39:27
Speaker
probably in a foreign country, we presume, right? But still, ultimately, recognizable, normal people. And the idea that that can be approached in some way glibly, or confidently at all, is, I think, very disturbing, especially in a context where there have been numerous near misses and accidents and incidents, nuclear weapons falling off planes and the like. And if we think that we can actually have a sustainable sense of security based on a framework of threatening each other
00:39:55
Speaker
with that level of harm, and expecting that at no point will something go wrong, then I think it's a complete illusion, essentially. And then, really, on autonomous weapons, in line with Miriam's point, I think it is about what sort of society we want to live in and how much we are prepared to hand over to computers and machines, and, set against that, how much we are prepared to take on ourselves in terms of taking responsibility for
00:40:24
Speaker
engagement in our societies, and sort of political engagement, and organizing our relationships with each other as people in a way that continues to develop our society positively, rather than us simply becoming the recipients of processes handed out by algorithmic systems. And I think handing more and more violence over to such processes does not augur well for our societal development.
00:40:49
Speaker
Excellent. Well, thank you both so much for joining me today. It's been really informative and I have enjoyed talking with you both. Thanks, Ariel. Thank you, Ariel. To learn more, visit futureoflife.org.