Episode Summary

This episode explores how time and space should be considered as means for maintaining meaningful human control. The key questions that this episode answers are:

1) Which systems are considered unacceptable and would need to be prohibited?

2) How is meaningful human control ensured over the systems that do not need to be prohibited?

Episode Notes

This episode was recorded in March 2021.

Transcript

Uldduz Sohrabi: Hi and welcome to the Article 36 Autonomous Weapons Podcast series, where we raise a critical voice on weapons. My name is Uldduz Sohrabi and I’m your host in this series. With me in today’s episode is the managing director at Article 36, Richard Moyes, along with one of our main advisors, Elizabeth Minor.

[00:00:41] If you are new to Article 36 and you would like to get to know our team better, you can find out more about us and the work we do on our website ARTICLE36.ORG. Now, our goal through this podcast series is to bring a better understanding of the concerns surrounding autonomous weapons systems and to explain the position of Article 36 and its policy framework.

[00:01:06] So far in these discussions, we took a broad view of the subject matter, looking at weapons that use sensors to determine when and where to apply force. The key questions that we want to answer are first, what systems do we consider unacceptable that would need to be prohibited? And second, how do we ensure meaningful human control over the systems that do not need to be prohibited? And we’re going to talk in particular about the importance of limiting the location and duration over which a system can function.

[00:01:45] Richard, could you start by explaining why it’s important to think about the time and space in which a system functions as a means for understanding human control?

Richard Moyes: [00:01:56] Thanks Uldduz. I think for us, the starting point here is systems that use sensors, where sensor inputs are subject to a sort of machine calculation and then force is applied, and where a human user has not set the specific location of where and when that force will occur. So we’re talking about systems that are basically using sensors to determine where and when force will be applied.

[00:02:18] Now within that space, we think that certain systems are unacceptable: we’re opposed to systems that use sensors to target people in this way. And we think there should be a prohibition on systems that simply can’t be effectively controlled because they can’t be effectively understood, and the like.

[00:02:32] However, that does then leave a set of technologies, including some weapons that we already have today, that function through this basic mode but which people do not consider to be unacceptable per se. We don’t think you could say they’re just immoral or straightforwardly unacceptable and therefore should be prohibited. But they still raise concerns, because when you have this envelope of time and space, you have some uncertainty for the user about specifically where and specifically when force is going to occur. And that uncertainty only grows if a system is allowed to function over a longer period of time or over a wider area of space. The longer the time and the wider the space, the less straightforward it is for the user of that system to really understand that context, to have meaningful information about that context. The context can change as a result of circumstances changing over time.

I would almost go so far as to say it’s a fact that greater time and greater space of functioning make it less predictable specifically what’s going to happen. So we really need to find ways to apply some pressure on those parameters.

[00:03:40] They’re not the only parameters required to exert control, but we need to apply some pressure on those parameters in order to make sure that the users of systems are making meaningful judgements about the context in which those systems are going to be used. And these time and space parameters are really key characteristics for framing that.

Elizabeth Minor: [00:04:00] Yeah. And time and space, I think, are important for us here in the context of the use of the systems, right? So not in the design of technologies, but when they’re actually being used by a commander and, to our minds, in a specific sort of legal concept of an individual attack. I think these factors of time and space are already important and used in thinking about making legal judgments in the deployment of weapon systems, and thinking about the time and space over which a system will operate, and the uncertainties that will be generated, is part of making effective legal and ethical decisions in the use of weapon systems.

Uldduz Sohrabi: [00:04:41] So we already have certain systems that use sensors to apply force in the way that we’re talking about here. How do these examples illustrate some of the issues, perhaps in terms of more control or less control?

Richard Moyes: [00:04:54] It is a challenge for systems that work in this way to ensure that the boundaries of space and time they are used within are controlled effectively. When we look at landmines, going back to one of the most basic systems that uses sensors to determine at least when to apply force, we have clearly seen significant problems because of the fact that there is an open-ended potential duration of system functioning for many of those weapons.

[00:05:22] And therefore they’ve carried on functioning after wars have finished, after peace agreements have been signed; the weapon system’s duration in time carries on and it remains active, and that has continued to cause casualties subsequently. We also see missile defense systems that are used on board ships and in other locations, which use radar signatures to identify incoming missiles. And I think we would say that with those systems there is a capacity to control the spatial area of their use and to control the duration of their use in a way that can allow for meaningful human control. The commander can know that they are focusing the system on this particular area of sky, for example.

[00:06:04] And they can say, we’re going to turn it on now and we’re going to monitor the situation to ensure that there’s no civilian air traffic coming into the area, and we can turn the system off if we detect any kind of risk to our assumptions here. So those systems generally use that kind of human control over their spatial focus and their duration of functioning to manage risks, to make sure they’re being used effectively.

[00:06:30] Finally, we see artillery systems functioning today that use sensors to try to detect, say, armoured fighting vehicles, tanks, these kinds of things. Those weapon systems don’t tend to function for a very long duration: they don’t loiter for very long over a target area, perhaps only a matter of seconds, really.

[00:06:48] And they don’t cover a huge spatial area. So they potentially allow a commander to know that this sensing function, this target-detecting function that’s going to determine specifically where force is applied, is only going to occur within a relatively bounded area and for a relatively short period of time.

[00:07:07] And so the user of the weapon can be making a judgment about the risks that arise from that functioning. I think you could say that if that same system were able to function over a much wider area and over a longer period of time, the very same technical sensing function might suddenly become uncontrollable or uncontrolled, because there would be all sorts of other things that might trigger its activation, trigger false positives and the like. The armoured fighting vehicles might even be next to civilian objects that you couldn’t foresee, but that you would have been able to foresee if you were focused on a smaller area.

[00:07:42] So this is just to say that systems that, in their technical functioning of sensors and calculations and target profiling, might not be unacceptable to us per se could still become distinctly problematic if expanded over a longer period of time and over a wider area.

Elizabeth Minor: [00:08:00] Yeah. There are quite a lot of systems that exist at the moment which have this process of functioning that we’re talking about, and which should be within our scope of regulation on autonomy in weapon systems.

[00:08:10] So, systems that, after they’ve been activated or put in place by a human user, then process data that comes into their sensors in order to apply force to particular target objects. And I suppose with systems that are being developed now, and systems that are already being used at the moment, as Richard said, the militaries who have them, such as missile defense systems, for example, already have a lot of practice around maintaining control over these systems so they don’t have unintended effects: only using them for very short periods of time, in a particular area of the sky where they’re not expecting other unintended objects to be coming into the remit of the system.

[00:08:52] So I feel that there’s already quite a lot of understanding in this area amongst states that control of time and space is important, and you also see this in the international policy debate; that’s an area where there’s quite a lot of common ground, that these are some of the major elements of maintaining human control. And actually this is something that already exists to some extent in military practice.

Uldduz Sohrabi: [00:09:14] If we look at this from a more practical perspective, what would sufficient control of a system in time and space look like?

Richard Moyes: [00:09:23] I think that is a good question. It draws us a little bit towards thinking about how you might make rules in this kind of area.

[00:09:29] The first thing to say is that we don’t think that controlling location and duration are the only factors needed to maintain control. We do think they are critical factors, but they’re not the only factors. The user of a system also needs to understand how it works.

[00:09:42] They need to understand that to a sufficient degree. So there’s a whole set of criteria around understandability and explicability of systems that also need to be met in order to allow for control. But when we do start to look at control of the duration and the space of system functioning, I think we would recognize that different environments and different systems do present different sorts of challenges. And I think we can’t just straightforwardly say there’s a one-size-fits-all solution to what is the correct duration or what is the correct spatial area. So we need to recognize that we’re going to need to allow some flexibility in relation to these characteristics.

[00:10:18] In a way, both spatial area and duration are a little bit of a proxy for understandability of the context in which a system is being used. The narrower the time and space of use, the more understandable that context of use is. The bigger it gets, the less understandable and the less predictable it gets.

[00:10:38] Insofar as we already have existing legal rules that need to be met regarding the evaluation of risk to civilians and other factors, I feel like the spatial area and the duration of a system’s functioning must be contained, limited or managed sufficiently to allow the human user of that system to apply their legal judgments in a meaningful way.

[00:11:01] So there’s a sort of dynamic relationship between controlling these factors and being able to make meaningful legal judgements, and maybe the kind of rules we need to emphasize here are going to pull back to those requirements: to make sure that commanders or states are taking on obligations to limit the duration and the spatial area of system use sufficiently to apply existing legal rules in an effective way.

Elizabeth Minor: [00:11:29] And a lot of the problems that we see with using sensor-based weapons, or the possibility of using sensor-based weapons, over a greater duration or larger areas or different types of areas comes down to this fundamental problem of uncertainty with these weapons: uncertainty over when and where force will be applied and to what, and the increased risk that this generates to civilians, but also generally for unintended or unacceptable outcomes happening as a result of using these weapons. So limiting time and space in those regards is very important to manage. For example, we’re thinking about larger areas or different types of areas where there might be more civilians or civilian objects present, or a longer time over which people may come in and out of an area, or objects may come in and out of an area where a weapon system is operating, and therefore be hit in an unintended way and at a great distance from the legal judgment that has been made about what a commander actually wants to do with the system in terms of the targets that they are going for.

Richard Moyes: [00:12:35] I think, again, those points emphasize the sense that for us this is one of those areas where we need these kinds of obligations in order to prevent the existing law from becoming opened up and loosened in a way that it starts to be interpreted more loosely, placing greater and greater emphasis on machine functioning and deemphasizing the role of the human, who is actually the legal subject under the law. The granularity and the specificity of human legal judgements are fundamental, for us, to making the law really mean something in practice, because the law itself doesn’t explain how long a system can function for or how wide an area it can function over.

[00:13:17] The law doesn’t tell us the answers here. So we need to make it clear that these specific technical factors have a bearing on how the law can be effectively implemented, and we need to be pulling towards shorter durations and smaller areas of space in order to keep the human legal engagement at an adequate level.

Uldduz Sohrabi: [00:13:37] Something that springs to mind when I hear the concepts of time and space is that there are two different understandings of these elements, as they can be interpreted differently. Perhaps this is something you can clarify. If we take space to start with, I understand that there’s a discussion of where a system is being used, but another interpretation of space that comes to mind is whether a system is static or moving.

[00:14:03] In terms of understanding the element of space, is there a separation or a way of distinguishing between the two understandings, or are we looking at this collectively?

Richard Moyes: [00:14:17] Yeah, I think that’s a good question. I do think that space and time here are themselves just a proxy for understandability and complexity.

[00:14:25] So if a system is static and is not moving around, that is probably going to make the environment more understandable and is, in a way, a kind of limitation on the spatial aspects of that system’s operation, a little bit in the same way that if you can turn a system off straightforwardly, or if it has a fixed duration of operation, then that serves to limit the period of time over which it’s operating.

[00:14:51] I think we need to recognize that those are ways of achieving some control over the space and the time elements. In terms of whether they can be problematic separately, I think yes, because I think they’re both actually proxies for the same thing.

[00:15:03] Anti-personnel landmines generally stay static. Sometimes they get washed around by floods, but generally they are intended to be static. But their long duration of operation is a particular driver of problems and driver of risk. So in that case, you could say it’s the duration first and foremost that’s problematic.

[00:15:22] It’s a little bit more complicated than that, maybe, when you think of landmines, because actually losing track of where they are has also been a major problem in that context. So actually, although they’re static and it turns out they didn’t move, we still lost track of the spatial component. The users didn’t keep control of the spatial component effectively, and so actually we ended up with a spatial problem as well. So yeah, these things can flow together, in that example.

Uldduz Sohrabi: [00:15:50] Sure, I see how they relate. Could you clarify what we mean by time, then, in terms of using time itself to gain human control?

[00:16:00] For example, I understand that you say that when we operate a system autonomously for a longer period, we would see less human control. But what about if the duration is so short, for example if a system operates so rapidly, that a human user would lose control over it because of the very fast decision or execution? How do we grapple with this?

Richard Moyes: [00:16:27] There’s perhaps a slightly different set of dynamics there, because I think those issues of how much time and how much information a human user is operating on the basis of can also sit outside of the mechanics of the system functioning itself.

[00:16:45] But in general we want to be working towards situations where a human user has a sufficient contextual understanding to make a meaningful judgment. And if either the space or the time of a system’s functioning is so broad as to render that impossible, then I think that creates a problem. But in general, if a system’s user is operating on the basis of information that is simply insufficient to make an informed judgment, that is also a problem. Whether that would fall straightforwardly under our rule set here, I’m not sure. I think that’s an interesting boundary issue.

Elizabeth Minor: [00:17:25] Yeah. And the problems created by high-speed systems are one of these issues in the general area of problems with increasing autonomy in weapons systems, or the problems that we’re talking about, which have to do with controllability in terms of understanding what a system will do. And I suppose with the high-speed systems that already exist, such as missile defense systems, for example, the risks created by their high-speed mode of operation are in part managed by only activating them for a short period of time in a very specific context and environment.

[00:18:00] So it intersects with these issues, but in a way it is a slightly different element of the kind of time that is important.

Richard Moyes: [00:18:07] Of course, those sorts of systems have also caused problems, right? They’ve shot down civilian airliners and there have been significant problems as a result. I feel like this is definitely a significant question: the information that people are making decisions on the basis of.

[00:18:21] It may be something where we need broad overarching principles about the need for people to make sufficiently informed, contextually informed decisions in their legal judgments. We may not be able to, again, specify the distinct obligations for how much information a person needs and how long they need to have that information for. But I think our legal instruments should certainly be articulating a sense that much of what we’re trying to capture here is that humans have to be substantively engaged, intellectually and in their understanding of the context, if the sorts of legal decisions they’re making are going to be meaningful, and if we’re going to retain the idea of the law as a meaningful structure of human evaluation and moral application.

Uldduz Sohrabi: [00:19:09] Richard, Elizabeth, thank you both for explaining the concepts and elements of time and space as means for more meaningful human control.

[00:19:20] How does Article 36 propose that we regulate this, and what is in discussion so far with regard to this element of concern at an international level?

Elizabeth Minor: [00:19:31] In terms of the engagement with these issues at the international level so far, I think it’s generally accepted in principle amongst a lot of states that controlling time and space is an important element of human control over weapon systems in general. But we haven’t moved yet to formulating rules in this area, of course. We think there’s quite a good engagement with these lines of thinking at the international level so far, but we need specific legal obligations, we would say, to ensure that these factors are controlled, alongside the broader structure of regulation that we want to see: prohibitions on certain systems that are fundamentally unacceptable, that can’t be controlled or that target people, as we talk about in other podcasts. We need these positive obligations for maintaining meaningful human control. So, as Richard said, those aren’t things that can be boiled down to one-size-fits-all simple rules.

[00:20:30] But I think they need to be talked about at the level of principle, thinking about the things that need to be considered: the time that a sensor-based weapon system is deployed for, within which it might apply force to something; the proximity of the force application to a legal judgment, which affects the continuing validity of the legal judgment that was made at the time of release of the system; the size of the area, which we talked about as relevant in terms of risk to civilians and other unintended objects; and also the type of area in which a system is deployed, whether it’s populated or what other kind of environment it is.

[00:21:12] A lot of this is important in thinking about time and space. I feel there are also potential pitfalls in this area. If we’re thinking about the type of area in which systems are deployed, you already see in the international debate the suggestion that perhaps we could draw lines to say, for example, that it would be okay to use autonomous weapons in areas that are seen as open, like the sea or a desert, or spaces like that. But as I think we’ve seen before, in actual practice these aren’t empty areas in which you can have a free-for-all of military engagement and a battlefield, and we shouldn’t think about them in these ways.

[00:21:51] And these are also, I think, often deployed as slightly theoretical examples in order to permit practice in other areas and the development of weapon systems which might be problematic, by invoking these kinds of examples. But it is also certainly the case that there’s more risk to civilians if certain weapons are used in populated areas, as opposed to areas where there are not high levels of human settlement.

[00:22:14] So I think some of these issues will be important in the policy debate going forward.

Richard Moyes: [00:22:21] Just building on what Elizabeth said, these issues also bring out why we think the sort of policy structure that we’re promoting is necessary, because we recognize that there are certain systems that we think should be prohibited: systems targeting people, and systems that can’t be effectively controlled because they’re not effectively understandable,

[00:22:39] and the like. But there are already systems today that function using sensors within a certain area of time and space, and if that was allowed to just expand and get wider and longer, then we would effectively be losing control and failing to ensure control at an adequate level.

[00:22:57] Now, those systems are not perhaps unacceptable per se. So it’s only this question of controlling the duration and controlling the spatial area of their functioning that actually allows them to be used in an appropriate way. And we need to hold lines and draw some boundaries against that expansion.

[00:23:16] Otherwise, just relatively mundane existing technologies could spread into this space. It’s a reason why we shouldn’t just focus on the idea of extremely complex autonomous systems. We need to recognize this basic dynamic of sensor input, machine calculation, and force application without a human setting specifically where and when force applies. That’s the starting point, and these issues around space and time, I think, really explain why we need that to be the starting point for the discussion.

Uldduz Sohrabi: [00:23:45] And that’s all from us at Article 36 for this episode on control of time and space in our autonomous weapons series. We hope you found this discussion useful. Thank you for listening and goodbye.