Artificial Intelligence in Weapon Systems Committee Report - Motion to Take Note

Part of the debate – in the House of Lords at 1:46 pm on 19 April 2024.


Earl Attlee (Conservative) 1:46, 19 April 2024

My Lords, I am grateful to the noble Lord, Lord Lisvane, for introducing this debate.

I read the report as soon as it was published. I agree with it and with the position of HMG and the MoD. However, looking around the corner, I see that reality may conflict with what the report says. Its title is of course very appropriate—although we might wonder how we got it. I am relaxed about the MoD’s reluctance to define AWS. A definition carries the danger of excluding certain unanticipated developments.

It may be helpful to the House if I illustrate a potential difficulty with a fully autonomous system, to show why we should not willingly go in this direction. Suppose His Majesty’s Armed Forces are engaged in a high-intensity conflict and an officer is in control of a drone system. He reads his intelligence summary—INTSUM—which indicates fragility in the cohesiveness of enemy forces. The officer controls the final decision for the drone to engage any target, in accordance with our current policy. The drone detects an enemy armoured battalion group but the AFVs are tightly parked in a square in the open, not camouflaged, and the personnel are a few hundred metres away, sitting around campfires. In view of the INTSUM, it would be obvious to a competent officer that this unit has capitulated and should not be engaged for a variety of reasons, not least international humanitarian law. It is equally obvious that a drone with AI might not recognise that the enemy unit is not actively engaged in hostilities. In its own way, the report recognises these potential difficulties.

My concern centres on the current war in Ukraine. Both sides will be using electronic warfare to prevent their opponent being able to receive data from their own drones or give those drones direction. That is an obvious thing to do. But if you are in a war of survival—and the Ukrainians certainly are—and you have access to a drone system with AI that could autonomously identify, select and attack a target, absent any relevant treaty you would have to use that fully autonomous capability. If you do not, you will lose the war or suffer heavy casualties because your enemy has made your own drones ineffective by means of electronic warfare. So long as drones are being used in the current high-intensity conflict, we need to recognise that it will be almost impossible to prevent AI being used fully autonomously. Equally, it will be hard to negotiate a suitable treaty, even if we attach a very high priority to doing so.

The whole nature of land warfare is changing very rapidly—the noble Lord, Lord Lisvane, used the phrase “fast-moving”—and we do not know what the end state will be. However, we can try to influence it and anticipate where it will end up.