Can AI In Warfare Be Governed Like Nuclear Weapons?

In WIRED’s War Machine livestream, AI Lab newsletter writer Will Knight explores what can be done to limit lethal decision making by AI in global conflict. Spoiler alert: you might not find his answer wholly reassuring.

Released on 03/27/2026

Transcript

I wouldn't hold out much hope for that.

You know, there are discussions at the UN,

where there are proposals for such a prohibition

on lethal autonomous systems.

that is, systems that take a person out of the loop

when making a decision to take a life.

But the US has not signed up for that,

and the reality is, I think,

as is often the case at the moment at least,

that absent more pushback from the public

and other quarters, the US believes

it needs to develop these,

or at least have the ability to explore those sorts

of technologies, and other countries do as well.

The fact that AI is a lot more accessible,

and that the technologies required for greater autonomy

are too, does make it much more likely

that you'll see a lot more nations develop

and deploy those systems.