Friday, January 14, 2022

Human-Machine Teaming and Autonomous Lethal Weapons Systems

I've been doing a lot of reading - and thinking - lately about autonomous lethal weapons systems. I've never helped develop one, but certainly the skills required to do so are in my wheelhouse. I'm philosophically opposed to them; I'm a fan of Isaac Asimov's Three Laws of Robotics. Yet I also believe that autonomous weapons systems are inevitable, and probably necessary. Such systems - e.g. armed autonomous flying drones used in land or sea battles - can go where and do what humans cannot. Physics is ruthless.

If our adversaries use them, I don't see that we will have any choice but to use them as well to remain competitive on the battlefield. There's a strong economic (and possibly even humanitarian, if they can reduce human error and danger to civilian populations) incentive to use autonomous lethal weapons. They will be particularly attractive to smaller first-world states, or to any organization exploiting asymmetric warfare. Such systems may be fully autonomous, partially autonomous, or optionally autonomous. Combining a human operator with automation (a term I prefer to "artificial intelligence") is a form of human-machine teaming (HMT).

Lots of people in other domains are thinking about this. In its SAE J3016 standard, the Society of Automotive Engineers defines six levels of driving automation for vehicles, ranging from 0 (fully manual) to 5 (fully autonomous).


It occurs to me that a similar taxonomy might be applied to weapons systems. Here is one way the J3016 levels might be adapted (see the code sketch after the list).

  • 0 - the human operator has full control over the weapon system at all times.
  • 1 - automation may assist the human operator with targeting, stabilization, etc.
  • 2 - the human operator may relinquish control to the automation but can override its decisions.
  • 3 - the automation may take control if it detects the human operator is impaired.
  • 4 - the automation operates the weapon but a human operator is still required to approve a kill decision.
  • 5 - the automation makes the kill decision without any human guidance or approval.
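Being a programmer, I can't resist sketching what this taxonomy might look like in code. Below is a minimal sketch in C; the enumeration names and the kill_requires_human_approval() helper are purely hypothetical illustrations of the levels above, not part of SAE J3016 or any real weapons standard.

```c
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical encoding of the adapted J3016-style levels described above.
 * The names are my own invention, for illustration only. */
typedef enum WeaponAutomationLevel {
    WAL_0_MANUAL      = 0, /* human operator has full control at all times */
    WAL_1_ASSISTED    = 1, /* automation assists: targeting, stabilization, etc. */
    WAL_2_SUPERVISED  = 2, /* human may relinquish control but can override */
    WAL_3_CONDITIONAL = 3, /* automation may take over if the operator is impaired */
    WAL_4_APPROVAL    = 4, /* automation operates; human must approve a kill */
    WAL_5_AUTONOMOUS  = 5  /* automation makes the kill decision on its own */
} WeaponAutomationLevel;

/* Under this taxonomy, only level 5 removes the human from the kill
 * decision; every lower level keeps a person somewhere in the loop. */
static bool kill_requires_human_approval(WeaponAutomationLevel level)
{
    return level < WAL_5_AUTONOMOUS;
}

int main(void)
{
    for (int level = WAL_0_MANUAL; level <= WAL_5_AUTONOMOUS; ++level) {
        printf("level %d: human approval %s\n", level,
               kill_requires_human_approval(level) ? "required" : "not required");
    }
    return 0;
}
```

The helper makes the key property of the taxonomy explicit: the boundary between levels 4 and 5 is where the human leaves the decision loop entirely, which is exactly where the policy debate is fiercest.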

(H/T to John Stuckey for the inspiration for this.)

2 comments:

Fazal Majid said...

We already have autonomous lethal weapons: they are called landmines, and they have a terrible legacy of the dead and the maimed.

Chip Overclock said...

That is an excellent point. Thank you for bringing it up.