Note: This article was updated for accuracy on Tuesday, Dec 17.
You’re running, but not fast enough. You can hear it behind you, and feel its cold infrared sensor burning into your back.
A salvo of bullets explodes into your back, each 7.62x51mm NATO round fired by a series of ones and zeros, without regard for weak, soft, human concerns like morals or pain.
Everything goes black as you see your assailant, a Page Industries Bravo-3 Peacebringer robot, strut past you, having won another game of cat and mouse.
While the mechanized threats of Deus Ex, RoboCop and Wolfenstein might seem like far-off science fiction, many argue they are already science fact, and have been for some time. That is why the International Committee of the Red Cross (ICRC) has branched out into game development with a Facebook chatbot project about Lethal Autonomous Weapons (LAWs), weapons systems with enough artificial intelligence to operate on their own. Real-world examples include the US Navy’s Sea Hunter, an unmanned anti-submarine vessel, and the Robotic Combat Vehicles (RCVs) being tested for the US Army.
“There’s already a lot of development with artificial intelligence being used in a lot of weapons,” said Nora Livet, the ICRC’s project manager behind the game. “Especially on the borders; there is one functioning in the border between the two Koreas.”
Livet’s game, which does not have an official name, revolves around a similar weapons platform calling itself “A.I.D.A, an Automated Infantry Defence Analysis Unit.”
The player’s interaction with the bot is almost entirely text-based; at each step, players choose between two dialogue options, which let them either gather information or make moral choices and statements.
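To picture how such a two-option chat flow can be structured, and how each answer can be logged for later analysis, here is a minimal sketch in Python. The node names, prompts and “info”/“moral” tags are invented for illustration; this is not the ICRC’s actual script or Spectrm’s API.

```python
# A hypothetical sketch of a two-option chatbot dialogue tree with answer logging.
from dataclasses import dataclass, field

@dataclass
class Choice:
    label: str       # text shown to the player
    kind: str        # "info" (gather information) or "moral" (ethical choice)
    next_node: str   # id of the node this choice leads to

@dataclass
class DialogueNode:
    node_id: str
    bot_line: str                                  # what the bot says
    choices: list = field(default_factory=list)    # always two options in this game

# A toy fragment of a branching script (invented dialogue).
SCRIPT = {
    "intro": DialogueNode(
        "intro",
        "I have identified a target. Shall I proceed?",
        [
            Choice("Tell me more about the target.", "info", "target_info"),
            Choice("No. I am taking back control.", "moral", "take_control"),
        ],
    ),
    "target_info": DialogueNode("target_info", "The target's status is uncertain."),
    "take_control": DialogueNode("take_control", "Control returned to the operator."),
}

def advance(node_id: str, pick: int, answers: list) -> str:
    """Record which of the two options the player picked and return the next node id."""
    node = SCRIPT[node_id]
    choice = node.choices[pick]
    answers.append({"node": node.node_id, "kind": choice.kind, "label": choice.label})
    return choice.next_node

if __name__ == "__main__":
    answers = []
    nxt = advance("intro", 1, answers)   # the player chooses to take back control
    print(SCRIPT[nxt].bot_line)
    print(answers)   # a per-question log, the kind of data used for aggregate stats
```

The per-question log at the end is the piece that matters for the research side of the project: each moral choice can be tallied across players to produce the aggregate figures discussed below.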
The ICRC found itself in the unusual role of game publisher when it reached out to Catch Digital, a digital marketing company based in London, to develop the project.
“We worked closely with them to develop the storyboard, but also my colleagues in the department [of the ICRC] that’s working on artificial intelligence and weapons,” said Livet.
The text-based interaction may remind some players of Twine, a leading engine for text games. However, the game is actually hosted on Spectrm, a platform built primarily as a marketing tool.
Livet said the bot has a 47 percent retention rate, meaning 47 percent of people who start the game see it to completion.
“And the idea behind this is, yes, to make people aware of this, to make people think about this ethical question [of artificially intelligent weapons],” said Livet. “But also, we are actually tracking how people answer. Right now, I can tell you actually how people answer the important questions that we have.”
Chief among these questions is whether the player is happy with AIDA doing its work, and whether they can stomach it targeting a civilian. Livet said about 60 percent of players choose to take back control to prevent a potential noncombatant from being harmed. That data is being given to the market research company Ipsos, which is building a narrative around it. Additionally, players are randomly selected to take surveys, both before and after playing the game, about their views on the ethics of artificially intelligent weapon systems. The ICRC will release a report at the end of December, once the data is analyzed.
Livet herself does not have much experience in game publishing, but she said the ICRC has worked with Bohemia Interactive before, on Arma 3’s Laws of War DLC. Proceeds from sales of the DLC raised $176,667 for the organization. The ICRC is working on similar projects, but they are in the very early stages.
Catch Digital has a large number of projects under its belt, like the Shorty Award-winning GoTBot, a chatbot that spews Game of Thrones trivia. However, this is the first it has done with the ICRC.
“We put together a few different scenarios and storyboards,” said Lily Bartholomew, Catch Digital’s project manager. “We try to use digital [media] in fun and creative ways, so it definitely appealed to us.”
Becca Bendelow, Catch Digital’s delivery lead, emphasized the importance of the game being fun and engaging, since that is what allows the ICRC to gather a large enough sample size for its research project.
“There’s actually no laws that govern autonomous weaponry at the moment,” Bendelow said. “There is no legal standpoint on who to blame, and who should be responsible for the programming and the kind of ‘shoot to kill’ messaging. There’s nothing that clearly states how far that autonomous weaponry should be allowed to go, and who’s responsible should something go wrong.”
If all goes as planned, the ICRC’s report should be done at the end of December, and may be publicly released as a white paper.