ROME – As autonomous weapons systems become an increasingly prominent feature of modern warfare, including in both Ukraine and Gaza, the Vatican is backing new restrictions on these AI-driven "lethal autonomous weapons," often known by the acronym "LAWS."
Italian Archbishop Ettore Balestrero, the Vatican's Permanent Observer to the United Nations in Geneva, argued for a moratorium on the development of weapons systems that can make firing decisions without human intervention — often dubbed "killer robots" — during an Aug. 26 address to a group of experts meeting in the Swiss city.
"It is profoundly distressing," Balestrero said, "that, adding to the suffering caused by armed conflicts, the battlefields are also becoming testing grounds for more and more sophisticated weapons." In particular, he insisted that autonomous weapons systems can never be considered "morally responsible entities."
"The human person, endowed with reason, possesses a unique capacity for moral judgement and ethical decision-making that cannot be replicated by any set of algorithms, no matter how complex," Balestrero said, according to a report in Vatican News, the Holy See's official news outlet.
"What the companies are creating is technology that makes human judgment about targeting and firing increasingly tangential," that report said. "The widespread availability of off-the-shelf devices, easy-to-design software, powerful automation algorithms and specialized artificial intelligence microchips has pushed a deadly innovation race into uncharted territory, fueling a potential new era of killer robots."
Even before the current fighting with Hamas began, Israel was employing an automated sentry gun system to monitor the border fence with Gaza, capable of identifying targets and suggesting action without human intervention. It also uses an AI system called "Habsora" to identify bombing targets inside Gaza, reportedly at a far higher rate than human analysts could achieve.
In that context, Balestrero made a distinction between a "choice" and a "decision," arguing that the latter is a human act that involves weighing ethical considerations such as human dignity.
He called for international agreements governing the use of LAWs, saying they're necessary "to ensure and safeguard a space for proper human control over the choices made by artificial intelligence programs: human dignity itself depends on it."
"No machine should ever make the decision to take a human life," Balestrero said.
When the inevitable SKYNET end comes, don't say we didn't warn you.