How AI Will Revolutionize Warfare


foreignpolicy.com/2023/04/11/ai-arms-race-artificial-intelligence-chatgpt-military-technology

Analysis

The new arms race in technology has no rules and few guardrails.

By Michael Hirsh, a columnist for Foreign Policy.

[Illustration: a tank firing rows of binary code to represent digital warfare. Foreign Policy illustration]

April 11, 2023, 10:09 AM


When it comes to advanced artificial intelligence, much of the debate
has focused on whether white-collar workers are now facing the sort of
extinction-level threat that the working class once did with robotics.
And while it’s suddenly likely that AI will be capable of duplicating a
good part of what lawyers, accountants, teachers, programmers, and—
yes—journalists do, that’s not even where the most significant
revolution is likely to occur.

The latest AI—known as generative pre-trained transformers (GPT)—promises to utterly transform the geopolitics of war and deterrence. It will do so in ways that are not necessarily comforting, and which may even turn existential.


On one hand, this technology could make war less lethal and possibly strengthen deterrence. Dramatically expanding the role of AI-directed drones in air forces, navies, and armies could spare human lives. Already, the U.S. Defense Department is experimenting with AI bots that can fly a modified F-16 fighter jet, and Russia has been testing autonomous tank-like vehicles. China is rushing to roll out its own AI-run systems, and the effectiveness of armed drones will also take off in coming years. One of the largest, although still nascent, efforts to advance AI is a secretive U.S. Air Force program, Next Generation Air Dominance, under which some 1,000 drone “wingmen,” called collaborative combat aircraft, would operate alongside 200 piloted planes.

“I can easily imagine a future in which drones outnumber people in the armed forces pretty considerably,” said Douglas Shaw, senior advisor at
the Nuclear Threat Initiative. According to retired U.S. Air Force Gen.
Charles Wald, “That’ll be a force multiplier. One of the biggest
problems right now is recruiting.”

On the other hand, AI-driven software could lead the major powers to
cut down their decision-making window to minutes instead of hours or
days. They could come to depend far too much on AI strategic and
tactical assessments, even when it comes to nuclear war. The danger,
said Herbert Lin of Stanford University, is that decision-makers could gradually come to rely on the new AI as part of the command and control of weaponry, since it operates at vastly greater speeds than people can.

In a book published this year, AI and the Bomb, James Johnson of the
University of Aberdeen imagines an accidental nuclear war in the East
China Sea in 2025 precipitated by AI-driven intelligence on both the
U.S. and Chinese sides, and “turbo-charged by AI-enabled bots,
deepfakes, and false-flag operations.”


“The real problem is how little it takes to convince people that something is sentient, when all GPT amounts to is a sophisticated auto-complete,” said Lin, a cybersecurity expert who serves on the Science and Security Board of the Bulletin of the Atomic Scientists. Given AI’s propensity for hyperbole, Lin said, “when people start to believe that
machines are thinking, they’re more likely to do crazy things.”

In a report published in early February, the Arms Control Association said AI and other new technologies, such as hypersonic missiles, could
result in “blurring the distinction between a conventional and nuclear
attack.” The report said that the scramble to “exploit emerging
technologies for military use has accelerated at a much faster pace than
efforts to assess the dangers they pose and to establish limits on their
use. It is essential, then, to slow the pace of weaponizing these
technologies, to carefully weigh the risks in doing so, and to adopt
meaningful restraints on their military use.”

U.S. officials have said they are doing so, but they may be navigating a
slippery slope. This January, the Defense Department updated its
directive on weapons systems involving the use of artificial intelligence,
saying that at least some human judgment must be used in developing
and deploying autonomous weapon systems. At the same time,
however, the Pentagon is experimenting with AI to integrate decision-
making from all service branches and multiple combatant commands.
And with the Biden administration cracking down on high-tech exports
to China, especially advanced semiconductors, in order to maintain the
current U.S. lead in AI, the Pentagon is likely to accelerate those efforts.

Wald said, “I do think that AI will help with target prioritization. This
could prove useful in the strategy against China, which owns a home
field advantage over the U.S. in bridging the vast distances in the
Pacific that could interfere with a coordinated response to an attack” on
Taiwan.


In a 2019 speech, Lt. Gen. Jack Shanahan, the former director of the
Pentagon’s Joint Artificial Intelligence Center, said that while the
Defense Department was eagerly pursuing “integration of AI
capabilities,” this would definitely not include nuclear command and
control. Shanahan added that he could imagine a role for AI in
determining how to use lethal force—once a human decision is made.
“I’m not going to go straight to ‘lethal autonomous weapons systems,’”
he said, “but I do want to say we will use artificial intelligence in our
weapons systems … to give us a competitive advantage. It’s to save lives
and help deter war from happening in the first place.”

The question is whether the Chinese and Russians, along with other
third parties, will follow the same rules as Washington.

“I don’t believe the U.S. is going to go down the path of allowing things … where you don’t have human control,” Wald said. “But I’m not sure somebody else might not do that. In the wrong hands, with the wrong … I think the biggest concern would be allowing this machine or entity too much latitude.”

Another concern is that advanced AI technology could allow rogue actors such as terrorists to gain the knowledge to build dirty bombs or
other lethal devices. And AI is now shared by far more actors than
during the Cold War, meaning that it could be used to detect nuclear
arms sites, reducing the deterrent effect of keeping their locations
secret. “AI will change the dynamic of hiding and finding things,” said
Shaw, who noted that much of the data today is held by private
companies that might be vulnerable to AI-driven espionage and
probing of weapons systems.

What seems clear is that a new AI arms race is underway, and there is probably little that can be done to stop it. In an open letter in late March, more than 2,000 technology leaders and researchers—including Elon Musk and Steve Wozniak—called on labs worldwide to pause in training up the newest digital intelligence models because of cascading fears that they could lead to disaster for the human race.

Their warning of an existential threat to society may be somewhat exaggerated; the letter was criticized by some AI experts as
“fearmongering” over the illusion that these models can ever be
sentient, as in sci-fi movies like The Terminator. “It’s not Skynet,” said
Lin, referring to the AI villain that destroyed human civilization in the
famous 1984 movie. “But it could lead to human error based on
overdependence. It’s a precision-guided munition against human
intelligence and human rationality.”

Moreover, the idea that governments are going to sanction a delay for
safety’s sake is unlikely in the extreme. This is not only because the
world’s biggest tech companies are engaged in vicious competition,
especially in Silicon Valley, but also because the new technology is
being rolled out in an international environment in which the U.S.,
China, and Russia are now embroiled in a grim struggle for dominance.

“It’s important to remember that the enemy gets a vote,” said retired
Air Force Lt. Gen. David Deptula. “Even if we stopped autonomy
research and other military AI development, the Chinese and to a lesser
degree Russians will certainly continue their own AI research. Both
countries have shown little interest in pursuing future arms control
agreements.”

The open letter was only the latest evidence of what can only be called a
widespread panic since ChatGPT appeared on the scene late last fall
and major tech companies scrambled to introduce their own AI systems
with so-called human-competitive intelligence. The issues at stake, the
letter said, were fundamental to human civilization: “Should we let
machines flood our information channels with propaganda and untruth? Should we automate away all the jobs, including the fulfilling
ones? Should we develop nonhuman minds that might eventually
outnumber, outsmart, obsolete and replace us? Should we risk loss of
control of our civilization?”

But the key line was this one: If the companies that fund AI labs don’t
agree to such a pause, then “governments should step in and institute a
moratorium.”

As far back as 2017, though, Russian President Vladimir Putin declared that “the one who becomes the leader in this sphere [AI] will be the
ruler of the world,” and that future wars will be decided “when one
party’s drones are destroyed by drones of another.”

The biggest problem, said Shaw, of the Nuclear Threat Initiative, is that
“we’re not really having a conversation about what’s next.”

Many others agree. DARPA—the Pentagon’s Defense Advanced Research Projects Agency—is conducting a wide-ranging research
program called AI Forward. DARPA spokesman Matt Turek said
recently that generative AI systems such as ChatGPT have raised
serious new issues about “the level of reliability that we probably need
… for life or death decision-making.”

Michael Hirsh is a columnist for Foreign Policy. He is the author of
two books: Capital Offense: How Washington’s Wise Men Turned
America’s Future Over to Wall Street and At War With Ourselves:
Why America Is Squandering Its Chance to Build a Better World.
Twitter: @michaelphirsh


