F-35 advanced military aircraft locking on target and firing missiles, 3D rendering. Photo by Digital Storm on Adobe Stock

Will AI or Fighter Pilots Win the 2021 Dogfight? Or Both?

The outcome of future warfare will be decided, not by AI alone, but by finding and optimizing the tradeoff between human and artificial intelligence

The US Air Force began with Billy Mitchell’s prophecy that air power would decide the next war. That prophecy was fulfilled, of course, when the B-29 Superfortress dropped the atomic bomb on Hiroshima. Ever since, the USAF has sought to adapt to the latest and most decisive forms of military technology. But the challenges have changed drastically.

Most recently, the USAF stood up its own cyber forces under US Cyber Command (USCYBERCOM), because a lone hacker can paralyze an entire nation’s infrastructure. It also gave birth to the new Space Force, since a well-placed orbital burst could take down the world’s satellite grid.

Now the USAF is delving into the world of autonomous drones, the next logical step from the remotely piloted drones (operated by a controller on the ground, not in the plane, as per the video below) that have taken over the skies.

Until now, drones have not been considered a complete replacement for in-plane pilots because of their slow reaction time. A fighter pilot must react in a split second in order to put the opponent on the defensive, thus “getting within the enemy’s OODA (observe, orient, decide, act) loop.” Remotely piloted drones cannot do that because the signal’s round trip between aircraft and ground station makes the remote pilot’s reaction too sluggish.

Computers, on the other hand, have a reaction time that is orders of magnitude faster than a human’s: their decision loops are measured in microseconds rather than the hundreds of milliseconds a human pilot needs. So, if computers can make intelligent enough decisions in flight, then an autonomous drone could plausibly defeat human fighter pilots.
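
To put numbers on that gap, here is a rough back-of-the-envelope sketch in Python. The latency figures are illustrative assumptions, not measured values; the point is only the orders-of-magnitude difference in how many decision cycles per second each kind of pilot gets.

    # Rough OODA-loop arithmetic with assumed, illustrative latencies.
    # The question: how many decision cycles per second does each "pilot" get?
    latencies_in_seconds = {
        "human pilot in the cockpit": 0.25,    # assumed ~250 ms perception-reaction time
        "remote drone operator": 0.25 + 1.0,   # same reaction time plus an assumed ~1 s control-link delay
        "onboard autonomous computer": 0.001,  # assumed ~1 ms sense-decide-act loop
    }

    for pilot, seconds_per_decision in latencies_in_seconds.items():
        cycles_per_second = 1.0 / seconds_per_decision
        print(f"{pilot}: roughly {cycles_per_second:,.0f} decision cycles per second")

On these assumed numbers, the onboard computer gets hundreds of times more decision cycles than the cockpit pilot and roughly a thousand times more than the remote operator, which is what “getting within the enemy’s OODA loop” means in practice.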

This thesis will be put to the test in the near future. Steven Rogers of the Air Force Research Lab has a team working on an autonomous fighter drone. The goal is to have a machine vs. man dogfight in July 2021.

He has led a fundamental research project known as QuEST (Qualia Exploitation of Sensing Technology), which applies the philosophical notion of “qualia” to artificial intelligence. Originally, it was an AI-only project aimed at giving computers some kind of consciousness, but it has since morphed into developing a set of decision-making criteria for knowing when to put a human in the loop with AI. Dr. Rogers has also successfully created and sold an AI-based startup that uses machine learning to detect breast cancer, built on the same trade-off principle.
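
A minimal sketch of what such a human-in-the-loop criterion might look like in practice, assuming a simple confidence-threshold rule (this is an illustration, not QuEST’s actual criteria):

    # Illustrative human-in-the-loop rule: the AI acts on its own only when its
    # confidence is high; otherwise the decision is deferred to a human operator.
    # The threshold value is an assumption for illustration.
    CONFIDENCE_THRESHOLD = 0.95

    def decide(ai_confidence: float, ai_action: str) -> str:
        """Say who makes the call for one decision."""
        if ai_confidence >= CONFIDENCE_THRESHOLD:
            return f"AI acts autonomously: {ai_action}"
        return f"Defer to human operator (AI confidence only {ai_confidence:.0%})"

    print(decide(0.99, "evade and reposition"))  # AI handles the clear-cut case
    print(decide(0.60, "engage target"))         # a human handles the uncertain one

Where to set that threshold, and for which kinds of decisions, is exactly the human-versus-machine trade-off at issue here.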

If Dr. Rogers’s autonomous (or perhaps semi-autonomous) drones succeed, they could usher in a new stage in warfare, where robotic weapons controlled by augmented human intelligence can outperform standard human-controlled weapons.

We saw such a revolution occur after Deep Blue defeated Kasparov. Kasparov held a subsequent competition pitting humans, computers, and human-computer hybrid players against each other. The surprising outcome was that neither chess experts nor supercomputers won. Instead, a team of amateurs who augmented their play with a chess engine defeated all other players.

In the same way, the outcome of future warfare will be decided, not by AI alone, but by finding and optimizing the trade-off between human and artificial intelligence.


Further reading:

The brain exceeds the most powerful computers in efficiency. Human thinking takes vastly less computational effort to arrive at the same conclusions. (Eric Holloway)

Why AI can’t win wars as if wars were chess games: Is Vladimir Putin right? Will whoever leads in AI rule the world? It’s not so simple. (Bradley A. Alaniz and Jed Macosko)

Why I doubt that AI can match the human mind: Computers are exclusively theorem generators, while humans appear to be axiom generators (Jonathan Bartlett)

and

Book at a Glance: Robert J. Marks’s Killer Robots


Eric Holloway

Senior Fellow, Walter Bradley Center for Natural & Artificial Intelligence
Eric Holloway is a Senior Fellow with the Walter Bradley Center for Natural & Artificial Intelligence, and holds a PhD in Electrical & Computer Engineering from Baylor University. A Captain in the United States Air Force, he served in the US and Afghanistan. He is the co-editor of Naturalism and Its Alternatives in Scientific Methodologies.
