In a recent podcast, Walter Bradley Center director Robert J. Marks spoke with Robert D. Atkinson and Jackie Whisman at the prominent AI think tank, Information Technology and Innovation Foundation, about his recent book, The Case for Killer Robots—a plea for American military brass to see that AI is an inevitable part of modern defense strategies, to be managed rather than avoided. The book may be downloaded free here.
The discussion of the state of AI research begins at 17:48. (A portion of the transcript follows; the whole transcript is here. Notes and links follow below.)
Rob Atkinson: You mentioned earlier World War II and all the amazing innovations that stemmed from it: radar, and later on semiconductors. Intel, for example, initially sold most of its semiconductors, which were super expensive, to the Air Force for guided missiles. GPS, aviation, the Internet were really DOD-funded. And we then relied on great entrepreneurs and the commercial marketplace to take these innovations and move them into the economy. Today’s a little bit different. There used to be what was called spinoff: DOD was so important that it had technology it would spin off into the commercial sector.
Now with AI, a lot of the work is being done in the commercial sector, but when DOD funds these technologies, whether at universities, startups, or other companies, it still supports AI innovation. I think one of our concerns is that if we say the whole defense world, and maybe even the intel world, can’t be involved in AI, it’s going to deter and slow innovation, particularly the more radical innovation that universities tend to engage in. What are your thoughts on that?
Robert J. Marks: Well, the US, as you know, has had recent policies that give a lot of money to universities to develop artificial intelligence-based systems and new technology. Unfortunately, as a many-year member of academia, I see a lot of this wasted. The money going to high-profile places, places with big brands, is probably not the best use of what is being spent on artificial intelligence research today. Unfortunately, that’s the way it’s been done and the way it’s being done currently. We had a visit to Baylor by General Murray, a four-star general interested in the development of technology for the Army.
As he went around to the different labs to look at the research being done here at Baylor University, he said, “I don’t want to see your papers.” We talked about my upcoming book, Supply Side Academics. The currency of universities today is the funding and publication of papers. There is little interest in figuring out what to do for the military or the private sector; not none, but the emphasis is on those two things, publication of papers and generating money. Unfortunately, I think that’s terrible economics for universities, and something that should be reversed or at least mitigated.
Note: Prof. Marks has also written a thoughtful essay on a related topic, underperformance at universities: “Why it’s so hard to reform peer review.” There, he applies Goodhart’s Law and Campbell’s Law:
Goodhart’s Law: “When a measure becomes a target, it ceases to be a good measure.” Think: Lots of papers is the measure of productivity.
Campbell’s Law: “The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.” Think: Papers based on Big Data may be based on phantom patterns, producing nonsense in peer-reviewed journals.
Here’s Part 1: Is the U.S. military falling behind in artificial intelligence? What is the likely outcome of allowing those with very different value systems to have control of global AI warfare technology? Robert J. Marks told Information Technology and Innovation Foundation, an AI think tank, that AI superiority can deter or shorten wars, thus reducing overall casualties.
Part 2: AI is not nearly smart enough to morph into the Terminator. Computer engineering prof Robert J. Marks offers some illustrations in an ITIF think tank interview. AI cannot, for example, handle ambiguities like flubbed headlines that can be read two different ways, Dr. Marks said.
Part 4: Computer prof: Feds should avoid university, seek entrepreneurs. Too much time at the U is wasted on getting papers into theoretical journals, not enough time spent on innovation, he contends. Robert J. Marks, author of Killer Robots and the forthcoming Supply Side Academics, wants a bigger focus on developing practical technologies.
You may also wish to look at:
Russia is systematically copying U.S. military AI robotics. In Russia’s top-down system, the military and corporations are essentially part of the same enterprise.
- 01:19 | Introduction to the podcast topic
- 02:13 | Introducing Dr. Robert J. Marks
- 03:38 | AI in military applications
- 05:07 | Staying ahead in development
- 06:31 | Major areas of AI in the military
- 07:10 | Drone swarms
- 09:26 | Will AI be sentient?
- 11:30 | Autonomous weapons
- 16:07 | Ethics
- 17:48 | The state of AI research
- 20:31 | Top priority in tech policy
- Get a free copy of The Case for Killer Robots by Robert J. Marks
- Original podcast at ITIF
- ITIF’s website
- Walter Bradley Center on Natural and Artificial Intelligence