I should know. I used to work for it. Sending out press releases for artificial intelligence components of the "Star Wars" pie-in-the-sky global defense system that many supposed at the time was one of Ronnie's bedtime-for-Bonzo Alzheimer's dreams. They said he just hatched the Strategic Defense Initiative out of his own aphasic fantasies one day, sprang it on an unsuspecting world, that there had never been any previous discussion of such a program, even with his closest advisors. If you believe this, then you, like the media pundits of those heady days, are either superbly naive or plain stupid. Not all presidential advisors are attached to the government. Except as the fatted calf of the military-industrial complex is attached to the tit of various federal black budgets. Do you remember which left-wing radical first expressed dire warnings about the dangers of said "military-industrial complex"? It was Dwight D. Eisenhower, in his parting speech as President of the United States of America and Commander in Chief of its armed forces. Historian of science David F. Noble rightfully broadened the concept to the academic-military-industrial complex. For this and similar views on the often covert complicity of higher education with many highly successful aerospace and related industry interests, he was summarily cut from the MIT faculty.
By 1990, after seven years in government AI projects in Japan and commercial AI software startups in the US, I was working at Carnegie Mellon University's Robotics Institute, which was constantly crawling with NASA, DARPA and DOD brass. But we weren't calling it AI anymore. By then it was C3I, for Command, Control, Communications and Intelligence. Tactical battlefield shit, theater of operations stuff. The military wanted SkyNet. Bad. Because in that world (as in so many others, some metaphorically real, some realistically metaphorical) the more budget you control, the more budget you get allocated. There's a direct reference to this universal heuristic later on in Terminator 3, when the head of the Joint Chiefs of Staff calls John Connor's intended's daddy, a mere general or some such in the guts of a hardened facility where he has overseen the development and evolution of SkyNet from the start. A SkyNet that has just gone viral -- but which is seen, in a supreme irony that seems fully congruent with the oxymoronic essence of military intelligence, as the magic-bullet cure for the chaos it is, at that very moment, ramping up to unleash. The head of the Joint Chiefs says, paraphrasing pretty closely: "Get this thing taken care of and you'll get all the funding you ever wanted."
But earlier, nine minutes in from the opening title credit, "Terminator 3: Rise of the Machines," there's a quick scene -- gone in 45 seconds -- that alerts us to the fact that this movie isn't all science fiction. The truly scary part, if we're not so mesmerized by the tits and ass, thrills and spills that we miss it, is the part that's true. Wise-cracking Terminators and wannabe nanobabes traversing time are total bullshit. SkyNet is not. SkyNet is closer than you think. Unless you're one of those already working on it. Are you? (Are you sure?)
So here's the linchpin scene in its entirety.
general: OK whatta we got?
tech: This new computer virus is a tricky bastard. It's infected half of the civilian internet as well as secondary military apps. Payroll, inventory...
general: The primary defense nets are still clean?
tech: So far, the firewalls are holding up. Sir, the Pentagon has proposed that we use our AI to scan the entire infrastructure. Search and destroy for any hint of the virus.
general: I know, Tony, but that's like going after a fly with a bazooka.
tech: Well, once the connection's made, it should only be a matter of minutes.
general: Yeah, in which we put everything from satellites to missile silos under the control of a single computer system.
tech: The most intelligent system ever conceived.
general: I still prefer to keep humans in the loop. I'm not sure SkyNet's ready.
tech: Yes, Sir.
When I left CMU a dozen years ago, the man-in-the-loop debate was red hot. The computer scientists wanted him out. Otherwise, their systems would never be able to demonstrate how intelligent they were, how fast. Where the ultimate speed of response was, as we used to say back then -- I even said it, may some appropriately angry god strike me dumb -- "mission critical," human beings could only slow these hypersmart systems down. In an interesting reversal of roles, the military wanted the man left in the loop. Because for the kinds of systems that were being conceived back then, failure would not be graceful. Failure would mean the extinction of life in the only place we know life to exist.
I don't know where "the thinking" is at today. It's highly unlikely that anyone knows, except The Thinkers themselves. And this elite cadre almost surely does not include fools like George W. Bush or anyone even remotely in his confidence.
Let me end this with a little-known secret about AI, especially military AI. It can never be tested in real time, so it doesn't really need to work. If it does, it does. If it doesn't... well, hey. No drawing board to go back to anyway, so what the hell. The funding, on the other hand, is for today, this week, Q3, let's get it done. Gung ho! But that's not the secret. The secret is this. We don't need intelligent systems to destroy the world. Stupid systems will do the job without complaint. Without, you could say, a second thought. Or a first. And of stupid systems we've got plenty, still multiplying at a geometric pace.
People say to me, "Chris, you seem to have a lot of anger." I say, "No, you think?"