The Pentagon's Brain
Author: Annie Jacobsen
In government, it is a generally accepted rule that someone has to take the blame when government fails. For DARPA, whose job it was to safeguard the nation from technical surprise, there was no clear mission failure on 9/11, at least not in the public eye. The weapons used by the terrorists were fixed-blade utility knives, invented during the Great Depression. The flint knife, prehistory’s utility blade, is roughly 1.4 million years old. Al Qaeda used American technology against America, hijacking four fully fueled aircraft and successfully piloting three of them, as missiles, to their targets. It is believed that Al Qaeda spent less than $500,000 planning and executing the attacks.
The public’s perception, generally, was that the intelligence community was to blame for 9/11, a surprise attack that rivaled Pearl Harbor in its death toll and future consequence. Most fingers were pointed at the CIA and the FBI. Because the National
Security Agency maintained a lower public profile at the time, it was not held accountable to the same degree. History has made clear, however, that errors by the NSA were indelible. On September 10, 2001, the agency intercepted two messages in Arabic from terrorists it was already monitoring.
“The match is about to begin,” read one message.
“Tomorrow is zero hour,” read the other message.
The sentences were not translated until September 12. “In fact these phrases [might] have not been translated with such a quick turnaround had the horrific events not happened,” in-house DARPA literature notes. DARPA is responsible for much of the technology behind advanced information collection as well as real-time translation capabilities. In the wake of 9/11, DARPA rapidly began to advance these technologies, and others related to them, so its partner, NSA, could do its job better.
Despite all the advanced technology at the disposal of the U.S. government, the national security establishment did not see the September 11, 2001, terrorist attack coming. Nor was its arsenal of advanced technology able to stop the attack once it began. As a consequence, the American military establishment would begin a hyper-militarization not seen since the explosion of the 15-megaton Castle Bravo hydrogen bomb on Bikini Atoll in 1954.
The nuclear physicist John Poindexter is rarely noted for his prowess in nuclear physics. Instead he is almost always referred to as the retired Navy admiral and former national security advisor to President Ronald Reagan during the Iran-Contra affair who was convicted on five felony counts of lying to Congress, destroying official documents, and obstructing congressional investigations.
The day after the terrorist attacks of 9/11, Poindexter was pulling his car out of the quiet suburban subdivision where he lived outside Washington, D.C., when he was struck with an idea for DARPA. He had worked for the agency before, as a defense contractor in the late 1990s. By then Poindexter’s Iran-Contra notoriety had died down, and he was able to return to public service. A U.S. court of appeals had reversed all five of Poindexter’s felony convictions on the grounds that his testimony had been given under a grant of immunity.
In the decade after the scandal, Poindexter put his focus into computer technology. Because he had retained his full Navy pension after Iran-Contra, he did not have to look for a job. Fascinated by
computers, Poindexter began teaching himself computer programming languages, and soon he could write code. In 1995, through a defense contractor called Syntek, Poindexter began working on a DARPA project called Genoa. The goal of Genoa was to develop a complex computer system—an intelligent machine—designed to reach across multiple classified government computer databases in order to predict the next man-made cataclysmic event, such as a terrorist attack. Poindexter, a seafaring man, especially liked Genoa’s name. A genoa is a boat’s jib, or foresail, typically raised on a sailboat to increase speed.
Poindexter’s boss on the project, the person in charge of all “next-generation” information-processing ideas at DARPA in the late 1990s, was a man named Brian Sharkey. After a little more than a year working on the project, Syntek’s contract ended. Poindexter and Sharkey had gotten along well during phase one of Genoa and kept each other’s contact information. The way Poindexter tells the story, on the morning after the 9/11 terrorist attacks, he was struck with the idea that the time had come to revitalize the Genoa program. He pulled his car to the side of the road and began scrolling through contacts on his cell phone until he found Brian Sharkey’s number.
“That’s funny,” Poindexter recalls Sharkey saying to him. “I was just thinking about calling you.”
Both men agreed that it was time to accelerate the Genoa program. Sharkey had left DARPA to serve as senior vice president and chief technology officer for the California-based defense contracting giant Science Applications International Corporation, or SAIC. With so many surveillance-related defense contracts on its roster, SAIC was often jokingly referred to as NSA West. Another one of SAIC’s prime clients was DARPA. Brian Sharkey knew the current DARPA director, Tony Tether, quite well.
“We need to talk to Tony,” Poindexter told Sharkey.
In Washington, Tony Tether was well regarded as a top innovator, someone who saw the future and made it happen. When he was
serving as director of DARPA’s Strategic Technology Office, back in the 1980s, he advocated maximizing technology for surveillance capabilities. Now, two decades later, these kinds of technologies had advanced exponentially. In this post-9/11 environment, Tether’s enthusiasm for, and experience in, surveillance collection would prove invaluable in his role as DARPA director.
Brian Sharkey and Tony Tether knew each other from SAIC. In the 1990s, after leaving government service for defense contracting, Tether had served as vice president of SAIC in its advanced technology sector. Now Sharkey was a senior vice president at SAIC. During the September 12 phone call between Sharkey and Poindexter, the men agreed that Sharkey would set up a meeting with Tether to discuss Genoa.
Since 1995, DARPA had spent roughly $42 million advancing the Genoa concept under the Information Systems Office. The program was part of a concept DARPA now called Total Information Awareness (TIA). But the existing Genoa program was nowhere near having the “intelligence” necessary to recognize another 9/11-style plot. Poindexter and Sharkey aimed to change that.
The following month, on October 15, 2001, Sharkey and Tether met at a seafood restaurant in Arlington, Virginia, Gaffney's Oyster and Ale House, to discuss Total Information Awareness. Tether embraced the idea, so much so that he asked Brian Sharkey to leave his job at SAIC and return to DARPA to lead the new effort. But Sharkey did not want to leave his job at SAIC. The corporation was one of the largest employee-owned companies in America, and Sharkey had accumulated considerable stock options. If he returned to government service, he would have to give up that profit participation. John Poindexter was the man who should serve as director of the Total Information Awareness program, Sharkey said. SAIC could act as DARPA's prime contractor.
A few days later, Sharkey and Poindexter went sailing on Poindexter's yacht, Bluebird, to discuss next steps. Poindexter later recalled
feeling excited. He had big ideas. He believed he knew exactly how extensive this program had to be to succeed. Poindexter knew what the subtitle of the program should be. In his pitch to Tether, his opening slide would read “A Manhattan Project on Countering Terrorism.” Artificially intelligent computers were the twenty-first century’s atomic bomb.
Tether had Poindexter come to his office at DARPA and present the slide show. Poindexter's background was in submarines, and there was an analogy here, he told Tether. Submarines emit sound signals as they move through the sea. The 9/11 hijackers had emitted electronic signals as they moved through the United States. But even if the NSA had been listening, its system of systems was not intelligent enough to handle the load in real time. The hijackers had rented apartments, bought airplane tickets, purchased box cutters, received emails and wire transfers. All of this could have been looked at as it was happening, Poindexter said. Terrorists give off signals. Genoa could find them. It would take enormous amounts of time and treasure, but it was worth it. The 9/11 attacks were but the opening salvo, the White House had said. The time was right because the climate was right. People were terrified.
Tony Tether agreed. If John Poindexter was willing to run the Information Awareness Office, DARPA would fund it. In January 2002 the Information Awareness Office was given the green light to proceed, with a colossal initial start-up budget of $145 million and another $183.3 million earmarked for the following year. John Poindexter was now officially DARPA’s Total Information Awareness czar.
“In our view, information technology is a weapon,” says Bob Popp, the former deputy director of the Information Awareness Office, John Poindexter’s number two. Popp is a computer scientist with a Ph.D. in electrical engineering, a prolific author and patent holder. He rides a motorcycle and is an active participant in and lifetime
member of HOG, or Harley Owners Group. His areas of expertise include anti-submarine warfare and ISR (intelligence, surveillance, and reconnaissance). When he was a younger man, Popp welded Trident nuclear submarines for General Dynamics.
Before 9/11, “information technology was a huge unexploited weapon for analysts,” Popp says. “They were using it in a very limited capacity. There were a lot of bad guys out there. No shortage of data. Analysts were inundated with problems and inundated with data. The basic hypothesis of TIA was to create a system where analysts could be effective. Where they were no longer overwhelmed.”
It was Bob Popp's job as John Poindexter's deputy to oversee the establishment of multiple programs under the TIA umbrella. The Evidence Extraction and Link Discovery program (EELD) was a big office with a large support staff. Its function was to suction up as much electronic information about people as possible—not just terror suspects but the general American public. The electronic information to be gathered was to include individual people's phone records, computer searches, credit card receipts, parking receipts, books checked out of the library, films rented, and more, from every military and civilian database in the United States, with the hope of determining who were the terrorists lurking among ordinary Americans. The primary job of the EELD office was to create a computer system so "intelligent" it would be able to review metadata on 285 million people a day, in real time, and identify individuals who might be plotting the next terror event.
In 2002, DARPA senior program manager Ted Senator explained how EELD would work. The plan, Senator said, was to develop “techniques that allow us to find relevant information—about links between people, organizations, places, and things—from the masses of available data, putting it together by connecting these bits of information into patterns that can be evaluated and analyzed, and learning what patterns discriminate between legitimate
and suspicious behavior.” It was not an easy task. Using the needle-in-the-haystack metaphor, Senator explains just how hard it was. “Our task is akin to finding dangerous groups of needles hidden in stacks of needle pieces. This is much harder,” he points out, “than simply finding needles in a haystack: we have to search through many stacks, not just one; we do not have a contrast between shiny, hard needles and dull, fragile hay; we have many ways of putting the pieces together into individual needles and the needles into groups of needles; and we cannot tell if a needle or group is dangerous until it is at least partially assembled.” So, he says, “in principle at least, we must track all the needle pieces all of the time and consider all possible combinations.”
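Senator's needle-pieces metaphor describes what is, at bottom, link discovery over an entity graph: no single record is incriminating, and a pattern emerges only once records belonging to linked individuals are assembled. A toy sketch of that idea follows; the records, names, and watch-listed pattern are all invented for illustration, and DARPA's actual EELD system is not public.

```python
# Toy illustration of link discovery: connect people who share an
# activity type, then flag linked pairs whose COMBINED records match
# a watch-listed pattern. All data here are invented; this is a
# sketch of the concept, not DARPA's actual EELD system.
from collections import defaultdict
from itertools import combinations

records = [
    ("alice", "rented_apartment"),
    ("alice", "bought_ticket"),
    ("bob", "bought_ticket"),
    ("bob", "wire_transfer"),
    ("carol", "library_book"),
]

# Index people by shared activity type ("needle pieces").
by_activity = defaultdict(set)
for person, activity in records:
    by_activity[activity].add(person)

# Link any two people who share an activity.
links = {pair for people in by_activity.values()
         for pair in combinations(sorted(people), 2)}

# A pair is flagged only when its combined activities cover the
# pattern -- neither person's records suffice alone, which is
# Senator's point about partially assembled needles.
pattern = {"bought_ticket", "wire_transfer"}
flagged = []
for a, b in links:
    combined = {act for person, act in records if person in (a, b)}
    if pattern <= combined:
        flagged.append((a, b))

print(flagged)
```

Even this toy version hints at why Senator called the task so hard: tracking "all the needle pieces all of the time" means the number of candidate groups grows combinatorially with the number of people and records.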
Because terrorists do not generally act as lone wolves, a second program would be key to TIA's success, namely, the Scalable Social Network Analysis program. The SSNA would monitor telephone calls, conference calls, and ATM withdrawals, but it also sought to develop a far more invasive surveillance technology, one that could "capture human activities in surveillance environments." The Activity Recognition and Monitoring program, or ARM, was modeled after England's CCTV camera network. Surveillance cameras would be set up across the nation, and through the ARM program they would capture images of people as they went about their daily lives, then save these images to massive data storage banks for computers to examine. Using state-of-the-art facial recognition software, ARM would seek to identify who was behaving outside the computer's pre-programmed threshold for "ordinary." The parameters for "ordinary" remain classified.
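Because ARM's actual parameters remain classified, the idea of a pre-programmed threshold for "ordinary" can only be illustrated generically. A common statistical version is to model a person's baseline activity and flag observations that deviate by more than a fixed number of standard deviations; the data and the three-sigma cutoff below are invented for illustration.

```python
# Generic sketch of a threshold for "ordinary" behavior: compare a
# new observation against a historical baseline and flag outliers.
# The numbers and the 3-sigma cutoff are invented; ARM's real
# parameters are classified.
import statistics

baseline = [3, 4, 2, 5, 3, 4, 3]   # daily counts of some observed activity
today = 14                          # new observation to evaluate

mean = statistics.mean(baseline)
stdev = statistics.pstdev(baseline)

# Flag anything more than 3 standard deviations from the mean.
z = (today - mean) / stdev
is_anomalous = abs(z) > 3.0
print(f"z-score {z:.1f}, anomalous: {is_anomalous}")
```

The design choice here is the cutoff itself: set it too low and ordinary people are flagged constantly; set it too high and unusual behavior slips through unnoticed.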
Facial recognition software expert Jonathan Phillips was brought on board to advance an existing DARPA program called Human Identification at a Distance. Computer systems armed with algorithms for the faces of up to a million known terrorists could scan newly acquired surveillance video, captured through the ARM program, with the goal of locating a terrorist among the crowd.
TIA was a many-tentacled program. The problem of language barriers had also long been a thorn in the military's side. DARPA needed to develop computer-based translation programs in what it called "the war languages": Arabic, Pashto, Urdu, Dari, and other Middle Eastern and South Asian languages. Charles Wayne was brought on board to run two programs, TIDES and EARS, to develop computer programs that could convert foreign languages to English-language text. There was also a war games effort inside TIA, called War Gaming the Asymmetric Environment, led by Larry Willis. In this office, terrorism experts would create fictional terror networks, made up of individual characters, like avatars, who would begin plotting fake terror attacks. The point was to see whether TIA's myriad surveillance programs, working in concert, could identify the avatar-terrorists as they plotted and planned. To further this effort, a group inside the group was formed, called the Red Team, headed by former DARPA director Stephen Lukasik. Red teaming is a role-playing exercise in which a problem is examined from an adversary's or enemy's perspective.
Finally there was Genoa II, the centerpiece of the program, the software that would run the system of information systems. Its director, Thomas P. Armour, described Genoa II as a "collaboration between two collaborations." One group of collaborators was the intelligence analysts, whose goal was "sensemaking," Armour said. These analysts had the tricky job of collaborating among themselves, across multiple organizations, including the CIA, NSA, DIA, and others. It was the job of the sensemakers to construct models or blueprints of how terrorists might act. This group would then collaborate with "policymakers and operators at the most senior level," who would evaluate the intelligence analysts' work and develop options for a U.S. response to any given situation. Genoa II, Armour told his team, "is all about creating the technology to make these collaborations possible, efficient, and effective."
To Armour, there was hardware, meaning the machinery,
software, meaning the computer programs, and wetware, meaning the human brain. Armour saw the wetware as the weakest link. The challenge was that intelligence agencies historically preferred to keep high-target terrorist information to themselves. “The ‘wetware’ whose limitations I mentioned is the human cognitive systems,” Armour told defense contractors who were bidding on the job. “Its limitations and biases are well documented, and they pervade the entire system, from perception through cognition, learning, memory, and decision,” Armour told his team. In this system of systems, which was based on collaborative efforts between humans and their machines, Armour believed that the humans represented the point where the system was most vulnerable. “These systems,” said Armour, referring to human brains, “are the product of evolution, optimized by evolution for a world which no longer exists; it is not surprising then that, however capable our cognitive apparatus is, it too often fails when challenged by tasks completely alien to its biological roots.”
Unlike so many of the new technologists working on TIA, Tom Armour was a Cold Warrior. He was also a former spy. After flying combat missions during Vietnam as a U.S. Air Force navigator on the AC-119K gunship, he began a long career with the CIA, starting in 1975. Armour was an expert on Soviet nuclear weapons systems, missile technology, and strategic command and control. At the CIA, under the Directorate of Intelligence, he served as chief of computing and methodological support, bringing the agency into the twenty-first century with computers for intelligence analysis.