Kill Chain: The Rise of the High-Tech Assassins
Andrew Cockburn
Back in the distant days of the Carter administration, high-tech apostle William Perry, occupying the powerful post of director of defense research and engineering, had begun pouring huge money into stealth. The money spigot has stayed open ever since, although the actual results in combat have fallen short of expectations (and public boasts). The F-117, the first operational stealth aircraft, had in fact proven visible enough to enemy radars in the 1991 Gulf War to require escort by fleets of radar-jamming aircraft, while in the 1999 Kosovo conflict, the Serbs had managed to use elderly Soviet SAM radars to locate and shoot down one F-117 and severely damage another. Nevertheless, stealth continued to be deemed essential to any scheme for using drones, or any other aircraft, in the face of determined enemy air defenses such as the dreaded A2/AD (antiaccess/area denial) so frequently invoked in official discussions of future weapons systems. The incorporation of stealth features—exotic plastic and glue coatings to absorb radar waves and special airframe shaping to deflect them—makes drones and planes not only less airworthy and less maneuverable but also enormously costly, which means, given the cost-plus business model of the defense industry, enormously profitable. Drones, once hailed for their cheapness, were inevitably becoming more expensive than the manned planes they were supposed to replace.
Given their cost, these superdrones will inevitably be few in number and will require a more elaborate “reach-back” communications network with an ever-more voracious appetite for bandwidth (the amount of data that can be transmitted over a communications link). Already, a single Global Hawk drone requires five times as much bandwidth as that used by the entire U.S. military during the 1991 Gulf War, an amount that will only increase. With this Niagara of information pouring across the heavens from satellite to ground stations and up to satellites again comes the certainty that someone will at some point listen in and may well be capable of inserting their own commands to the machine.
In 2009, Shia insurgents in Iraq used SkyGrabber software, priced at $29.95 on the Internet, to capture and download Predator video feeds for use in their own battle planning. More spectacularly, in December 2011, an RQ-170 Sentinel overflying Iran landed there comparatively undamaged. Initial denials by U.S. authorities of Iranian claims that they had captured the drone were silenced when the Iranian Revolutionary Guard put it on display. Like all drones, the machine relied for navigation on GPS, the network of satellites that had made remote drone operations possible in the first place. But GPS signals are extraordinarily weak, the equivalent of a car headlight shining 12,000 miles away, because the size of the satellite limits the power output. This makes it comparatively easy to jam or interfere with the signals, which is what the Iranians claim to have done.
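The headlight comparison can be made concrete with a rough free-space link budget. The sketch below uses standard physics but illustrative numbers (the transmit powers, distances, and the 1 W jammer are assumptions for the exercise, not figures from the book):

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d/lambda)."""
    c = 3.0e8  # speed of light, m/s
    wavelength = c / freq_hz
    return 20 * math.log10(4 * math.pi * distance_m / wavelength)

L1 = 1575.42e6  # GPS L1 carrier frequency, Hz

# GPS satellite in medium Earth orbit, roughly 20,200 km up
gps_loss = fspl_db(20_200_000, L1)   # ~182 dB of path loss

# A hypothetical 1 W ground jammer 10 km from the receiver
jam_loss = fspl_db(10_000, L1)       # ~116 dB of path loss

# Assumed effective radiated powers (illustrative only):
# ~27 W for the GPS signal, 1 W for the jammer.
gps_rx_dbw = 10 * math.log10(27) - gps_loss
jam_rx_dbw = 10 * math.log10(1) - jam_loss

print(f"GPS signal at receiver: {gps_rx_dbw:.0f} dBW")
print(f"Jammer at receiver:     {jam_rx_dbw:.0f} dBW")
# The nearby low-power jammer arrives tens of dB (orders of magnitude)
# stronger than the satellite signal -- which is why drowning out GPS
# is comparatively easy.
```

Under these assumed numbers the jammer arrives roughly 50 dB, a factor of about 100,000, stronger than the satellite signal.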
As an Iranian engineer explained to Christian Science Monitor reporter Scott Peterson, the Iranian electronic-warfare specialists had the benefit of their experience working on the remains of several simpler U.S. Navy Scan Eagle drones they had retrieved earlier. To get control of the CIA’s drone, they first jammed its communication links to the pilots back in Nevada. “By putting noise [jamming] on the communications, you force the bird into autopilot,” explained the engineer. “This is where the bird loses its brain.” Once that happened, the aircraft was preprogrammed to return to its Kandahar base, navigating by GPS. But at this point the Iranians made their second intervention, feeding false signals that mimicked the weaker GPS transmissions, and gradually guided the aircraft toward an Iranian landing site. As an electronic-warfare commander explained to an Iranian news agency, “[A]ll the movements of these [enemy drones]” were being watched, and “obstructing” their work was “always on our agenda.” The landing site was carefully chosen, as the engineer explained, because it was at almost exactly the same altitude as Kandahar. So, when the drone “thought” it was at its home base, it duly landed. However, there was an altitude difference of a few feet. Landing heavily, the aircraft damaged its undercarriage and one wing.
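The capture sequence the engineer describes — jam the command link so the autopilot takes over, then feed false GPS fixes so that “flying home” actually steers the aircraft to the trap — can be sketched as a toy simulation. All coordinates, names, and step sizes below are hypothetical; this is the logic of the technique, not the actual avionics:

```python
from dataclasses import dataclass

@dataclass
class DroneState:
    lat: float
    lon: float
    link_jammed: bool = False

HOME = (31.50, 65.85)   # hypothetical "Kandahar" home-base coordinates
TRAP = (32.50, 59.00)   # hypothetical Iranian landing site

def autopilot_step(state: DroneState, gps_fix: tuple, step: float = 0.1) -> None:
    """With the command link jammed, the autopilot steers toward HOME
    using whatever GPS fix it is given -- true or spoofed."""
    if not state.link_jammed:
        return  # pilots still in control; spoofing alone is not enough
    dlat = HOME[0] - gps_fix[0]
    dlon = HOME[1] - gps_fix[1]
    state.lat += max(-step, min(step, dlat))
    state.lon += max(-step, min(step, dlon))

def spoofed_fix(state: DroneState) -> tuple:
    """Report a fake position offset so that closing the apparent gap
    to HOME actually drives the true position toward TRAP."""
    return (HOME[0] + (state.lat - TRAP[0]),
            HOME[1] + (state.lon - TRAP[1]))

drone = DroneState(lat=33.0, lon=62.0)
drone.link_jammed = True              # step 1: jam the uplink
for _ in range(200):                  # step 2: feed false GPS fixes
    autopilot_step(drone, spoofed_fix(drone))

# The autopilot "thinks" it is over home base; it is actually at TRAP.
print(round(drone.lat, 2), round(drone.lon, 2))
```

The matching-altitude detail in the account is the third ingredient: the spoofed fix must be consistent in every dimension the autopilot checks, or the deception fails at the last moment, as the damaged undercarriage suggests it very nearly did.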
Despite energetic attempts by U.S. officials to discredit the Iranian claims, there is no reason to doubt the story, especially as the Iranians’ feat was later duplicated by a University of Texas professor, Todd Humphreys, in repeated public experiments in which he took control of nonmilitary drones and, on one occasion, a large yacht in the Mediterranean.
The difficulties of controlling future superdrones in the face of Iranian or perhaps Chinese electronic warriors inevitably generated speculation about the onset of “autonomous” systems capable of conducting a mission without human intervention and without command links vulnerable to hacking. Indeed, the navy’s demonstrator drone that managed two carrier landings (out of four attempts) in a flat, calm sea in July 2013 was autonomous, flying only under the direction of its onboard computers. The challenge of landing on a pitching, rolling deck, something that requires intense training for humans to accomplish, has yet to be faced. Nevertheless, the supposed imminence of robotic systems endowed with the ability and power to make lethal decisions has become a recurring topic of concern among human rights activists, complete with TED talks about the near-term probability that “autonomous military robots will take decision making out of the hands of humans and thus take the human out of war, which would change warfare entirely.” In November 2012, Human Rights Watch called for a “preemptive ban on the development, production, and use of fully autonomous weapons.” Naturally, in view of the money to be made, interested parties have been eager to bolster the notion that such systems are a practical possibility. As David Deptula said of drone video analysis: “Making this automatic is an absolute must.” The Office of Naval Research has even funded a joint project by several major universities to “devise computer algorithms that will imbue autonomous robots with moral competence—the ability to tell right from wrong.” This was clearly destined to be a multiyear contract, since, as one sympathetic commentator noted, “[S]cientifically speaking, we still don’t know what morality in humans actually is.”
Early experiments appeared to confirm that autonomous drones, ethical or otherwise, might be just around the corner. An experiment involving two small drones with computers that process images from onboard cameras reportedly managed to locate and identify a brightly colored tarp spread out in an open field. “The demonstration laid the groundwork for scientific advances that would allow drones to search for a human target and then make an identification based on facial-recognition or other software,” explained the Washington Post confidently. “Once a match was made, a drone could launch a missile to kill the target.”
Picking out a brightly colored object with sharp edges against a plain background is in the grand tradition of budget-generating Pentagon tests. (Infrared systems are usually tested in the early morning, for example, so that the warm target-object shows up nicely against the ground, still cool from the night air.) In the real world, where edges are not sharp and shades of gray are hard to differentiate, not to mention the shifting silhouette of a human face, life becomes a lot more difficult, especially if the target is taking steps to stay out of sight. (According to an al-Qaeda tip sheet discovered in Mali, Osama bin Laden had advised his followers to “hide under thick trees” as one of several sensible suggestions for evading drones.) Given the difficulty humans face in making correct decisions on the basis of ambiguous electro-optical and infrared images of what may or may not be an enemy (is that a squatting Pashtun?), not to mention the ongoing and oft-lamented failure to get computers to analyze surveillance video, such anxiety might be premature. Exponentially increasing computer processing power has kept alive the dream of artificial intelligence, founded on the belief that the brain operates just like a computer through a series of on-off switches and that therefore a computer is capable of performing like a human brain. But it has become clearer that the brain does not operate in any such fashion but rather, as Berkeley philosopher Hubert Dreyfus has long maintained, on intuitive responses grounded in accumulated experience and expertise, not on the mechanistic process, characteristically evoked by Votel, of “connecting the dots.”
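The gulf between the demonstration and the real world is easy to show. A bright tarp on a plain field reduces to simple intensity thresholding, a few lines of code; nothing in that success transfers to a gray target against gray clutter, let alone a face in shadow. A toy sketch on synthetic data (the image, brightness levels, and threshold are all invented for illustration):

```python
import random

random.seed(0)

SIZE = 20
# Synthetic 20x20 grayscale "field": dull, noisy background
# with a bright, sharp-edged 5x5 "tarp" patch -- the easy case.
field = [[random.gauss(60, 5) for _ in range(SIZE)] for _ in range(SIZE)]
for r in range(8, 13):
    for c in range(8, 13):
        field[r][c] = random.gauss(220, 5)  # bright target pixels

THRESHOLD = 140  # hypothetical cutoff, halfway between the two levels
hits = [(r, c) for r in range(SIZE) for c in range(SIZE)
        if field[r][c] > THRESHOLD]

# With ~32 standard deviations separating background from target,
# the threshold finds exactly the 25 tarp pixels and nothing else.
# Shrink that separation toward zero -- gray on gray -- and no fixed
# threshold can tell target from clutter.
print(len(hits))
```

The demonstration succeeds because the test designers put the separation there; real targets do not oblige.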
In one sense, however, the system is already “autonomous.” On the eve of World War II, air force planners identified the few targets they needed to destroy to bring Germany to its knees. The plan did not work, targeting committees met, their target lists expanded, the enemy adapted to each list change, and the war dragged on for year after bloody year. Nevertheless, the same strategy was followed in Korea, Vietnam, and Iraq, each time with more elaborate technology. Ultimately the technology offered the promise of destroying not just the physical objects—power plants, factories, communications, roads, and bridges—that sustained the enemy but also selected individuals, duly listed in order of importance, who controlled the enemy war effort. By 2014, with Afghanistan sinking back into chaos and the jihadis’ black flag waving over ever-larger stretches of the globe under the aegis of a leader, Abu Bakr al-Baghdadi, more capable and successful than his targeted predecessors, it was clear that this latest variation of the strategy had also failed. Michael Flynn, who had been McChrystal’s intelligence officer in the hunt for Zarqawi and had gone on to command the Defense Intelligence Agency, ruefully admitted as much as he prepared to leave office, telling an interviewer, “We kept decapitating the leadership of these groups, and more leaders would just appear from the ranks to take their place.”
Flynn’s insight made no difference. President Obama, claiming success in assassination campaigns in Pakistan, Yemen, and Somalia (where a recently assassinated leader had been immediately replaced), pronounced that this newest threat would be met with the same strategy. Predators, Reapers, and Global Hawks accordingly scoured the desert wastes of Iraq and Syria, beaming uncountable petabytes of video back up the kill chain. Among the recipients were four hundred members of the Massachusetts National Guard sitting in darkened rooms at a base on Cape Cod, gazing hour after hour at blurry images in search of “patterns of life” that might denote the elusive enemy. “None of them are on the ground, and none of them are in the theater of operations,” said their local congressman proudly, “but they are contributing from here, conducting essential frontline functions.”
As David Deptula promised that “with a more intense campaign” victory would come quickly, enemy leaders switched off their cell phones and faded from view. Pentagon officials demanded more spending. Wall Street analysts hailed the prospect of “sure-bet paydays” for drone builders and other weapons makers. The system rolled on autonomously—one big robot mowing the grass, forever.
The page numbers for the notes that appeared in the print version of this title are not in your e-book. Please use the search function on your e-reading device to search for the relevant passages documented or discussed.
Please note that some of the links referenced in this work are no longer active.
1 | Remember, Kill Chain
In a cold February dawn in 2010: The description of events in this chapter is drawn from U.S. Central Command, “AR 15-6 Investigation, 21 February 2010, U.S. Air-to-Ground Engagement in the vicinity of Shahidi Hassas, Uruzgan District, Afghanistan,” https://www.aclu.org/drone-foia-department-defense-uruzgan-investigation-documents. The report was originally released following an FOIA request by Los Angeles Times reporter David S. Cloud.
2 | Wiring the Jungle
Asked who the enemy was: Personal investigation by Leslie Cockburn, who led an ABC News team to the area in 1994.
The scheme had been conceived far away: Anne Finkbeiner, The Jasons: The Secret History of Science’s Postwar Elite (New York: Viking Penguin, 2006), p. 92.
On the eve of World War II: Charles R. Griffith, The Quest: Haywood Hansell and American Strategic Bombing in World War II (Maxwell AFB, AL: Air University Press, 1999), p. 70.
Early in 1966 air force planners believed they had identified the “critical node”: Bernard C. Nalty, The War Against Trucks (Washington, DC: U.S. Air Force, Air Force History and Museums Program, 2005), p. 7.
To process the data Garwin, the IBM scientist: Finkbeiner, op. cit., p. 100.
Ensconced in Santa Barbara: Ibid., p. 97.
Their preferred choices: Ibid., pp. 100–101.
“On the battlefield of the future”: The full text is carried in the appendix of Paul Dickson’s amazingly percipient book, The Electronic Battlefield (Bloomington, IN: Indiana University Press, 1976).
Marshall Harrison, a former high school teacher: Marshall Harrison, A Lonely Kind of War: Forward Air Controller, Vietnam (Bloomington, IN: Xlibris Corp., 2010), pp. 106–107.
“Just as it is almost impossible…”: Dickson, op. cit., p. 22.
“after analyzing various names of insects and birds”: Thomas P. Ehrhard, Air Force UAVs: The Secret History (Washington, DC: Mitchell Institute, 2010), fn. 159, p. 66.
In World War II the U.S. Navy had brought about the death: Jack Olsen, Aphrodite: Desperate Mission (New York: G. P. Putnam’s Sons, 1970), p. 224.
Come the Vietnam War, they were adapted for reconnaissance: Ehrhard, op. cit., p. 20.
The raids were therefore conducted in deepest secrecy: Department of Defense, “Report on Selected Air and Ground Operations in Cambodia and Laos, Sept. 10, 1973,” http://www.dod.mil/pubs/foi/International_security_affairs/vietnam_and_southeast_asiaDocuments/27.pdf. Accessible via Google.
Back in Santa Barbara, the Jasons had entertained: Finkbeiner, op. cit., p. 101.
“We spent seven days trying to arrive at a solution”: James Zumwalt, Bare Feet, Iron Will—Stories from the Other Side of Vietnam’s Battlefields (Chantilly, VA: Fortis Publishing Co., 2010), p. 258.