
It’s a scheme straight out of science fiction: Build “a virtual, centralized grand database” at the Defense Department, one containing every available shred of personal information, and then “mine” it for indications of potential terrorist activity. Yet this Orwellian suggestion, which has stirred warnings from civil liberties groups and die-hard conservatives alike, is all too real–or at least it will be, if the Bush administration’s national security planners get their way.
Announced late last year, the project has been judged by its many critics to be either too weird or too dangerous, and more than likely both. Total Information Awareness (or TIA), as it is called, was hatched in the post-9/11 Pentagon, and is regarded as one of the most ambitious surveillance projects in history. Given the sprawling reach of computer databases and networks–coupled with the push to strip away restrictions on government spying on Americans through legislation like the USA PATRIOT Act–the program “could be the perfect storm for civil liberties in America,” as one privacy advocate recently put it.
The project’s expansive and invasive mandate isn’t the only thing that has critics up in arms. TIA is directed by retired Adm. John Poindexter, of Iran-Contra scandal fame. Having been convicted of lying to Congress when he was President Reagan’s national security adviser, Poindexter is just about the last person most Americans would want poking through their credit card receipts. Then there was TIA’s infamous logo: an all-seeing eye perched above a pyramid, a symbol that seemed tailor-made for the worst sort of government snooping. In December 2002, amid a brewing public outcry, the logo was removed from the TIA Web site and replaced with a less startling one.
For now, the project is technically on hold, pending a Congressional review of its more controversial aspects. But the planning is still fully under way, and the dangers of an all-knowing government, powered by supercomputers crunching our voluminous personal data, assuredly remain.
So says Christian Stalberg, chairman of the Research Triangle Park chapter of Computer Professionals for Social Responsibility (CPSR). The organization is “principally concerned with the use of technology in its correct and incorrect uses, and the potential of it being abused for unsavory methods,” explains Stalberg, who works for the state’s Office of Information Technology Services. “It’s our feeling that just because our technology is capable of doing something, that doesn’t mean we should do it. We’re in the midst of an information revolution, and we need to see what the impacts could be on our society and our civil rights.”
Last Friday, CPSR hosted a forum at Duke University’s John Hope Franklin Center that brought together key local and national critics of TIA and like-minded initiatives. It will take the best and the brightest in the information technology industry to make Poindexter’s project operational (and even that may not be enough), but it was evident at the forum that some top computing specialists will opt out of building a super-surveillance system on moral grounds.
David Sobel, general counsel for the Electronic Privacy Information Center in Washington, D.C., warned that new government surveillance efforts cast far too wide a net, in what amounts to a counter-terrorist fishing expedition. “Starting with no suspicion concerning a particular individual, but somehow divining from this mass of information that this is in fact somebody who warrants the government’s attention–that approach is a fundamental departure from the presumption of innocence that has always characterized our criminal justice system,” he said.
James Boyle, a Duke law professor who specializes in information-age legislation, agreed, saying that the new approach represents “a vision of everyone potentially being a suspect” and “a move to administer by profile.”
Still, he and others at the forum noted, the day of totalitarian technology may not yet be upon us, since some of Poindexter’s grander objectives could prove to be unworkable. Consider a few titles of TIA subprojects, which border on the esoteric: “Extensible Probabilistic Repository Technology”; “Human Augmentation of Reasoning through Patterning”; “New Representation for Dynamic Force Properties for Use in Collaborative 3-D Battlespace Displays.”
Despite all the smart-sounding jargon and the wealth of Pentagon research dollars, “they almost need miracles to happen” for TIA to actually help ferret out terrorists, argued Gregory Newby, a professor at UNC-Chapel Hill’s School of Information and Library Science, who has worked on government national security contracts. TIA is based on counterproductive assumptions, he said, because the “noise” of so much data will bury the “signals” that matter most. Along the way, Newby said, “you’re going to get a lot of false positives”–and run the risk of rounding up the wrong people. “The stuff that doesn’t work is going to greatly outweigh the stuff that does work.”
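Newby’s warning about false positives is, at bottom, a base-rate problem, and a rough back-of-the-envelope sketch shows why. The figures below are purely hypothetical assumptions chosen for illustration (a scanned population of 300 million, 1,000 genuine suspects, and optimistic accuracy rates); they are not drawn from any TIA document.

```python
# Back-of-the-envelope illustration of the false-positive problem Newby describes.
# All figures here are hypothetical assumptions, not TIA specifications.

population = 300_000_000      # people whose records get scanned
true_suspects = 1_000         # assumed number of actual bad actors
detection_rate = 0.99         # assume the system flags 99% of real suspects
false_positive_rate = 0.001   # assume it wrongly flags 0.1% of innocent people

true_hits = true_suspects * detection_rate
false_hits = (population - true_suspects) * false_positive_rate

print(f"Real suspects correctly flagged: {true_hits:,.0f}")
print(f"Innocent people flagged: {false_hits:,.0f}")
print(f"Chance that a flagged person is a real suspect: "
      f"{true_hits / (true_hits + false_hits):.2%}")
```

Even under those generous accuracy assumptions, the sketch flags roughly 300 innocent people for every genuine suspect it catches, so fewer than one in every 300 flagged names would actually warrant attention: the “noise” burying the “signal,” just as Newby argues.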
But even the skeptics said that citizens should pay close attention to anti-terrorist initiatives, especially those that funnel personal information onto government hard drives to be sifted by the military and law enforcement. Boyle, who called himself “a reluctant paranoid,” said that given the tenor of the times and the aims of TIA, “a little paranoia might be well advised.”
The RTP chapter of Computer Professionals for Social Responsibility will post video clips of the surveillance forum at its Web site: www.rtp.nc.us.