Advertisers know what we want to buy. Google knows what we want to search. Will the state soon know—or think it knows—what we intend to do?
In “Predicting and Preventing Crimes—Is Minority Report the Next Step?” Jon L. Mills, a professor of law at the University of Florida, argues that the era of pre-crime surveillance may soon be upon us. He documents the dangerous forays already under way that could usher in a future reminiscent of the 2002 film Minority Report (based on the Philip K. Dick short story of the same name), in which clairvoyant oracles—“precogs”—dispatch police to stop potential criminals before they even have a chance to earn the label.
But Professor Mills—whose writing appears in the forthcoming essay collection, After Snowden: Privacy, Secrecy and Security in the Information Age—need not look that far over the horizon.
On May 12, a New York-based federal prosecutor asked the Second Circuit Court of Appeals to reinstate the conviction of Gilberto Valle, the so-called “Cannibal Cop.” Valle, a former New York police officer, had his name—or, rather, his newfound moniker—splashed across tabloids the world over when prosecutors alleged that he had concocted a plan to kidnap, sexually torture, murder, and eat several women. In 2013, a jury convicted him of conspiracy to commit kidnapping, based primarily on macabre Internet chats he shared with two other individuals he met on the website DarkFetishNet.
In a highly unusual move, the trial judge subsequently overturned the jury verdict, finding that the evidence didn’t support the conviction. In essence, the judge determined that the jury, swayed by the gruesome nature of Valle’s fantasies, had convicted him of thought crimes. Valle’s fate is now in the hands of the Second Circuit.
***
The case has become a centerpiece of online debate over free speech.
The prosecution alleged that his plans were real. The defense countered that he was merely a “death fetishist” who, together with like-minded individuals, was constructing fantastic virtual narratives peppered with real-life details to make them sound more genuine.
His chats, by all accounts, made the Marquis de Sade look like Mary Poppins. But did he cross the line from fantasy into criminal conduct?
The jury may have thought so, but the judge wasn’t buying it. To prevail, prosecutors would have had to prove that Valle took “overt acts” in furtherance of his conspiracy. Merely discussing the intended kidnappings with another party, even in detail, wouldn’t cut it if Valle lacked the basic materials to realize his dark visions. But prosecutors claimed that he had searched Google (“how to make chloroform”), had drawn up a blueprint for one of the kidnappings, and had surveilled at least one intended victim: a friend from college whom he had visited with his family.
***
Are Internet Searches “Overt Acts”?
Search engines use popular search terms to complete our thoughts for us. If an intriguing term pops up and we click it, are we culpable? Searches can even, at times, serve as expressive outlets: the defense noted that Valle’s wife, who discovered his chats using spyware and handed them over to authorities, had once searched the phrase “my husband doesn’t love me.”
A documentary on Valle’s travails, Thought Crimes: The Case of the Cannibal Cop, premiered this year at the Tribeca Film Festival. As one interviewee asserts in the film, Valle is “not just the cannibal cop. He’s patient zero in the Thought Police epidemic that can sweep the nation.”
With that in mind, read on to see what the real future may hold. And watch what you search.
WhoWhatWhy Introduction by Christian Stork
***
Excerpt from After Snowden: Privacy, Secrecy and Security in the Information Age, a collection of essays by various distinguished authors, edited by Ronald Goldfarb
The excerpt—which we have edited to meet space requirements—is from the essay, “The Future of Privacy in the Surveillance Age, VI. Predicting and Preventing Crimes—Is Minority Report the Next Step?” by Jon L. Mills, Dean Emeritus, Professor of Law, and Director of the Center for Governmental Responsibility at the University of Florida’s Fredric G. Levin College of Law.
In the wake of some of the tragedies documented on social media in recent years, officials discovered confession videos and Facebook entries that either predicted or threatened the tragedy to come: students promised to hurt others or themselves or, worse, cried out for help when none was forthcoming.
Sometimes the documentation simply suggests the pending tragedy by showing pieces of it coming together: a search for how to build a bomb coupled with a Google map result for street views of a city park. If the government has enough information to indicate an individual is going to bomb the Boston Marathon, should the authorities stop him before he does it?
“We have a system of pervasive, pre-criminal surveillance where the government wants to watch what you’re doing just to see what you’re up to, to see what you’re thinking, even behind closed doors,” said Edward Snowden in 2014.
Predicting future human behavior is increasingly valued, particularly as understanding consumer behavior becomes the holy grail of Internet search engines and marketers. The defense contractor Raytheon has developed a program that uses social networking data to track people’s movements and predict future behavior. The Rapid Information Overlay Technology (RIOT) uses GPS data embedded in photographs posted on Facebook, together with Foursquare check-ins, to determine where a person has been and where he or she will likely go in the future. Raytheon has shared its technology with the US government.
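The first ingredient of such tracking is mundane: many photos posted online still carry GPS coordinates in their EXIF metadata. RIOT’s internals are not public, but as a minimal sketch of that first step (assuming Python with the Pillow imaging library; the filename is hypothetical), pulling a photo’s coordinates takes only a few lines:

```python
# Minimal sketch: read GPS coordinates from a photo's EXIF metadata.
# Assumes the Pillow library (pip install Pillow); "vacation.jpg" is a
# hypothetical local file, not data from any real system.
from PIL import Image
from PIL.ExifTags import GPSTAGS

def gps_from_photo(path):
    exif = Image.open(path)._getexif() or {}
    gps_raw = exif.get(34853)  # 34853 is the EXIF tag ID for GPSInfo
    if not gps_raw:
        return None
    gps = {GPSTAGS.get(tag, tag): value for tag, value in gps_raw.items()}

    def to_degrees(dms, ref):
        # dms is (degrees, minutes, seconds); ref is 'N'/'S'/'E'/'W'
        deg = float(dms[0]) + float(dms[1]) / 60 + float(dms[2]) / 3600
        return -deg if ref in ("S", "W") else deg

    return (to_degrees(gps["GPSLatitude"], gps["GPSLatitudeRef"]),
            to_degrees(gps["GPSLongitude"], gps["GPSLongitudeRef"]))

print(gps_from_photo("vacation.jpg"))  # e.g. (40.7419, -73.9930)
```

Aggregate such points across months of posts and check-ins, and a movement profile, with predictable routines and likely next locations, falls out almost for free.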
Amazon recently gained a patent for anticipatory shipping (US Patent 8,615,473). The company is so sure about what we are going to buy that it plans to ship it to us before we order it. Why not use the same tools to predict and thereby prevent criminal behavior?
Of course, trying to punish individuals before they actually commit a crime is tricky business. Thankfully, we are not living in the sci-fi dystopia of Steven Spielberg’s film Minority Report, which portrayed a United States of 2054 in which a “precrime” police unit in Washington, D.C., stopped murders before they happened with the help of “precogs” and computers.
As long as the “precrime” system was 100 percent correct, the public supported it and there seemed to be no more murders. When the fictional system in Minority Report was found to be flawed, it collapsed. Using the name “Minority Report” is simply a reminder that perfection is elusive.
If predictive technology were reliably accurate, most people would likely support employing it to prevent crime. The real issue, then, is the degree of accuracy of the prediction.
The Department of Homeland Security developed Future Attribute Screening Technology (FAST) and tested it publicly in 2011. FAST uses sensors and video and audio recordings to assess the probability that an individual—not yet suspected of any crime—will commit a crime in the future.
These sensors and recordings evaluate an individual’s psychophysiological signals to determine malintent. This behavioral biometric data includes cardiovascular signals, pheromones, skin conductivity, eyeblink rate, and respiratory patterns.
The details of the field test have not been disclosed, but a 2011 Privacy Impact Assessment described limited operational tests in which volunteers underwent screenings (image alone, questions alone, or images and questions combined). While the DHS did not reveal the results of the public test, it has reported a success rate of 70 percent in its lab tests.
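A 70 percent success rate sounds impressive until it meets base rates: genuine would-be attackers are vanishingly rare in any screened population, so even a modestly inaccurate screener flags mostly innocents. A back-of-the-envelope calculation (the population figures below are illustrative assumptions, not DHS numbers) makes the point:

```python
# Back-of-the-envelope base-rate check for a 70%-accurate screener.
# All numbers are illustrative assumptions, not DHS figures.
population = 1_000_000   # people screened
true_threats = 10        # actual would-be attackers among them
sensitivity = 0.70       # P(flagged | threat)
specificity = 0.70       # P(cleared | innocent) -- assumed symmetric

true_positives = true_threats * sensitivity                         # 7
false_positives = (population - true_threats) * (1 - specificity)   # ~300,000

ppv = true_positives / (true_positives + false_positives)
print(f"People flagged: {true_positives + false_positives:,.0f}")
print(f"Chance a flagged person is a real threat: {ppv:.3%}")
# => about 0.002%: nearly everyone the system flags is innocent.
```

At that scale, virtually everyone pulled aside would be innocent, which is why the “degree of accuracy” question is not a footnote to these programs but the whole debate.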
Although the Snowden disclosures did not focus on any specific predictive preventive programs, another NSA employee revealed the NSA program AQUAINT (Advanced Question Answering for Intelligence) in 2009. This system uses data shared on the Internet (and likely already collected by the NSA) to answer predictive questions about future events.
In addition, the Center for Advanced Study of Language (CASL) has initiated a program that seeks to determine whether a person is lying by studying his behavior and listening to him speak. CASL is a national security research lab accessible to the NSA.
We are now aware that bulk data collection was used to construct search warrants. The existence of so much data, coupled with enhanced predictive coding, could turn the basis for a search warrant into the basis for an arrest warrant.
In a pre-crime analysis, if surveillance of a suspected terrorist operative indicated that he had trained for bomb making, acquired bomb materials, was diagnosed as psychotic, had threatened to bomb a stadium, and had two tickets to a football game for the next day, he would likely be detained.
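No agency has published the decision rule behind such an analysis. As a purely hypothetical sketch, it could reduce to a weighted checklist of surveillance-derived indicators; every indicator name, weight, and threshold below is invented for illustration:

```python
# Hypothetical pre-crime checklist. Indicator names, weights, and the
# threshold are invented for illustration; no real system is documented
# to work this way.
INDICATORS = {
    "bomb_training":       4,
    "acquired_materials":  4,
    "psychotic_diagnosis": 2,
    "explicit_threat":     3,
    "event_tickets":       2,
}
DETAIN_THRESHOLD = 10

def risk_score(observed):
    """Sum the weights of the indicators surveillance has flagged."""
    return sum(w for name, w in INDICATORS.items() if name in observed)

suspect = {"bomb_training", "acquired_materials",
           "psychotic_diagnosis", "explicit_threat", "event_tickets"}
score = risk_score(suspect)
print(score, "detain" if score >= DETAIN_THRESHOLD else "monitor")  # 15 detain
```

The arithmetic is trivial; the hard questions are which indicators count, who sets the weights, and what error rate society will tolerate, the same accuracy problem that FAST’s 70 percent figure raises.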
By the time of the Boston Marathon bombing, Boston had a comprehensive CCTV network of some 500 cameras mounted throughout the city. Omnipresent surveillance did not deter Dzhokhar and Tamerlan Tsarnaev from detonating their homemade bombs near the finish line. In the hours following the bombing, local, state, and national intelligence and security officials combed through hundreds of hours of footage captured by CCTV cameras near the bombing site.
This terrorist attack was the first committed in a place with comprehensive surveillance and with the tools to search suspects’ Facebook pages, Twitter feeds, geolocation posts, and blog posts. In addition, thousands of Bostonians and others across the world viewed images and contributed resources through crowdsourcing. Yet the first suspects identified were the wrong people.
A combination of media coverage, technology, and crowdsourcing contributed to the wrongful identification of these individuals as the perpetrators, and the media then published their pictures. Ultimately the real terrorists were found and identified.
The Snowden disclosures have opened the programs described above to serious scrutiny. Some of the programs are plainly intrusive and should motivate us to analyze rational approaches to reform that protect both privacy and national security. They also challenge us to consider whether the overzealous extension of modern technology might deeply harm our personal freedoms.