Last week, five activists in Vermont sued U.S. Immigration and Customs Enforcement (ICE) for violating their First Amendment rights, alleging that the agency used techniques typically reserved for disrupting organized crime networks to track their movements for years.
Despite a Department of Homeland Security policy that supposedly prioritized enforcement against immigrants with serious criminal records, the plaintiffs allege that ICE targeted them for deportation and surveilled their farmworker rights group, Migrant Justice, planting an informant in the group, attempting to hack into the email accounts of its members, and compiling detailed dossiers on their activities and social circles.
ICE’s actions “chilled and continue to chill Migrant Justice’s ability to organize and advocate for migrant workers,” say court papers, which were filed in a Vermont federal court by the state’s ACLU chapter and other groups.
To conduct its surveillance, the plaintiffs say ICE relied on a growing trend in law enforcement that has privacy and immigration experts worried: social media data mining and other high-tech surveillance tools that make it easier to pry into individuals’ personal lives. Built by private contractors with minimal transparency, millions of dollars’ worth of sophisticated software and databases are being used to search for individuals who are suspected of neither terrorism nor violent crimes.
“Governments have long outsourced their dirty work to companies,” says Sara Nelson, a communications officer for Privacy International, a Britain-based watchdog group that has tracked the growth of companies that supply vast amounts of data and analyses to clients—government agencies, corporate brands, or both. “ICE couldn’t be effective without the slew of companies propping the agency up with data, software, and infrastructure,” Nelson told me.
In August, amid outrage over the Trump administration’s so-called “zero tolerance” policy toward immigrants, including the policy of separating children from their parents at the border, Privacy International released a report detailing the various ways that companies contract with ICE to deliver unprecedented streams of data. This includes people’s likely locations, car registrations, and social media activity—data that “can be used by the agency and others to identify and track people and their families, including for deportation.”
The report highlighted ICE contracts with five companies that provide software and “invasive and sweeping data about people”: Giant Oak, Palantir, T-Rex Consulting Corporation, and two subsidiaries of Thomson Reuters. Contracts held by Palantir and others were the focus of a separate report issued last month by immigrant rights groups. The report, titled “Who’s Behind ICE?: The Tech and Data Companies Fueling Deportations,” also examines the role of Amazon Web Services, which provides cloud infrastructure for many defense- and security-related purposes.
Jacinta Gonzalez, the field organizer for the immigrant advocacy group Mijente, one of the report’s sponsors, told Fast Company’s Sean Captain that the amount of data collected by ICE was arresting. “People on the ground have been more and more [saying to us], ‘How do they have information about my taxes? How do they have information about where I drive my car?'”
Sherry Lauren Forbes, a former social scientist at Giant Oak, says ICE’s use of data analysis software can enhance public safety, helping make lawful intelligence activities more efficient and effective.
“Law enforcement case officers are extremely burdened and overworked by the amount of information that they have to sift through,” she told me in a phone call. “Politics aside, there are dangerous people in this country. ICE needs to find them, particularly their CTCEU [Counterterrorism and Criminal Exploitation Unit], a division tasked with finding these people. It’s really tough. They hide. They’re often not in the public records, in the domains where you would normally think, but maybe they’re on the internet somewhere.”
Previously, when cases focused on specific individuals would come across an ICE investigator’s desk, the investigator might perform some keyword searches on Google, sift through many unrelated results, then cut and paste those results into a Word document, she says. They would then forward that document to a law enforcement officer, who would try to physically locate the individuals.
“Often at that point they were gone, you couldn’t find them,” says Forbes, who currently works in machine intelligence at Cambridge, Massachusetts-based Draper. “So, a lot of times the companies offering services to ICE are trying to be able to let them do this much faster, and find these dangerous people.”
Forbes emphasized that ICE is using Giant Oak’s software for specific cases, not to scour the web and databases for every single person suspected of being in the U.S. illegally. Still, she says, Privacy International’s report highlighted the need for a public debate about the technology and the data itself, and who does what with it.
“I’ve had these conversations with my peers at many different companies and government agencies. I can tell you from my perspective that people are very concerned about the ethical implications,” she says. “People are very aware of it.”
GOST in the machine
Giant Oak, which is based in Arlington, Virginia, sells ICE access to a software platform that seeks to find “the people behind the data.” Called Giant Oak Search Technology, or GOST (pronounced “ghost”), the software trawls through a broad array of data sources to find information about persons of interest. The company says on its website that the system is used by governments and financial institutions to combat “global terrorism, transnational criminal organizations, human trafficking, [and] money laundering,” and to help brands monitor the web for “negative media.”
The firm has also been working with ICE at least since 2014, on contracts worth nearly $45 million, Privacy International reported. In September 2017, the same month that the Department of Homeland Security, ICE’s parent agency, articulated a policy of collecting and studying the social media data on all immigrants, Giant Oak won nearly $3 million in ICE contracts.
The ICE projects weren’t a complete novelty: Gary Shiffman, Giant Oak’s CEO, told a reporter last year that the company had assisted ICE in conducting social media vetting for years.
Privacy International says the company declined to answer its questions about how ICE is using the company’s software. A Giant Oak spokesperson declined to comment on the Privacy International report, and Shiffman was unavailable for an interview.
In October 2016, Shiffman explained to the website Nextgov that ICE’s Homeland Security Investigations unit uses GOST to comb through social media sites, government databases, and public indices to determine which individuals are visa violators—that is, those who have overstayed the time period laid out on their visa. While these individuals may be violating U.S. immigration law, they needn’t be violent criminals or terrorists for GOST to identify them. Targets could very well be people who are working in the U.S. and who have families, from whom they could be separated at detention centers.
Formed in 2013, the company has close ties to the government. Shiffman, a Navy veteran who has taught at Georgetown University, is a former chief of staff at U.S. Customs and Border Protection (CBP), ICE’s sister agency. Giant Oak’s software also traces its origins to work that Shiffman did with DARPA, including Nexus 7, a controversial big data initiative designed to produce intelligence on Afghanistan’s population and its culture, as Wired reported in 2011.
To Nelson of Privacy International, the company’s software poses nearly unavoidable risks to individuals’ digital lives, whatever the immigration policy.
“Giant Oak’s business model is a threat to people’s privacy because it exploits social media companies’ default public settings and people’s comfort sharing personal data openly online,” she told me. “Social media companies have so far failed to implement basic privacy good practices such as by default making all content private, rather than public.”
How Palantir and Thomson Reuters work with ICE
Palantir, whose chairman, Peter Thiel, was a Trump campaign supporter and is a member of Facebook’s board, has secured at least $1 billion in federal contracts since 2009, and supplies ICE’s $41 million investigative case management system. As The Intercept reported in 2017, the ICM gives ICE agents access to a “vast ‘ecosystem’ of data to facilitate immigration officials in both discovering targets and then creating and administering cases against them.”
Palantir’s 2014 ICM proposal described a system originally intended for use by the Homeland Security Investigations directorate. Two years later, a Privacy Impact Assessment by DHS noted that personnel within ICE Enforcement and Removal Operations use ICM “to manage immigration cases that are presented for criminal prosecution,” but also “to query the system for information that supports its civil immigration enforcement cases.” The company declined to speak on the record about its contracts with ICE and other government agencies.
In a FOIA lawsuit filed against ICE last year, the nonprofit Electronic Privacy Information Center warned that both the ICM and another piece of Palantir software, FALCON, “pose significant threats to privacy.”
“Both systems collect a significant amount of personal information,” EPIC says, “both are exempt from many of the protections of the Privacy Act, and both disseminate personal data broadly among other government and law enforcement agencies—any of which could use the information as a reason to subject an individual for scrutiny.”
This summer, a group of 450 Amazon employees asked CEO Jeff Bezos to stop working with Palantir, according to a letter published recently on Medium by an anonymous Amazon employee. The open letter called for the company to also stop selling its face recognition system to law enforcement agencies. Still, according to a report this month in the Daily Beast, Amazon pitched face recognition to ICE as recently as June.
Two subsidiaries of Thomson Reuters are involved in ICE’s data gathering apparatus. While Reuters reporters have covered the harsh impact of Trump immigration policies, Privacy International notes that its sibling company Thomson Reuters Special Services has made more than $16 million in its work for ICE.
One February 2018 contract worth $6.7 million calls for “a continuous monitoring and alert service that provides real-time jail booking data to support the identification and location of aliens.” As Reveal from the Center for Investigative Reporting reported last year, the system ICE sought would be capable of keeping 500,000 unauthorized immigrants under continuous surveillance, with data including their phone numbers, places of employment, insurance claims, and payday loans.
Another Thomson Reuters subsidiary, West Publishing, has made tens of millions of dollars selling software to ICE, including the Westlaw PeopleMap and a system called Consolidated Lead Evaluation and Reporting, or CLEAR, which allows ICE to access a “vast collection of public and proprietary records.” According to Privacy International, this includes phone records, consumer and credit bureau data, healthcare provider content, utilities data, DMV records, World-Check listings (a metric from Thomson Reuters’ risk intelligence database), business data, data from social networks and chatrooms, and “live access” to more than 7 billion license plate detections.
Stephen Rubley, the CEO of Thomson Reuters Special Services, defended the company’s work with ICE in response to a June letter from Privacy International. He wrote that the company “support[s] the rule of law” and ensures its customers have “specific legally permissible uses prior to being granted access to any data.” TRSS’s products, he says, “are not used by the Border Patrol Division for purposes of patrolling the border for undocumented immigrants or their detainment.”
Rubley—who also sits on the board of the ICE Foundation, one of many industry executives with ties to the ICE-connected charity, as Sludge previously reported—did not address West Publishing’s contracts with the agency.
“A new age of information availability”
Forbes, the former Giant Oak researcher, says that many scientists in the industry are aware of the risks of big data. Adding to privacy and security concerns surrounding personal data is a growing reliance on machine learning algorithms to mine and draw conclusions from it. A growing body of research has shown that biases in AI can lead to unfair outcomes, especially for people of color. “This is what social scientists and computer scientists are actively trying to do—we are working to make these kinds of projects better,” says Forbes.
Meanwhile, she says, policy makers and the public at large will need to reckon with larger questions about privacy, and how the technology is deployed.
“[W]e’re entering a new age of information availability, and we have yet to agree as a society what kind of data we want to consider private, and what kind of data we want to consider as up for grabs, what kind of data we feel comfortable having automatic decisions or scores calculated for us with, etc. This is where there needs to be debate; and if we as a society decide this issue needs to be regulated, there needs to be more discussions on how best to regulate it.”
For now, public discussion remains difficult, privacy experts say, without stronger transparency. Nondisclosure agreements prevent government agencies from sharing information about a tech company’s proprietary technology and trade secrets, while government-side NDAs prevent companies from talking about non-public, sensitive, or classified information they are privy to through contracts.
Given the secrecy surrounding the use of software and big data, Nelson of Privacy International believes that the companies involved should bear greater responsibility for how their technologies are designed and to whom they’re sold.
“It is not enough to say that companies will build things, and politicians deal with the consequences,” she says. “Companies that develop and sell these types of technologies have a responsibility to build and sell technologies responsibly, and to think about the consequences of their tech.”