A Facebook psychology researcher who previously helped harvest millions of Facebook users’ profiles for the controversial Trump campaign contractor Cambridge Analytica has left the tech giant.
In December 2015, when Facebook says it first learned that a pair of Cambridge University researchers had sold the harvested user data to the election firm, the company began an investigation and demanded the data be deleted. Around the same time, it also hired Joseph Chancellor, one of the two researchers.
On Wednesday, a Facebook spokesperson declined to explain when or why Chancellor had left the company, or to detail the results of any investigation into his work. “I can confirm that Joseph Chancellor is no longer employed by Facebook, and we wish him all the best,” she wrote in an email. Chancellor, who the company said worked as a quantitative researcher on its User Experience Research team, did not respond to a request for comment.
Facebook has said little in response to lawmakers’ questions about Chancellor, who also went unmentioned on Wednesday when senators questioned Facebook COO Sheryl Sandberg at a hearing on Capitol Hill. The company has not publicly held any employee accountable for the Cambridge Analytica episode, but after it became front-page news, Facebook put Chancellor on leave while it investigated his role, BuzzFeed reported in May. As recently as June, the company said it was still “investigating Mr. Chancellor’s prior work,” according to written responses to senators’ questions.
On Sunday, 60 Minutes first reported that Facebook “no longer employs Chancellor,” in an addendum to an April story about his senior collaborator Aleksander Kogan, the Cambridge University lecturer who has become a focus of the controversy. Kogan and Cambridge Analytica were suspended from Facebook for violating its terms of service in March, when a whistleblower revealed that the Facebook data had, in fact, not been deleted. Chancellor, however, still retains a Facebook account and, as of Wednesday, a company webpage. By Thursday afternoon, that page was no longer active. (The New Yorker also confirmed Chancellor’s departure on Thursday.)
Kogan, who himself has previously collaborated with Facebook on research, officially led the data harvest. But he said he “did everything with” Chancellor. The two were co-founders and equal co-owners of Global Science Research, or GSR, the company that Cambridge Analytica hired to gather the user data and analyze it for psychological traits.
The exact nature of Chancellor’s work at Facebook is not clear. The company employs many in-house social scientists to better understand the psychology of its users, drawing on an unprecedented amount of data. When he was hired at Facebook Research in November 2015, according to a now-deleted LinkedIn profile, Chancellor had been researching happiness and so-called pro-social behavior. At some point, he was assigned to virtual reality research, a hot topic for the Oculus-owning tech giant.
In April 2017, The Intercept was the first to report on Chancellor’s role at Facebook. At the time, the company said in a statement: “The work that he did previously has no bearing on the work that he does at Facebook.” As for Cambridge Analytica, Facebook told The Intercept then that it believed the harvested data had been deleted, and that an ongoing investigation had “not uncovered anything that suggests wrongdoing.”
Facebook officials have also not made clear when the company became aware of Chancellor’s role at GSR or that he had violated its terms of service. Nor has the company said what it may have learned from him about the data harvest, either before or after the scandal erupted. Correspondence between Kogan and Facebook in 2014 that has been released to the British Parliament does not mention Chancellor. But in an interview with BuzzFeed, Kogan said that Chancellor had informed the company of his role at GSR in 2015 while he was being interviewed for the Facebook position.
During testimony in April before a parliamentary committee investigating fake news and data in elections, Facebook’s chief technology officer Mike Schroepfer initially said Facebook had only learned of Chancellor’s role in 2017. Later in the hearing, he revised his answer. “In the recruiting process, people hiring him probably saw a CV and may have known he was part of GSR,” said Schroepfer. “Is it possible that someone knew about this and the right other people in the organization didn’t know about it? That is possible.”
Damian Collins, the chair of the committee, noted the difficulty of Facebook’s position when he questioned Kogan that month.
“When Facebook’s response from their deputy general counsel described your work as ‘a scam and a fraud’ … and they singled you out to say that ‘you’d lied to us and violated our platform policies,’ those remarks must apply to Joseph Chancellor as well,” Collins said.
When asked about the matter in May, a spokesperson for Facebook said that the company didn’t know about Chancellor’s role at GSR at the time he was hired.
“Joseph works in our VR team, and there are probably dozens of other Cambridge University grads that work here at Facebook,” the spokesperson said in a phone call. “When we found out about Kogan’s app back in 2015, and the fact that they might be sharing data with Cambridge Analytica, we didn’t know about Chancellor’s involvement, because Kogan identified himself as the sole developer of the app itself, and didn’t disclose [Chancellor’s] involvement.”
Kogan, who also goes by the last name Spectre, did not respond to requests for comment. He signed a non-disclosure agreement with Facebook in 2016 related to the issue, but Facebook has told lawmakers it has since released him from that agreement. Facebook said in May that Chancellor hadn’t signed an NDA connected to the data harvest, but since he joined Facebook, he has been bound by an employee confidentiality agreement.
While at Cambridge University, where Chancellor worked as a postdoctoral researcher in Kogan’s psychology lab, the two formed Global Science Research. They built and distributed Facebook apps that offered users personality predictions, based on pioneering research using Facebook “likes” at the university’s Psychometrics Center. In exchange, users turned over their personal data and more limited information about their friends, including their “likes” and in some cases their private messages, if their settings allowed it.
Just weeks after founding the company in May 2014, the researchers accepted an $800,000 contract from Cambridge Analytica to cover costs related to the apps, including paying users $1 to $2 to install them, and to produce data on most American Facebook users along with their psychometric scores. They managed to amass information from more than 87 million profiles, even though less than 1% of those people had installed the apps to begin with.
Chancellor and Kogan were equal partners at GSR, but Chancellor resigned his directorship in September 2015, company records indicate. Chancellor began his job at Facebook that November, and The Guardian first reported on the data harvest the following month, which is when Facebook says it first learned of it.
Concerns about GSR’s work first emerged in 2014 at the university. Other researchers there had made a separate tranche of Facebook data available to Kogan and Chancellor “explicitly for academic research purposes only.” But Michal Kosinski, who was then deputy director of the university’s Psychometrics Center, told Fast Company last November that he couldn’t be sure that the center’s own data hadn’t been improperly used by GSR.
“Alex and Joe collected their own data,” Kosinski wrote in an email. “It is possible that they stole our data, but they also spent several hundred thousand [recruiting participants on Amazon’s] Mechanical Turk and data providers—enough to collect much more than what is available in our sample.”
Kogan, whose contract with Cambridge University expires this month, has insisted no academic data was used by GSR. He has also complained of having been “scapegoated” by Facebook, which he said knew what he and Chancellor were doing at the time and had effectively permitted “thousands” of other apps to similarly harvest data.
A university spokesperson said it continues to investigate the issue and has contacted Facebook requesting “all relevant evidence in their possession.”
Cambridge Analytica, with backing from the billionaire Mercer family and Steve Bannon, helped run the Trump campaign’s social media effort alongside representatives from Facebook, so-called campaign “embeds.” SCL Group, Cambridge Analytica’s parent company, has said the Facebook data wasn’t used during its work for Trump, and some analysts have said the company’s analysis was largely useless. But its role in elections is now the subject of multiple investigations, and amid the scrutiny, executives have said they are closing the firm.
Facebook, which also faces investigations by the Federal Trade Commission and the Securities and Exchange Commission, has since overhauled the systems and policies that allowed Chancellor and Kogan to amass torrents of data, and it has also sought to stem false news, discriminatory ad targeting, and hateful content. But these problems, and Facebook’s subsequent responses, help illustrate some of the risks that users will continue to face online, and the apparent difficulty internet giants have in preventing abuse on their platforms.
Chancellor’s fate is now clearer, if not his precise role, but lawmakers, whose patience and technological understanding have already been stretched thin, have many other questions for the social network. As they face Europe’s General Data Protection Regulation and a new privacy law in California, Facebook and other tech companies are starting to push for a federal law that better meets their wishes. That effort is likely to be shaped by how Facebook answers those lingering questions, even ones as seemingly minuscule as those about a single researcher.
This story has been updated to reflect Facebook’s removal on Thursday of Chancellor’s company homepage, to mention the New Yorker’s report, and to specify Facebook’s minimal answers to lawmakers’ questions about him.