Facebook Whistleblower Says Company Dropped Safeguards Against Political Misinformation In ‘60 Minutes’ Interview
A former Facebook product manager criticized the company’s decision to reduce safeguards against misinformation following the 2020 presidential election in a 60 Minutes interview Sunday evening, her first public appearance since leaking thousands of pages of internal documents to the Wall Street Journal.
Frances Haugen, 37, said the social network switched off several measures meant to tamp down on bad content after the election, allowing some participants in the Jan. 6 riot to plan the insurrection on Facebook. Her time at Facebook, she said, showed her that “there were conflicts with what was good for the public and what was good for Facebook.”
Haugen’s decision to turn whistleblower has catapulted Facebook into a fresh crisis, with the documents she provided painting a damning portrait of a company often ignoring its own cautionary internal research. The leaked material has shed light on a wide variety of issues at the company, from Instagram’s impact on teen mental health to Facebook’s algorithmic feed incentivizing anger and divisive content. Facebook has sought to aggressively dismiss the concerns, often arguing the leaked data is being misinterpreted or presents only a limited picture of reality.
Haugen worked for two and a half years as a product manager within Facebook’s Civic Integrity team, the unit tasked with monitoring and limiting misinformation on the site. Prior to Facebook, she held roles at Pinterest, Google and Yelp, according to her LinkedIn. “I’ve seen a bunch of social networks and it was substantially worse at Facebook than what I had seen before,” she said.
At Facebook, the Civic Integrity team played a key role in the company’s efforts to get through the presidential election relatively unscathed. But the group was formally dissolved after the election, with many of its members reassigned to similar roles on other Facebook teams. “When they got rid of Civic Integrity,” Haugen said, “that was the moment for me where I was like, ‘I don’t trust them to invest in what they need to in order to keep Facebook safe.’”
Facebook is unwilling to make significant changes to its News Feed algorithm, Haugen said, instead opting to prioritize posts that spark ill will because they increase time spent on the site. The content igniting this hostility has tended to be rooted in conspiracy theories or outright misinformation. “Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site. They’ll click on less ads, they’ll make less money,” she said.
Haugen will next appear at a congressional hearing on Tuesday, marking the second time Facebook has faced lawmakers in the past week. Antigone Davis, Facebook’s head of safety, testified before a Senate subcommittee on Thursday about children’s safety issues on the company’s apps.