After the Jan. 6 Capitol riot, Facebook parent Meta Platforms Inc. said it wanted to scale back how much political content it showed users. The company went further than almost anyone knew.
The results of the effort are gradually reshaping political discourse on the world’s biggest social-media platform, even though the company backed off the most aggressive approach: hitting the mute button on all recommendations of political content.
The company’s sometimes tortured efforts over the past 18 months to play down politics and other divisive topics on the platform are outlined in internal documents reviewed by The Wall Street Journal.
At first, Facebook overhauled how it promoted political and health-related content. With surveys showing users were tired of strife, the platform began favoring posts that users considered worth their time over ones that merely riled them up, the documents show. Debates were fine, but Facebook wouldn’t amplify them as much.
Meta’s leaders decided, however, that wasn’t enough. In late 2021, tired of endless claims about political bias and censorship, Chief Executive Mark Zuckerberg and Meta’s board pushed for the company to go beyond incremental adjustments, according to people familiar with the discussions. Presented with a range of options, Mr. Zuckerberg and the board chose the most drastic, instructing the company to demote posts on “sensitive” topics as much as possible in the newsfeed that greets users when they open the app—an initiative that hasn’t previously been reported.
The plan was in line with calls from some of the company’s harshest critics, who have alleged that Facebook is either politically biased or commercially motivated to amplify hate and controversy. For years, advertisers and investors have pressed the company to clean up its messy role in politics, according to people familiar with those discussions.
It became apparent, though, that the plan to mute politics would have unintended consequences, according to internal research and people familiar with the project.
The result was that views of content from what Facebook deems "high quality news publishers," such as Fox News and CNN, fell more than views of material from outlets users considered less trustworthy. User complaints about misinformation climbed, and charitable donations made through the company's Facebook fundraiser product fell in the first half of 2022. And perhaps most important, users didn't like it.