Meta Admits It Erred in Leaving Up Content on Facebook that Encouraged Violent Protests Following Brazil’s Election; Oversight Board Calls for Public Comment

This seems to be an important development:

On January 3, 2023, two days after Luiz Inácio Lula da Silva had been sworn in as Brazil’s president, a Facebook user posted a video with a caption in Portuguese. The caption includes a call to “besiege” Brazil’s congress as “the last alternative.” The video shows part of a speech given by a prominent Brazilian general and supporter of Lula’s electoral opponent, in which he calls for people to “hit the streets” and “go to the National Congress… [and the] Supreme Court.” A sequence of images follows the general’s speech, including one of a fire raging in the Three Powers Plaza in Brasília, which houses Brazil’s presidential offices, Congress, and Supreme Court. Text overlaying the image reads, “Come to Brasília! Let’s Storm it! Let’s besiege the three powers.” Text overlaying another image reads “we demand the source code,” a slogan that protestors have used to question the reliability of Brazil’s electronic voting machines. The video was played over 18,000 times, was not shared, and was reported seven times.   

Mr. Lula da Silva’s swearing-in had been accompanied by civil unrest, including protests and roadblocks. On January 8, more than a thousand supporters of former president Jair Bolsonaro broke into the National Congress, Supreme Court, and presidential offices, intimidating the police and destroying property. Meta had designated Brazil a temporary high-risk location ahead of the country’s October 2022 general election and, as a consequence, has been removing content “calling for people to take up arms or forcibly invade … federal buildings.” Meta did not announce the designation until January 9.

On the same day the content was posted, a user reported it for violating Meta’s Violence and Incitement Community Standard, which prohibits calls to “forcibly enter locations … where there are temporary signals of a heightened risk of violence or offline harm.” In total, four users reported the content seven times between January 3 and January 4. Following the first report, a human reviewer found that the content did not violate Meta’s policies. The reporting user appealed, but a second human reviewer upheld the decision. The next day, five different moderators reviewed the remaining six reports and reached the same conclusion. The content was never escalated to policy or subject matter experts for additional review.

One of the users who had reported the content appealed Meta’s decision to the Oversight Board. In their appeal to the Board, they link the content’s potential to incite violence to the movement of people in Brazil “who do not accept the results of elections.”  

The Board selected this case to examine how Meta moderates election-related content, and how it is applying its Crisis Policy Protocol in a designated “temporary high-risk location.” Meta developed the Protocol in response to the Board’s recommendation in the “Former President Trump’s suspension” case. This case falls within the Board’s “Elections and civic space” priority.  

As a result of the Board selecting this case, Meta determined that its repeated decisions to leave the content on Facebook were in error. Because at-scale reviewers do not record the reasons for their decisions, the company has no further information about why they repeatedly found that the content did not violate its policies. On January 20, 2023, Meta removed the content, issued a strike against the content creator’s account, and applied a feature limit preventing the account from creating new content.

The Board would appreciate public comments that address: 

  • The political situation in Brazil ahead of the October 2022 election, and how it shifted between the election and January 8, 2023. 
  • The relationship between political violence, election denialism, and calls for offline mobilization on social media. 
  • When Meta’s election integrity efforts should begin and end, and what criteria should guide decisions about those timeframes, particularly as they relate to transitions of power.  
  • How Meta should distinguish between legitimate political organizing and harmful coordinated action. 
  • How Meta should treat content attacking or delegitimizing democratic institutions and processes.  

In its decisions, the Board can issue policy recommendations to Meta. While these recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations relevant to this case.
