Category Archives: cheap speech

“Meta says it won’t punish Trump for attacking the 2020 election results. But the 2024 vote is a different story”

Oliver Darcy of CNN confirms the point I raised in my Slate piece: Trump can say whatever he wants about the last election without sanction:

Nine minutes after Meta announced that it will allow Donald Trump back on its platforms, the disgraced ex-president was on his own Truth Social app posting about supposed election fraud in the 2020 election.

It’s nothing unusual for Trump. A research report published earlier this month by the watchdog group Accountable Tech found that Trump had written more than 200 posts containing “harmful election-related disinformation” since he was banished from Meta’s platforms.

But now, once again, Trump is Meta’s problem. The social media giant announced on Wednesday, unsurprisingly, that Trump will be permitted back on Facebook and Instagram, setting the stage for some thorny content moderation calls in the weeks, months, and years ahead.

And those content moderation calls are likely to be contentious.

For instance, a Meta spokesperson said Trump will be permitted to attack the results of the 2020 election without facing consequences from the company. However, the spokesperson said, if Trump were to cast doubt on an upcoming election — like the 2024 presidential race — the social giant would take action. In those cases, Meta might limit the distribution of the violative post or restrict access to advertising tools.

But attacks on the 2020 election will only serve to cast doubt on the integrity of future elections. And Meta will undoubtedly face scrutiny for its high-stakes decisions on the issue as Trump inevitably approaches the line.

“Trump’s Evolution in Social-Media Exile: More QAnon, More Extremes”

NYT:

In September, former President Donald J. Trump went on Truth Social, his social network, and shared an image of himself wearing a lapel pin in the form of the letter Q, along with a phrase closely associated with the QAnon conspiracy theory movement: “The storm is coming.”

In doing so, Mr. Trump ensured that the message — first posted by a QAnon-aligned account — would be hugely amplified, visible to his more than four million followers. He was also delivering what amounted to an unmistakable endorsement of the movement, which falsely and violently claims that leading Democrats are baby-eating devil worshipers.

Even as the parent company of Facebook and Instagram announced this past week that Mr. Trump would be reinstated — a move that followed the lifting of his ban from Twitter, though he has not yet returned — there is no sign that he has curtailed his behavior or stopped spreading the kinds of messages that got him exiled in the first place.

In fact, two years after he was banished from most mainstream social media sites for his role in inciting the Capitol riot, his online presence has grown only more extreme — even if it is far less visible to most Americans, who never use the relatively obscure platforms where he has been posting at a sometimes astonishing clip.

My New One at Slate: “Meta is Bringing Trump Back to Facebook. It Should Keep Him on a Short Leash to Protect Our Democracy.”

I have written this piece for Slate. A snippet:

Meta’s decision Wednesday to replatform former president Donald J. Trump on Facebook and Instagram is lamentable and ill-considered. The company’s own standards required his continued exclusion from the social media platform so long as he remains a “serious risk to public safety”—and he remains one. After all, it was Trump’s continuing election-denialist rhetoric that apparently led a MAGA-supporting New Mexico candidate last month to mastermind shootings into the homes of Democratic legislators. Millions of Americans continue to believe Trump’s false claim of a stolen 2020 election, and some have taken violent actions and made threats against election workers and others involved in the election process.

The replatforming decision was the latest misstep by a company that “did not even try” to grapple with the risks of election delegitimization in 2020, according to a leaked draft report from the House Select Committee investigating the Jan. 6 attack on the United States Capitol. But rather than wring our hands over mistakes Meta has made, we should focus instead on how the company can minimize the ongoing risk that Trump poses. The key thing that Meta can do now is escalate sanctions against him, such as demoting his content and blocking his expected campaign ads, if he continues to undermine the integrity of American elections….

While Meta has said that Trump will face “heightened penalties” if he breaks the platform’s rules such as by creating a risk of civil unrest, it should go further. Mark Zuckerberg should call Trump directly and warn him that he risks having his posts demoted or removed and his advertising limited if he glorifies or encourages violence, especially election-related violence. We know from the draft report that Zuckerberg has called Trump about specific posts before. Zuckerberg should be firm that sanctions will come if Trump posts anything that could be interpreted as even an implicit threat of violence, given that Trump likes to use innuendo to make his threats.

For example, Trump recently took to posting on the Truth Social platform (in which he has a partial ownership interest) to once again attack Georgia election worker Ruby Freeman. His earlier false claims against Freeman and her mother led to threats of violence against them and a climate of fear for election workers. Meta should not tolerate anything like these posts on Facebook or Instagram.

Second, Meta should demote posts from Trump that engage in election denialism. While Meta has said that it may demote content “that delegitimizes an upcoming election or is related to QAnon,” it does not appear to be willing to take action on what will likely be a core part of what he posts: delegitimation of the last election that will cause continuing harm to faith in our democracy and democratic institutions.

This is a key failing on Meta’s part. Rolling Stone reports that Trump is planning to make his return to major social media platforms with posts about “rigged elections.” Demotion means that the material would remain visible to people searching for it, but the company’s algorithms would be less likely to surface the posts in people’s feeds. As a private company, Meta has the right under the First Amendment to promote or demote content as it sees fit. And just as Musk, as owner of Twitter, can decide to replatform white supremacists and neo-Nazis (as he recently did), Meta can be a more responsible corporate citizen and decline to amplify election lies that threaten violence and undermine democratic institutions.

Third, Meta can renew its commitment to protecting free and fair elections in the United States and around the world. It can begin by beefing up the election integrity team that it partially dismantled after the 2020 elections. The draft report of the Jan. 6 committee describes the weakening of election protections that the company had in place in the past.

There is an urgent need for the restoration of strong election integrity measures. Whether Zuckerberg likes it or not, social media platforms are one of the main ways people communicate about politics and elections around the world. And that means they also become major vectors of election disinformation. That was true not only of the Jan. 6 attack on the Capitol, but also of the recent attack on government buildings in Brasilia, following the defeat of Trumpian candidate Jair Bolsonaro. As the Times’ Jack Nicas recently reported, the rioting was the result of social-media-fueled “mass delusion” focused on election denialism: “Mr. Bolsonaro’s supporters have been repeating the claims for months, and then built on them with new conspiracy theories passed along in group chats on WhatsApp and Telegram, many focused on the idea that the electronic voting machines’ software was manipulated to steal the election.”

“Meta to Reinstate Trump’s Facebook and Instagram Accounts”

NYT:

Just over two years after Donald J. Trump’s accounts were suspended from Facebook and Instagram, Meta, the owner of the platforms, said on Wednesday that it would reinstate the former president’s access to the social media services.

Mr. Trump, who had the most followed account on Facebook when he was barred, will in the coming weeks regain access to his accounts that collectively had hundreds of millions of followers, Meta said. In November, Mr. Trump’s account was also reinstated on Twitter, which had barred him since January 2021.

Meta suspended Mr. Trump from its platforms on Jan. 7, 2021, the day after hundreds of people stormed the Capitol in his name, saying his posts ran the risk of inciting more violence. Mr. Trump’s accounts on other mainstream social media services, including YouTube and Twitter, were also removed that week.

But Meta, which critics have accused of censoring Mr. Trump and other conservative voices, said on Wednesday it had decided to reverse the bans because it had determined that the risk to public safety had “sufficiently receded” since January 2021. The company also said it would add guardrails to “deter repeat offenses” in the future.

“The public should be able to hear what their politicians are saying — the good, the bad and the ugly — so that they can make informed choices at the ballot box,” said Nick Clegg, Meta’s president of global affairs. “But that does not mean there are no limits to what people can say on our platform. When there is a clear risk of real world harm — a deliberately high bar for Meta to intervene in public discourse — we act.”

You can find Facebook’s unfortunate decision at this link. I’ll have more to say about this very soon.

In the meantime, you can read my paper, “Donald Trump Should Remain Deplatformed from Facebook, Twitter, and YouTube Despite the High Bar That Platforms Should Apply to the Question of Deplatforming Political Figures,” and my earlier “Facebook and Twitter Could Let Trump Back Online. But He’s Still a Danger,” Washington Post, Mar. 9, 2022.

 In addition, Stanford and UCLA recently held a conference on just this topic. 

“Google to stop exempting campaign email from automated spam detection”

WaPo:

Google plans to discontinue a pilot program that allows political campaigns to evade its email spam filters, the latest round in the technology giant’s tussle with the GOP over online fundraising.

The company will let the program sunset at the end of January instead of prolonging it, Google’s lawyers said in a filing on Monday. The filing, in U.S. District Court for the Eastern District of California, asked the court to dismiss a complaint lodged by the Republican National Committee accusing Google of “throttling its email messages because of the RNC’s political affiliation and views.”

“The RNC is wrong,” Google argued in its motion. “Gmail’s spam filtering policies apply equally to emails from all senders, whether they are politically affiliated or not.”

The RNC complaint, filed last October, made clear that Google’s pilot program failed to allay GOP criticism of the company’s spam filters. That criticism mounted last summer amid the party’s disappointing online fundraising performance.

Federal District Court in Mackey Case Rejects First Amendment Defense to Criminal Prosecution for Spreading Twitter Memes Falsely Telling Hillary Clinton Supporters They Could Vote by Hashtag/Text

You can find the opinion here (h/t Eugene Volokh).

I write about this case in Cheap Speech. I think, and the district court agrees, that it does not violate the First Amendment to make it a crime to lie about when, where, or how people vote. The Supreme Court so indicated in dicta in the Mansky case. That kind of “false election speech” can be limited consistent with the First Amendment even though laws regulating “false campaign speech” (such as statements that “my opponent voted six times to raise taxes”) likely cannot be.

(I am less sure whether the existing statute under which Mackey is being prosecuted covers this conduct).

“Supreme Court Puts Off Considering State Laws Curbing Internet Platforms”

NYT:

The Supreme Court asked the Biden administration on Monday for its views on whether the Constitution allows Florida and Texas to prevent large social media companies from removing posts based on the views they express.

The practical effect of the move was to put off a decision on whether to hear two major First Amendment challenges to the states’ laws for at least several months. If the court ends up granting review, as seems likely, it will hear arguments no earlier than October and will probably not issue a decision until next year.

The two state laws, which are similar but not identical, were largely the product of conservative frustration. The laws’ supporters said the measures were needed to combat what they called Silicon Valley censorship. In particular, they objected to the decisions of some platforms to bar President Donald J. Trump after the attack on the Capitol on Jan. 6, 2021.

“Supreme Court Poised to Reconsider Key Tenets of Online Speech”

NYT:

For years, giant social networks like Facebook, Twitter and Instagram have operated under two crucial tenets.

The first is that the platforms have the power to decide what content to keep online and what to take down, free from government oversight. The second is that the websites cannot be held legally responsible for most of what their users post online, shielding the companies from lawsuits over libelous speech, extremist content and real-world harm linked to their platforms.

Now the Supreme Court is poised to reconsider those rules, potentially leading to the most significant reset of the doctrines governing online speech since U.S. officials and courts decided to apply few regulations to the web in the 1990s.

On Friday, the Supreme Court is expected to discuss whether to hear two cases that challenge laws in Texas and Florida barring online platforms from taking down certain political content. Next month, the court is scheduled to hear a case that questions Section 230, a 1996 statute that protects the platforms from liability for the content posted by their users.

The cases could eventually alter the hands-off legal position that the United States has largely taken toward online speech, potentially upending the businesses of TikTok, Twitter, Snap and Meta, which owns Facebook and Instagram.

“It’s a moment when everything might change,” said Daphne Keller, a former lawyer for Google who directs a program at Stanford University’s Cyber Policy Center.

The cases are part of a growing global battle over how to handle harmful speech online. In recent years, as Facebook and other sites attracted billions of users and became influential communications conduits, the power they wielded came under increasing scrutiny. Questions arose over how the social networks might have unduly affected elections, genocides, wars and political debates.

“Donald Trump prepares for his return to Facebook and Twitter”

NBC News:

Mounting a comeback for the White House, Donald Trump is looking to regain control over his powerful social media accounts.

With access to his Twitter account back, Trump’s campaign is formally petitioning Facebook’s parent company to unblock his account there after it was locked in response to the U.S. Capitol riot two years ago.

“We believe that the ban on President Trump’s account on Facebook has dramatically distorted and inhibited the public discourse,” Trump’s campaign wrote in its letter to Meta on Tuesday, according to a copy reviewed by NBC News.

Trump’s campaign didn’t threaten a lawsuit, as some sources close to Trump thought he would. It instead talked about the importance of free speech and petitioned Meta for a “meeting to discuss President Trump’s prompt reinstatement to the platform.”

A Meta spokesperson declined to comment about Trump beyond saying the company “will announce a decision in the coming weeks in line with the process we laid out.”

Facebook and Twitter banned Trump a day after a mob of his supporters — many of whom have admitted in federal court that they were whipped up by his lies of a stolen election — stormed the Capitol and interfered with Congress as it was counting the electoral votes to certify Joe Biden’s 2020 presidential victory.

Facebook ultimately decided to institute a limited ban on Trump that would come up for review after two years, starting Jan. 7 of this year.

Twitter planned a permanent ban, but new owner Elon Musk reinstated Trump’s account on Nov. 19 and then criticized the company’s previous leadership for the ban.

Trump, however, hasn’t yet tweeted.

“Google Didn’t Show Bias in Filtering Campaign-Ad Pitches, FEC Says”

WSJ:

The Federal Election Commission has dismissed a complaint from Republicans that Google’s Gmail app aided Democratic candidates by sending GOP fundraising emails to spam at a far higher rate than Democratic solicitations. 

The Republican National Committee and others contended that the alleged benefit amounted to unreported campaign contributions to Democrats. But in a letter to Google last week, the FEC said it “found no reason to believe” that Google made prohibited in-kind corporate contributions, and that any skewed results from its spam filter algorithms were inadvertent. 

“Google has credibly supported its claim that its spam filter is in place for commercial reasons and thus did not constitute a contribution” within the meaning of federal campaign laws, according to an FEC analysis reviewed by The Wall Street Journal. 

The RNC and other campaign committees argued that Google’s “overwhelmingly disproportionate suppression of Republican emails” constituted an illegal corporate contribution to Democratic candidates.  

But the FEC disagreed, finding that Google established that it maintains its spam filter settings to aid its business in keeping out malware, phishing attacks and scams, and not for the purpose of benefiting any political candidates. 

Today’s Must-Read: “What the Jan. 6 probe found out about social media, but didn’t report”

WaPo:

The Jan. 6 committee spent months gathering stunning new details on how social media companies failed to address the online extremism and calls for violence that preceded the Capitol riot.

The evidence they collected was written up in a 122-page memo that was circulated among the committee, according to a draft viewed by The Washington Post. But in the end, committee leaders declined to delve into those topics in detail in their final report, reluctant to dig into the roots of domestic extremism taking hold in the Republican Party beyond former president Donald Trump and concerned about the risks of a public battle with powerful tech companies, according to three people familiar with the matter who spoke on the condition of anonymity to discuss the panel’s sensitive deliberations.

Congressional investigators found evidence that tech platforms — especially Twitter — failed to heed their own employees’ warnings about violent rhetoric on their platforms and bent their rules to avoid penalizing conservatives, particularly then-president Trump, out of fear of reprisals. The draft report details how most platforms did not take “dramatic” steps to rein in extremist content until after the attack on the Capitol, despite clear red flags across the internet.

“The sum of this is that alt-tech, fringe, and mainstream platforms were exploited in tandem by right-wing activists to bring American democracy to the brink of ruin,” the staffers wrote in their memo. “These platforms enabled the mobilization of extremists on smaller sites and whipped up conservative grievance on larger, more mainstream ones.”

But little of the evidence supporting those findings surfaced during the public phase of the committee’s probe, including its 845-page report that focused almost exclusively on Trump’s actions that day and in the weeks just before.

That focus on Trump meant the report missed an opportunity to hold social media companies accountable for their actions, or lack thereof, even though the platforms had been the subject of intense scrutiny since Trump’s first presidential campaign in 2016, the people familiar with the matter said.

Confronting that evidence would have forced the committee to examine how conservative commentators helped amplify the Trump messaging that ultimately contributed to the Capitol attack, the people said — a course that some committee members considered both politically risky and inviting opposition from some of the world’s most powerful tech companies, two of the people said.

The whole thing is a must-read.
