“What Meta’s New Studies Do—and Don’t—Reveal About Social Media and Polarization”

Wired:

Last week, the first papers from a collaboration between Meta’s Facebook and a team of external researchers studying the 2020 election were finally published. Two of these studies asked: Are we trapped in filter bubbles, and are they tearing us apart? The results suggest that filter bubbles are at least somewhat real, but countering them algorithmically doesn’t seem to bring us any closer together.

Some are interpreting these results as proof that Facebook divides us. Others are claiming these experiments are a vindication of social media. It’s neither.

The first study tried to figure out whether we’re really in informational echo chambers, and if so, why. Unsurprisingly, the segregation in our information diets starts with who we follow. This mirrors offline life, where most people’s in-person social networks are highly segregated.

But what we actually see in our Feed is more politically homogeneous than what is posted by those we follow, suggesting that the Feed algorithm really does amplify the ideological leanings of our social networks.

There are even larger partisan differences in what we engage with, and Facebook, like pretty much every platform, tries to give people more of what they click, like, comment on, or share. In this case, it looks like the algorithm is sort of meeting human behavior halfway. The difference in our information diets is partly due to what we’ve chosen, and partly the result of using computers to guess—often correctly—what buttons we’ll click….

Many people will be looking to the current batch of experiments to either crucify or exonerate Facebook. That’s not what they do; this is bigger than Facebook, and these studies are early results in a new field. Meta should be commended for undertaking open research on these significant topics. Yet this is the culmination of work announced three years ago. In the face of layoffs and criticism, the appetite for open science on hard questions may be waning across the industry. I’m aware of at least one large research project Meta recently canceled, and the company said it “does not have plans to allow” another wave of election research in 2024. Many in the research community support a bill called PATA, the Platform Accountability and Transparency Act, which would give the National Science Foundation authority to vet and prioritize research projects that platforms would be obligated to support.

Simultaneously, the AI era is dawning, and our information ecosystem is about to get a lot weirder. We’re going to need a lot more open science on the frontiers of media, machines, and conflict.