“Facebook hides data showing it harms users. Outside scholars need access.”

Nate Persily in WaPo opinion:

The disclosures made by whistleblower Frances Haugen about Facebook — first to the Wall Street Journal and then to “60 Minutes” — ought to be the stuff of shareholders’ nightmares: When she left Facebook, she took with her documents showing, for example, that Facebook knew Instagram was making girls’ body-image issues worse, that internal investigators knew a Mexican drug cartel was using the platform to recruit hit men, and that the company misled its own oversight board about having a separate content appeals process for a large number of influential users. (Haugen is scheduled to appear before a congressional panel on Tuesday.)

Facebook, however, may be too big for the revelations to hurt its market position — a sign that it may be long past time for the government to step in and regulate the social media company. But in order for policymakers to effectively regulate Facebook — as well as Google, Twitter, TikTok and other Internet companies — they need to understand what is actually happening on the platforms.

Whether the problem is disinformation, hate speech, teenagers’ depression or content that encourages violent insurrection, governments cannot institute sound policies if they do not know the character and scale of these problems. Unfortunately, only the platforms have access to the relevant data, and as the newest revelations suggest, they have strong incentives not to make their internal research available to the public. Independent research on how people use social media platforms is clearly essential.

After years of frustration — frustration also felt by many Facebook employees trying to do the right thing — I resigned last year as co-chair of an outside effort to try to get the company to share more data with researchers. Facebook’s claims of privacy dangers and fears about another Cambridge Analytica scandal significantly hindered our efforts. (A researcher at data firm Cambridge Analytica violated users’ privacy, prompting an investigation by the federal government into Facebook’s data-protection practices that led to a $5 billion fine.)

When Facebook did finally give researchers access to data, the data turned out to contain significant errors — a problem discovered only after researchers had spent hundreds of hours analyzing it, and in some cases had already published their findings (about, for example, how disinformation spreads).

So we are now at a standstill, where the public does not trust Facebook on research and data that it releases, and Facebook says existing law (including the Cambridge Analytica settlement) prevents it from sharing useful data with outside researchers. Congress has the ability to solve this problem by passing a law granting scholars from outside the social media companies access to the information held by them — while protecting user privacy. (I have drafted text for a law along these lines, which I call the “Platform Transparency and Accountability Act.”)…
