WHEN YOUNG MIE Kim began studying political ads on Facebook in August of 2016—while Hillary Clinton was still leading the polls—few people had ever heard of the Russian propaganda group known as the Internet Research Agency. Not even Facebook itself understood how the group was manipulating the platform’s users to influence the election. For Kim, a professor of journalism at the University of Wisconsin-Madison, the goal was to document the way the usual dark-money groups target divisive election ads online, the kind that would be more strictly regulated if they appeared on TV. She didn’t know then that she was walking into a crime scene.
Over the last year and a half, mounting revelations about Russian trolls’ influence campaign on Facebook have dramatically altered the scope and focus of Kim’s work. In the course of her six-week study in 2016, Kim collected mounds of evidence about how the IRA and other suspicious groups sought to divide and target the US electorate in the days leading up to the election. Now, Kim is detailing those findings in a peer-reviewed paper published in the journal Political Communication. The researchers couldn’t find any trace, in federal records or online, of half of the 228 groups they tracked that purchased Facebook ads about controversial political issues in that six-week stretch. Of those so-called “suspicious” advertisers, one in six turned out to be associated with the Internet Research Agency, according to the list of accounts Facebook eventually provided to Congress. What’s more, the paper shows these suspicious advertisers predominantly targeted voters in swing states like Wisconsin and Pennsylvania.
“I was shocked,” says Kim, now a scholar in residence at the Campaign Legal Center, of the findings. “I sort of expected these dark money groups and other unknown actors would be on digital platforms, but the extent to which these unknown actors were running campaigns was a lot worse than I thought.”
Of the 228 groups that purchased ads about hot-button political issues in the weeks before the 2016 election, 122—more than half—were identified by Kim and her team as “suspicious,” meaning there was no publicly available information about these sponsors of Facebook ads. In this research, suspicious groups are unidentifiable, untrackable groups with no public footprint. Kim and her team classified a group as suspicious if no information about it could be found elsewhere, even after reviewing Federal Election Commission records, IRS databases, and other research databases.
A quarter of the ads the researchers examined mentioned candidates and would have been subject to disclosure requirements if aired on TV, but they escaped those transparency measures because they ran online.
This secrecy would not be possible on broadcast television. While social media companies have proposed new transparency measures of their own, the Honest Ads Act would solidify disclosure requirements by bringing the law into the 21st century. The bipartisan legislation aims to ensure that digital political ads are subject to the same transparency requirements that apply to similar ads run on any other medium. The bill would shine a spotlight on some of the digital advertising practices outlined in the Project DATA study by creating a public footprint.