Dark posts

Siva Vaidhyanathan explains why that story about the bot ads on Facebook is so worrying.

On Wednesday, Facebook revealed that hundreds of Russia-based accounts had run anti-Hillary Clinton ads precisely aimed at Facebook users whose demographic profiles implied a vulnerability to political propaganda. It will take time to prove whether the account owners had any relationship with the Russian government, but one thing is clear: Facebook has contributed to, and profited from, the erosion of democratic norms in the United States and elsewhere.

The audacity of a hostile foreign power trying to influence American voters rightly troubles us. But it should trouble us more that Facebook makes such manipulation so easy, and renders political ads exempt from the basic accountability and transparency that healthy democracy demands.

The majority of the Facebook ads did not directly mention a presidential candidate, according to Alex Stamos, head of security at Facebook, but “appeared to focus on amplifying divisive social and political messages across the ideological spectrum — touching on topics from L.G.B.T. matters to race issues to immigration to gun rights.”

These “dark posts” are a kind of ad peculiar to Facebook, and not in a good way: narrowly targeted, shown only briefly, and then gone without a trace.

The service is popular among advertisers for its efficiency, effectiveness and responsiveness. Facebook gives rich and instant feedback to advertisers, allowing them to quickly tailor ads to improve outcomes or customize messages even more. There is nothing mysterious or untoward about the system itself, as long as it’s being used for commerce instead of politics. What’s alarming is that Facebook executives don’t seem to grasp, or appreciate, the difference.

A core principle in political advertising is transparency — political ads are supposed to be easily visible to everyone, and everyone is supposed to understand that they are political ads, and where they come from. And it’s expensive to run even one version of an ad in traditional outlets, let alone a dozen different versions. Moreover, in the case of federal campaigns in the United States, the 2002 McCain-Feingold campaign-finance act requires candidates to state they approve of an ad and thus take responsibility for its content.

None of that transparency matters to Facebook. Ads on the site meant for, say, 20- to 30-year-old home-owning Latino men in Northern Virginia would not be viewed by anyone else, and would run only briefly before vanishing. The potential for abuse is vast. An ad could falsely accuse a candidate of the worst malfeasance a day before Election Day, and the victim would have no way of even knowing it happened. Ads could stoke ethnic hatred and no one could prepare or respond before serious harm occurs.

And that’s the world we’re living in now.

Our best hopes sit in Brussels and London. European regulators have been watching Facebook and Google for years. They have taken strong actions against both companies for violating European consumer data protection standards and business competition laws. The British government is investigating the role Facebook and its use of citizens’ data played in the 2016 Brexit referendum and 2017 national elections.

We are in the midst of a worldwide, internet-based assault on democracy. Scholars at the Oxford Internet Institute have tracked armies of volunteers and bots as they move propaganda across Facebook and Twitter in efforts to undermine trust in democracy or to elect their preferred candidates in the Philippines, India, France, the Netherlands, Britain and elsewhere. We now know that agents in Russia are exploiting the powerful Facebook advertising system directly.

In the 21st-century social media information war, faith in democracy is the first casualty.

Siva’s writing a book about Facebook. We need it.
