‘Do You Accept Rubles?’ Facebook Admits 10 Million People Saw Russia-Linked Ads

Ok, we’re just going to come right out and say this: if you don’t think it is at least possible that Russia’s activity on Facebook had an impact on the election, then you don’t understand much about social media.

We’ve been over this more times than we care to remember, but for the umpteenth time, platforms like Facebook and Twitter exist to amplify messages and to allow users to connect with each other and share information. That’s their raison d’être.

So when you see articles suggesting otherwise posted to Trump-friendly websites or to blogs run by folks you suspect might be Russophiles, you might fairly ask why those sites and blogs bother incorporating social media sharing buttons. After all, if Facebook and Twitter aren’t good at spreading the proverbial word, well then what’s the use of encouraging people to share your articles via social media?

We’ve written a ton about this over the past several months. Those interested can peruse our entire archive on Facebook and Russia here. The details are, well, damning. And there’s no getting around that.

When you see articles on blogs and other sites with questionable reputations suggesting that this is all one big wild goose chase, just note that those are the same blogs and websites that have been insisting this will all go away for the past nine months. And yet it hasn’t gone away. The investigations have deepened and the public’s interest is piqued. In short: those sites and blogs are lying to you. And in private, they are concerned about what the future holds.

On Monday there was more news on this story. Facebook has now turned over some 3,000 Russia-linked ads to congressional investigators, and Rep. Adam Schiff thinks all of them should be made public. Here’s his statement:

[Embedded: Rep. Adam Schiff’s statement, posted via Facebook]

Again: this ain’t goin’ away, no matter what anyone in the increasingly desperate alt-Right blogosphere tells you.

This evening, Elliot Schrage, Facebook’s Vice President of Policy and Communications, released some of the facts surrounding what was turned over to Congress.

As it turns out, some 10 million people in the U.S. saw at least one of the Russia-linked ads. According to Facebook, 44% of the ads were seen before the election and 56% after it was over (so at least Trump’s defenders have that latter stat going for them).

Schrage also notes that “some of the ads were paid for in Russian currency” but goes on to claim that “currency alone isn’t a good way of identifying suspicious activity, because the overwhelming majority of advertisers who pay in Russian currency, like the overwhelming majority of people who access Facebook from Russia, aren’t doing anything wrong.”

Got it. But not really. Because it’s hard for us to imagine a scenario where someone is buying divisive, targeted ads and paying in rubles, but isn’t engaged in some kind of shenanigans. I mean, that’s like saying “yes, some people were buying uranium and paying in North Korean won, but that, in and of itself, isn’t a good way of identifying suspicious activity.”

In any event, you can read Elliot’s entire post below and judge for yourself.

To be sure, it’s not all damning. But it certainly raises more questions than it answers and that is more than enough reason for these ads to be made public.

Via Facebook

Hard Questions: Russian Ads Delivered to Congress

By Elliot Schrage, Vice President of Policy and Communications

What was in the ads you shared with Congress? How many people saw them? 
Most of the ads appear to focus on divisive social and political messages across the ideological spectrum, touching on topics from LGBT matters to race issues to immigration to gun rights. A number of them appear to encourage people to follow Pages on these issues.

Here are a few other facts about the ads:

  • An estimated 10 million people in the US saw the ads. We were able to approximate the number of unique people (“reach”) who saw at least one of these ads, with our best modeling.
  • 44% of the ads were seen before the US election on November 8, 2016; 56% were seen after the election.
  • Roughly 25% of the ads were never shown to anyone. That’s because advertising auctions are designed so that ads reach people based on relevance, and certain ads may not reach anyone as a result.
  • For 50% of the ads, less than $3 was spent; for 99% of the ads, less than $1,000 was spent.

Why do you allow ads like these to target certain demographic or interest groups?
Our ad targeting is designed to show people ads they might find useful, instead of showing everyone ads that they might find irrelevant or annoying. For instance, a baseball clothing line can use our targeting categories to reach people just interested in baseball, rather than everyone who likes sports. Other examples include a business selling makeup designed specifically for African-American women, or a language class wanting to reach potential students.

These are worthwhile uses of ad targeting because they enable people to connect with the things they care about. But we know ad targeting can be abused, and we aim to prevent abusive ads from running on our platform. To begin, ads containing certain types of targeting will now require additional human review and approval.

In looking for such abuses, we examine all of the components of an ad: who created it, who it’s intended for, and what its message is. Sometimes a combination of an ad’s message and its targeting can be pernicious. If we find any ad – including those targeting a cultural affinity interest group – that contains a message spreading hate or violence, it will be rejected or removed. Facebook’s Community Standards strictly prohibit attacking people based on their protected characteristics, and our advertising terms are even more restrictive, prohibiting advertisers from discriminating against people based on religion and other attributes.

Why can’t you catch every ad that breaks your rules?
We review millions of ads each week, and about 8 million people report ads to us each day. In the last year alone, we have significantly grown the number of people working on ad review. And in order to do better at catching abuse on our platform, we’re announcing a number of improvements, including:

  • Making advertising more transparent
  • Strengthening enforcement against improper ads
  • Tightening restrictions on advertiser content
  • Increasing requirements for authenticity
  • Establishing industry standards and best practices

Weren’t some of these ads paid for in Russian currency? Why didn’t your ad review system notice this and bring the ads to your attention? 
Some of the ads were paid for in Russian currency. Currency alone isn’t a good way of identifying suspicious activity, because the overwhelming majority of advertisers who pay in Russian currency, like the overwhelming majority of people who access Facebook from Russia, aren’t doing anything wrong. We did use this as a signal to help identify these ads, but it wasn’t the only signal. We are continuing to refine our techniques for identifying the kinds of ads in question. We’re not going to disclose more details because we don’t want to give bad actors a roadmap for avoiding future detection.

If the ads had been purchased by Americans instead of Russians, would they have violated your policies?
We require authenticity regardless of location. If Americans conducted a coordinated, inauthentic operation – as the Russian organization did in this case – we would take their ads down, too.

However, many of these ads did not violate our content policies. That means that for most of them, if they had been run by authentic individuals, anywhere, they could have remained on the platform.

Shouldn’t you stop foreigners from meddling in US social issues?
The right to speak out on global issues that cross borders is an important principle. Organizations such as UNICEF, Oxfam or religious organizations depend on the ability to communicate – and advertise – their views in a wide range of countries. While we may not always agree with the positions of those who would speak on issues here, we believe in their right to do so – just as we believe in the right of Americans to express opinions on issues in other countries.

Some of these ads and other content on Facebook appear to sow division in America and other countries at a time of increasing social unrest. If these ads or content were placed or posted authentically, you would allow many of these. Why?
This is an issue we have debated a great deal. We understand that Facebook has become an important platform for social and political expression in the US and around the world. We are focused on developing greater safeguards against malicious interference in elections and strengthening our advertising policies and enforcement to prevent abuse.

As an increasingly important and widespread platform for political and social expression, we at Facebook – and all of us – must also take seriously the crucial place that free political speech occupies around the world in protecting democracy and the rights of those who are in the minority, who are oppressed or who have views that are not held by the majority or those in power. Even when we have taken all steps to control abuse, there will be political and social content that will appear on our platform that people will find objectionable, and that we will find objectionable. We permit these messages because we share the values of free speech – that when the right to speech is censored or restricted for any of us, it diminishes the rights to speech for all of us, and that when people have the right and opportunity to engage in free and full political expression, over time, they will move forward, not backwards, in promoting democracy and the rights of all.

Are you working with other companies and the government to prevent interference that exploits platforms like yours?
The threats we’re confronting are bigger than any one company, or even any one industry. The kind of malicious interference we’re seeing requires everyone working together, across business, government and civil society, to share information and arrive at the best responses.

We have been working with many others in the technology industry, including with Google and Twitter, on a range of elements related to this investigation. We also have a long history of working together to fight online threats and develop best practices on other issues, such as child safety and counterterrorism. And we will continue all of this work.

With all these new efforts you’re putting in place, would any of them have prevented these ads from running?
We believe we would have caught these malicious actors faster and prevented more improper ads from running. Our effort to require US election-related advertisers to authenticate their business will help catch suspicious behavior. The ad transparency tool we’re building will be accessible to anyone, including industry and political watchdog groups. And our improved enforcement and more restrictive content standards for ads would have rejected more of the ads when submitted.

Is there more out there that you haven’t found?
It’s possible. We’re still looking for abuse and bad actors on our platform – our internal investigation continues. We hope that by cooperating with Congress, the Special Counsel and our industry partners, we will help keep bad actors off our platform.

Do you now have a complete view of what happened in this election?
The 2016 US election was the first where evidence has been widely reported that foreign actors sought to exploit the internet to influence voter behavior. We understand more about how our service was abused and we will continue to investigate to learn all we can. We know that our experience is only a small piece of a much larger puzzle. Congress and the Special Counsel are best placed to put these pieces together because they have much broader investigative power to obtain information from other sources.

We strongly believe in free and fair elections. We strongly believe in free speech and robust public debate. We strongly believe free speech and free elections depend upon each other. We’re fast developing both standards and greater safeguards against malicious and illegal interference on our platform. We’re strengthening our advertising policies to minimize and even eliminate abuse. Why? Because we are mindful of the importance and special place political speech occupies in protecting both democracy and civil society. We are dedicated to being an open platform for all ideas – and that may sometimes mean allowing people to express views we – or others – find objectionable. This has been the longstanding challenge for all democracies: how to foster honest and authentic political speech while protecting civic discourse from manipulation and abuse. Now that the challenge has taken a new shape, it will be up to all of us to meet it.

 


4 thoughts on “‘Do You Accept Rubles?’ Facebook Admits 10 Million People Saw Russia-Linked Ads”

  1. This is enlightening! The amazing thing is that all along, I have assumed that this very website is Russian-based – the name and photo seem so, well, “Russian” that I just assumed that was the case. So what’s being said here is that, in fact, the Russian activity was disguised and was actually pro-Trump. This really demonstrates how little awareness I have – I just didn’t have a clue!

  2. Walt, I’m still not getting this – reverse the arguments and see how it looks. If Russia called to task its news agencies because of the very long-standing US interference in Russian elections, everyone would be up in arms about state interference in media. But when the shoe is on the other foot, you want the US state to intervene in the media and you don’t have a problem with that? You have to consider where this line of thinking could take you – sounds like it would be playing right into Trump’s hands…

    1. No, Paul, I don’t. If every single one of these alt-Right blogs pushing outright lies and acting on behalf of the Kremlin were shut down tomorrow, I would have absolutely no problem with that. And guess what? That’s exactly what’s going to happen. Just not tomorrow.
