Facebook won't ban political ads or fact-check them
Facebook isn't changing its stance on political advertising.
The Social Network has been listening to criticism of its decision not to subject political ads to fact checks, and it's watched Twitter and Google place increased limits - even outright bans - on campaign advertising on their platforms. But Zuck and Co. aren't shifting. They're not changing their stance.
But they are looking to give users more control over the ways in which they can be targeted by advertisers, including political campaigns, as well as a new option to see fewer political ads in their feeds, if they so choose.
Facebook's latest update on the political ad front is unlikely to appease the chorus of critics that have repeatedly called on the company to take more action on questionable claims in political ads. But then again, according to Facebook, that's not the key problem anyway, at least not based on its experience.
Here's what's been announced.
1. New Settings in the Ad Library
Facebook says that its Ad Library, which provides visibility into the spending, targeting and financing of political ad campaigns, makes it "the most transparent platform in political advertising".
But after meeting with various groups - including political activists, NGOs, nonprofits and volunteers - Facebook has decided to add more data to the Ad Library, providing insight into the specific targeting elements to highlight exactly how each political ad has been focused.
As per Facebook:
"We're adding ranges for Potential Reach, which is the estimated target audience size for each political, electoral or social issue ad so you can see how many people an advertiser wanted to reach with every ad."
This addresses the concern of microtargeting - among the accusations arising from the Cambridge Analytica scandal was the suggestion that political groups have been using Facebook's advanced ad targeting to home in on very small, niche audience subsets, in order to influence those people based on their specific fears and pain points, essentially using those elements against them.
Facebook says that microtargeting of this type is actually not the issue that it's been made out to be - according to internal data, 85% of ad spend by US presidential candidates on Facebook has been for campaigns targeted to audiences estimated to be greater than 250,000 people.
This new measure in the Ad Library will address this concern, so users can see for themselves just how much Facebook's advanced ad targeting capacity is being used to persuade smaller audience subsets.
2. Improved Ad Library Search and Filtering
In addition to this, Facebook's also making it easier to search the Ad Library to get more information on how political ads are being targeted, with improved filters and tools to refine your queries.
"We are adding the ability to search for ads with exact phrases, better grouping of similar ads, and adding several new filters to better analyze results — e.g. audience size, dates and regions reached. This will allow for more efficient and effective research for voters, academics or journalists using these features."
If you have a concern as to how Facebook ads are being used, you'll now have more capacity to locate and analyze specific campaigns.
This will no doubt be a key tool used by academics and researchers to spot potential issues in political ads on the platform.
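For researchers, this kind of querying is also available programmatically via Facebook's public Ad Library API. As a rough illustration only - the endpoint version and parameter names below follow the public documentation at the time of writing, and the token is a placeholder you'd need to obtain yourself - a search for an exact phrase in political and issue ads might be constructed like this:

```python
import urllib.parse

# Illustrative sketch of an Ad Library API request URL.
# The endpoint and parameters (search_terms, ad_type,
# ad_reached_countries) come from Facebook's public Ad Library
# API docs, but verify them against the current API version.
BASE = "https://graph.facebook.com/v5.0/ads_archive"

def build_ad_library_query(phrase, country="US", token="YOUR_ACCESS_TOKEN"):
    """Return a request URL searching political/issue ads for an exact phrase."""
    params = {
        "search_terms": f'"{phrase}"',          # quotes request exact-phrase matching
        "ad_type": "POLITICAL_AND_ISSUE_ADS",
        "ad_reached_countries": f"['{country}']",
        "fields": "ad_creative_body,spend,impressions",
        "access_token": token,
    }
    return BASE + "?" + urllib.parse.urlencode(params)

url = build_ad_library_query("climate change")
```

The response (for a valid token) is paginated JSON, with the requested fields per ad - which is what makes the kind of bulk analysis by academics and journalists described above practical.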
3. Control Over Ad Targeting
Facebook's also looking to give users more control over the ways in which political campaigns can target them, specifically through the use of third-party lists and Custom Audiences.
"Later this month we'll begin rolling out a control to let people choose how an advertiser can reach them with a Custom Audience from a list. These Custom Audiences are built when an advertiser uploads a hashed list of people’s information, such as emails or phone numbers, to help target ads. This control will be available to all people on Facebook and will apply to all advertisers, not just those running political or social issue ads."
The control will enable users to opt out of Custom Audience targeting - or conversely, make themselves eligible to see ads from which an advertiser's list has excluded them.
"For example, if a candidate has chosen to exclude you from seeing certain fundraising ads because they don’t think you will donate again, but you still want a chance to see those ads, you can stop yourself from being excluded."
The idea here is to give users more control over the platform's advanced ad targeting features, and to stop themselves from being categorized based on data insights held by a third-party analytics firm like Cambridge Analytica.
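It's worth unpacking what "a hashed list of people's information" means in practice. Facebook's documented approach is that advertisers normalize and SHA-256-hash each identifier before upload, so raw emails and phone numbers are never sent; matching happens against hashes of Facebook's own records. A minimal sketch of the advertiser-side step, assuming simple trim-and-lowercase normalization:

```python
import hashlib

# Sketch of advertiser-side Custom Audience preparation:
# normalize each identifier (trim whitespace, lowercase),
# then SHA-256 hash it. Only the hex digests are uploaded.
def hash_identifier(value: str) -> str:
    normalized = value.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

emails = ["Jane.Doe@example.com ", "voter@example.org"]
hashed_list = [hash_identifier(e) for e in emails]
```

Because normalization happens before hashing, differently-formatted copies of the same email produce the same digest - which is what lets Facebook match the list against accounts without receiving the plaintext data.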
4. See Fewer Political Ads
And lastly, Facebook is also adding a new way for users to opt out of political ads - though not completely:
"Seeing fewer political and social issue ads is a common request we hear from people. That’s why we plan to add a new control that will allow people to see fewer political and social issue ads on Facebook and Instagram. This feature builds on other controls in Ad Preferences we’ve released in the past, like allowing people to see fewer ads about certain topics or remove interests."
Note that you can't opt out of political ads entirely, but you can choose to see fewer of them. That enables Facebook to keep serving political ads, and generating revenue from them, while also providing some capacity for users to reduce the influx of political messaging heading into the campaign season.
So as noted, Facebook isn't changing its stance - it still won't subject political ads to fact-checking. But it will give users more control over what they see, and over the data they can access about each ad.
Again, that likely won't be enough to appease critics calling for more action, and with virtually all of these tools reliant on individual users taking some level of action off their own bat, they're also not likely to see significant take-up, in relative terms.
But then again, Facebook says that all of this is largely irrelevant anyway - as noted by Facebook's Andrew Bosworth in a leaked memo earlier this week, most of the coverage around Facebook's political influence has been overblown, and the impact of Facebook ad campaigns, and misinformation specifically, on voting behavior is not as significant as has been portrayed.
According to Bosworth:
"Misinformation from the candidates themselves was not considered a major shortcoming of political advertising on FB in 2016 even though our policy then was the same as it is now. These policies are often covered by the press in the context of a profit motive. That’s one area I can confidently assure you the critics are wrong."
So, according to Bosworth, fact-checking political campaigns is almost a moot point, because it's not a major factor in the content coming from the candidates themselves anyway. Add to that the aforementioned stat around microtargeting (only 15% of ad spend by US presidential candidates has been for ad campaigns targeted to audiences of less than 250,000 people), and you can see why, from Facebook's point of view, the argument around both elements is over-hyped.
If these stats are correct, then Facebook may well be right - fact-checking may not be critical. And Facebook does additionally note that even if these ads aren't fact-checked, political campaigns are still held to Facebook's broader community standards on ad claims.
Facebook's stance remains that it should not be forced to act as referee in political debate, and that it's not in any position to say what's true and what's not in such claims.
In fact, Facebook has repeatedly called for increased regulation in this respect, pointing to the 'Honest Ads Act' as a possible solution.
"Ultimately, we don’t think decisions about political ads should be made by private companies, which is why we are arguing for regulation that would apply across the industry. The Honest Ads Act is a good example — legislation that we endorse and many parts of which we’ve already implemented — and we are engaging with policymakers in the European Union and elsewhere to press the case for regulation too. Frankly, we believe the sooner Facebook and other companies are subject to democratically accountable rules on this the better."
That, based on Facebook's additional notes, actually makes some sense - and while it still doesn't feel right that Facebook should allow outright lies in political ads, it's really the less black-and-white candidate claims that become problematic. And if misinformation isn't being spread by the candidates themselves anyway, maybe Facebook's position isn't as problematic as it may seem.
The same goes for microtargeting - Facebook also says that it has considered limiting how its ad targeting can be used, but as noted by Facebook Product Director Rob Leathern, various research reports have suggested that implementing limits on ad targeting only benefits the candidates with greater financial means.
"With its low cost and high precision, microtargeting technology has quickly become a business and political necessity. It allows candidates with limited financial resources to communicate their message to specific audiences at a fraction of the cost of conventional communication channels. Banning the use of such technologies effectively secures the elections for the candidates with the greatest financial support from corporations and super PACs who can bankroll expensive marketing campaigns."
And again, Facebook's data suggests this hasn't been as significant an issue as portrayed anyway.
No, candidates shouldn't be able to lie in ads they run on the largest network of interconnected people in the world. But if the data here is correct, Facebook's stance is, at the least, defensible, even if you don't agree with it.
But here's the key concern - Facebook can absolutely influence voter behavior, and it has done so in various elections over the past decade.
Facebook has even openly promoted this fact - back in 2010, Facebook claimed that around 340,000 extra voters turned out to take part in the US Congressional elections because of a single election-day Facebook message.
The scale of Facebook's network gives it immense power in this respect - but the question we're all now grappling with is how Facebook does this exactly.
Is it through advertising? Facebook says no, that's a minor element at play. Is it through manipulation via foreign interference? Facebook also says no - while it has happened, its impact has not been as significant as reported.
The worry, then, is that if you eliminate these elements as factors - and Facebook says that it's gone to significant effort to reduce the potential impact of the latter - then you're forced to look elsewhere, which could be even more problematic.
As noted by Bosworth in his memo:
"At the end of the day we are forced to ask what responsibility individuals have for themselves."
Based on my own research, I suspect this statement is likely correct. Facebook does have huge influence over how people vote, but it's not necessarily due to clever manipulation or complex psychological profiling of different audience groups. Facebook gives everyone a voice, and that has become a polarizing force because we're exposed to a much greater breadth of people's opinions.
Now, you know what your hairdresser thinks about politics, what your mechanic thinks, what the guy three doors down has to say about climate change. You didn't have such access before, and that increased exposure has shifted each of us more to one side or another. Add to this the fact that media organizations have essentially been incentivized to publish more divisive, more polarizing, more biased views in order to maximize engagement, and you can see how divisions have been deepened, without us even realizing it.
In this context, Facebook is still at least partially to blame - Facebook's algorithm prioritizes engagement, which has been a significant factor in guiding editorial decisions to more extreme ends of the political divide, where impassioned argument leads to more comments, more shares, etc. But it's also largely based on inherent bias within us, playing to our fears and concerns, and poking at them, just enough to keep us enraged and active.
In this sense, it's focused misinformation, along with selective reporting, within the broader mass-media that's to blame for ill-informed debate, and consequent voting behavior. It's not political ads, not Russian interference. The divides are being motivated by profit, and Facebook, in this sense, is merely the coliseum hosting the battles, with the aristocracy in the stands, watching on.