In the midst of ongoing debate over Facebook's approach to content moderation and amplification via its platforms, the social media giant has released two new ads, one calling for more government regulation of the tech industry and the other attempting to humanize the people behind the company's decision-making.
Facebook has recently unveiled two new profile videos spotlighting employees who work on its content moderation challenges.
The videos begin with each employee holding a photograph of their family, followed by a brief explanation of their work and their opinions on regulation.
According to 'Rochelle' in the first video:
“You should be able to understand who has your data and how they use it. Federal legislation can give our platforms and other platforms guidelines so we can have a consistent approach.”
'Jack,' a Facebook content moderator, explains in the second video:
“We make a lot of difficult decisions. We work in the spectrum of freedom of expression versus content moderation and constantly trying to figure out where on that spectrum we should land. I don’t know if that is right to have a private corporation like Facebook, dictating what those boundaries are.”
The music, lighting, and format are all intended to foster a more sympathetic and human connection to these issues, emphasizing that Facebook employs 40,000 people working on these issues, including smart, everyday people like Rochelle and Jack, and that it isn't some faceless, corporate behemoth hell-bent on world dominance.
This is a refreshing change of pace from the harsh, dismissive response of Facebook's PR team to the charges made by former product manager Frances Haugen, who leaked a slew of internal research studies detailing the company's efforts to better understand the impact of its platforms.
According to Haugen, who testified before the Senate last week, these internal records demonstrate that Facebook is well aware of the problems its apps can create, but has been unwilling to act, at least in some situations, presumably because of the potential impact on its bottom line.
Facebook has refuted these allegations, stating that it undertakes such research in order to improve. Many commenters, however, have questioned Facebook's retaliation against Haugen, as well as its aggressive, even patronizing tone in attempting to dispel misconceptions.
This new approach appears designed to counter that perception by presenting a different view of Facebook's actions, with each clip directing users to a mini-site where Facebook underlines the need for updated internet legislation.
According to the mini-site:
“While we at Facebook are working to make progress, we know that we can’t - and shouldn’t - do it alone. That’s why we support regulations to set clear and fair rules for everyone, and support a safe and secure open internet where creativity and competition can thrive.”
For a long time, Facebook has advocated for improved regulation, which would remove the burden of decision-making from its platform and assuage worries about its methods and intentions.
It's unclear how those new regulations would function, but Facebook's reasoning makes sense in that individual platforms shouldn't be put in the position of deciding what's acceptable and what isn't, especially given the scale of their power over the modern media cycle and speech.
If such rules were set by an oversight body, one of Facebook's major issues would be solved, nullifying the charge that it is seeking to control or manage speech. However, the intricacies involved, particularly around algorithmic amplification, will be difficult to codify and will take time to implement, and a raft of new compliance requirements could also make it much harder for new players to enter the sector.
Notably, Facebook's outline of the regulatory changes it supports makes no mention of possible algorithm modifications. According to Haugen, engagement-aligned amplification is producing severe problems in terms of incentives, exposure, and subsequent repercussions.
That may be a far bigger battle, and one that Facebook hasn't addressed yet. Because Facebook relies on algorithmic matching to maximize usage and revenue, it will be interesting to observe which particular elements of change Facebook favors, and which it fights, through these new video clips.
In any case, we're on our way to a better-informed, more constructive debate on this topic, with Haugen's stance laying the groundwork for a fresh approach to this critical issue.