This week Facebook announced, with much fanfare, that it will temporarily ban all political advertising after polls close on November 3rd, “to reduce opportunities for confusion or abuse.”
Unfortunately, this performative move won’t do much of anything to address the very real threat of chaos and disinformation in the wake of the election. And at the same time that Facebook is seeking kudos for its political ad moratorium, it’s making another major change: turning on algorithmic amplification for posts within groups. This means that you won’t just see posts from groups that you’ve signed up for, but also posts from other groups that Facebook thinks you should see.
Evan Greer is an activist, musician, and writer based in Boston. She’s the deputy director of Fight for the Future, the viral digital rights group known for organizing the largest online protests in human history.
This change will dramatically increase the risk that false and inflammatory content will go viral. Facebook groups will grow rapidly. The algorithmic boost, nudging like-minded people into each other’s filter bubbles, will supercharge recruitment for toxic and harmful groups—the cesspools where white supremacist conspiracy theories are born. There will also be a massive influx of trolls and conflict in existing groups that are currently mostly functional. For example, it’s not hard to imagine how discussion groups for LGBTQ parents, perhaps the last vestige of Facebook with any positive value in my life, will be affected when our intra-community discussions start showing up in the feeds of random homophobes.
Facebook theatrically banning political ads while supercharging its rage machine is the perfect example of the platform making cosmetic changes to appease critics while plowing full steam ahead with a business model that’s fundamentally incompatible with democracy and human rights. If Facebook really wants to avoid being used to poison and undermine democracy, it needs to take a much more significant step than banning certain types of ads. Instead, the company should immediately shut down the algorithms across its platform that artificially amplify and suppress users’ organic posts in a quest for maximum “engagement” (read: advertising dollars). Restoring the News Feed’s chronological setting, which would show people what they signed up to see rather than what Facebook thinks they want to see, might just save what’s left of our democracy.
In reality, no one will need to spend money on advertisements to make dangerous and misleading content go viral in the wake of this election. Provocative posts spread like wildfire during major political moments like these. I can speak from personal experience. My organization, Fight for the Future, hasn’t spent a penny on Facebook ads in years, but we regularly get content to go viral during big moments, like the repeal of net neutrality, major congressional hearings, or fiery political debates. We make our posts interesting, provocative, and shareable—but we also ensure they are accurate and don’t promote harmful ideologies.
Many online actors, whether state-backed coordinated disinformation campaigns or lone bigoted keyboard warriors, have no such scruples. And Facebook’s algorithm, which is optimized for engagement at all costs, is there to constantly fan the flames. It finds the most incendiary takes on the platform and exploits its massive trove of behavioral data to inject hateful and misleading information directly into the minds of the people most susceptible to political manipulation.
A bombshell report in The Wall Street Journal earlier this year showed that Facebook executives are well aware of the harm this surveillance-capitalist machine causes. An internal audit found that more than 60 percent of all people who joined hate groups on the platform found them through Facebook’s recommendations. But Facebook’s rage-inducing algorithm is much more lucrative than its entire political ad business, which will account for less than 1 percent of the company’s 2020 revenue. That’s why it’s banning political ads while turning the volume up to max on its profitable, and dangerous, amplification algorithm.
Facebook’s billionaire CEO Mark Zuckerberg has said repeatedly that his company should not be the “arbiter of truth.” I actually agree with him, and I have argued against more aggressive moderation or fact-checking of social media posts, which will always result in collateral damage and the silencing of marginalized voices and opinions. But if Facebook doesn’t want to be responsible for determining what is and isn’t true, it also shouldn’t be deciding what content goes viral and what content no one sees—especially not in the immediate aftermath of what is perhaps the highest-stakes presidential election in US history.