This week Mark Zuckerberg delivered a speech in which he championed "giving everyone a voice" and pledged to uphold "as wide a definition of freedom of expression as possible." That sounds great. Freedom of expression is a cornerstone, if not the cornerstone, of liberal democracy.
The problem is that Facebook doesn't offer free speech; it offers free amplification. No one would much care about anything you posted to Facebook, no matter how false or hateful, if people had to navigate to your particular page to read your rantings, as in the early days of the site.
"Some people believe that giving more people a voice is driving division rather than bringing us together," Zuckerberg said. "More people across the spectrum believe that achieving the political outcomes they think matter is more important than every person having a voice."
But what people actually read on Facebook is what's in their News Feed, and its contents, in turn, are determined not by giving everyone an equal voice, and not by a strict chronological timeline. What you read on Facebook is determined entirely by Facebook's algorithm, which elides much — censors much, if you wrongly think the News Feed is free speech — and amplifies little.
What is amplified? Two forms of content. For native content, the algorithm optimizes for engagement. This, in turn, means people spend more time on Facebook, and therefore more time in the company of the other form of content that is amplified: paid advertising.
Of course, this isn't absolute. As Zuckerberg notes in his speech, Facebook works to stop things like hoaxes and medical misinformation from going viral, even if they're otherwise anointed by the algorithm. But he has specifically decided that Facebook will not attempt to stop paid political misinformation from going viral.
The bigger issue, though, is that Facebook seems to believe that if an algorithm is content-agnostic, it is therefore fair. When Zuckerberg talks about giving people a voice, he really means giving a voice to those people selected by Facebook's algorithm. When he says "People having the ability to express themselves at scale is a new kind of force in the world," he really means the power to be amplified by that algorithm.
The thinking is apparently that any human decision based on content, beyond the absolute minimum required by law and demanded by the social contract — i.e. weeding out hate speech, harassment, and dangerous medical misinformation, all of which he stresses in his speech — is risky and wrong, and that this applies to both native content and paid advertising. According to this view, Facebook's algorithm, so long as it is content-agnostic, is inherently fair.