
How can Facebook possibly police itself?

This week’s New Yorker carried an article by Andrew Marantz whose main thrust was that Facebook is not doing a good job of moderating its content. As a result, all sorts of people and groups that, in the view of many experts the reporter interviewed, should not have access to the electronic megaphone of Facebook are nevertheless allowed to use it. The list of such offenders is long: the white-nationalist group Britain First, “autocratic Brazilian politician” Jair Bolsonaro, and of course, the President of the United States, Donald Trump.

Facebook has an estimated 15,000 content moderators working around the world, constantly monitoring what its users post and taking down material that violates what the company calls its Implementation Standards. Some decisions are easy: you aren’t allowed to post a picture of a baby smoking a cigarette, for example. Others are harder, especially when the people doing the posting are prominent figures who are likely to generate lots of eye-time, and thus advertising revenue, for the company.

The key to the dilemma that Facebook faces was expressed by former content moderator Chris Gray, who wrote a long memo to Facebook CEO Mark Zuckerberg shortly after leaving the company. He accused Facebook of not being committed to content moderation and said: “There is no leadership, no clear moral compass.”

Technology has allowed Facebook to achieve what in principle looks like a very good thing: in the words of its stated mission, to “bring the world closer together”. Unfortunately, when you get closer to some people, you wish you hadn’t. And while Zuckerberg is an unquestioned genius when it comes to extracting billions from a basically simple idea, he and his firm sometimes seem to have an oddly immature notion of human nature.

Read more at Mercatornet
