This month, Facebook reminded the world that it is not a neutral social media platform but an editor of news, information, and history in its own right.
In early September the social media giant removed a post by Norwegian writer Tom Egeland, who had included “Napalm Girl”, a Pulitzer Prize-winning photo taken by Nick Ut in 1972, in a list of images that “changed the history of warfare.”
Facebook claimed that the image violated its community standards because it depicted a nude child. Egeland protested the decision and was suspended from Facebook. When Norway’s Prime Minister Erna Solberg weighed in by sharing the image herself in protest, her post was deleted too.
That’s right: Facebook removed a post from a sitting Prime Minister (of the Conservative Party, no less).
Admitting to a mistake
Facing international condemnation, Facebook eventually reinstated the photo, releasing this statement:
‘An image of a naked child would normally be presumed to violate our Community Standards, and in some countries might even qualify as child pornography. In this case, we recognize the history and global importance of this image in documenting a particular moment in time. Because of its status as an iconic image of historical importance, the value of permitting sharing outweighs the value of protecting the community by removal, so we have decided to reinstate the image on Facebook where we are aware it has been removed.’
While the company admitted it was in the wrong, this incident offers several important reminders to all of us living in a world where billions use Facebook every day.
1. Yes, social media is ‘edited’
It’s not for nothing that Espen Egil Hansen, editor of the Norwegian newspaper that broke the story, said that Mark Zuckerberg had failed as “the world’s most powerful editor.”
It’s easy to think of Zuckerberg as a wildly successful entrepreneur, but he is also an editor–one more powerful than any of the media barons of the past because of the near-universal reach of his platform. Like an editor, he oversees the establishment of rules that control the type of information deemed appropriate for public consumption.
We like to think of social media as organic and self-policing, but in reality private companies like Facebook have the right to enforce their community guidelines at their own discretion. Just like a traditional media company such as a newspaper, they are not beholden to explain their actions in detail and usually do so only when there is a large enough public outcry.
2. Universal rules are the enemy of nuance
While the “Napalm Girl” controversy grabbed headlines because media and politicians became involved, many other posts and stories are quietly censored without any public debate.
It’s now clear that decisions to remove information are carried out without the subtle considerations that a normal editor would use. The public statement from Facebook made clear that recognition of the “history and global importance of this image in documenting a particular moment in time” only took place after they had taken it down, and after a prime minister expressed her outrage.
Why does this happen? Unlike a traditional media outlet, Facebook doesn’t cater to a local audience. Its community guidelines are not tailored to a national readership, where media balances its audience’s moral limits, freedom of speech, and journalistic responsibility. They are a one-size-fits-all rulebook, and the people who review violations may be sitting on the other side of the world.
Consider that when “Napalm Girl” was taken in 1972, it was published on the front pages of newspapers around the world. Editors knew the image was shocking, but they also knew that the reality of the war was shocking. There may have been healthy debate in newsrooms about whether to run it, but ultimately the decision to do so changed the course of history: the photo is credited with shifting public opinion about the war.
Facebook’s decision, by contrast, was virtually automatic: this image violates Community Standards. Remove it.
3. Collectively, these editorial decisions shape our thinking
This wouldn’t be a problem, except that an enormous number of people around the world rely on Facebook to understand the world we live in. A recent poll (http://learningenglish.voanews.com/a/most-americans-get-news-from-social-media/3352165.html) found that 62% of Americans get their news from social media–44% from Facebook alone. More dangerously still, a majority used only one social media platform for their news, meaning that Facebook is the sole news source for millions of Americans.
So while Facebook can do whatever it wants as a private company, it has an ethical responsibility to do better–the same responsibility that governs good journalism across the world. As Nicole Smith Dahmen wrote in Quartz:
It is true that Facebook is a private company with a legal right to censor content. But as a global giant that claims in its mission statement ‘to give people the power to share and make the world more open and connected,’ Facebook has an ethical responsibility to facilitate the free flow of information and ideas, especially news. Instead, Facebook is giving users a dangerously manipulated view of that world and contributing to the age of truthiness.
4. Diversification of knowledge is important
All in all, this episode is a good example of why it is important to step away from the news feed occasionally and look for other sources of information. Any monopoly on information is dangerous. And it may be even more dangerous when that monopoly operates according to context-blind guidelines and algorithms–which recently promoted a 9/11 conspiracy theory in Facebook’s Trending Topics section.