
On Sunday afternoon we learned about Facebook’s internal content moderation rules from a massive leak by The Guardian. It confirmed what a lot of people had long suspected: Facebook is making it up as they go along and we’re the collateral damage.
The leaked moderator documents cover how to deal with depictions of things like self-harm and animal cruelty in exceedingly granular detail. A first read-through suggests the company attempted to write a rule for every conceivable situation, and if they missed one, well, they'd draft that guideline when it came up. It suggests they think this is just a question of perfecting the rules, when in fact they've been off-base from the outset.
Facebook, like much of Silicon Valley, distrusts people and their wisdom. They worship efficiency and code, and they do their best to program around the messy human interactions of life. Which, as we've seen time and again, is not an ideal way to manage and run a community populated by nearly two billion humans.
The devil is in the details of course, but that’s why most communities have informal and formal rules and ethics about how you're supposed to act and behave. It’s messy, but it largely works. You decide on what principles and values are important to you, and then those guide specific day-to-day interactions among people.
All we had to go on about Facebook's guiding principles were generic platitudes from Zuckerberg, until a few months ago, when he gave us a few thousand more words of generic platitudes. The company has always clung mightily to vagueness, and to secrecy. Facebook says it wants to protect free speech and to avoid censorship. But censorship is something to be avoided because it's a miscalibration: something valuable was prohibited or erased. The banned book was worth reading. The activist's speech needed to be heard. The silencing was a problem because of the values it acted against. Facebook has never understood that. They've operated at the level of the particular, and they have studiously avoided the principles that make the particular worth fighting for.
Sure, if Facebook had decided to take an actual stand, they’d have had detractors. But if they'd been transparent about why, their users would have gotten over it. If you have principles, and you stick to them, people will adjust.
Instead, Facebook seems to change its policies based on the level of outrage generated. It contributes to a perception of the company as craven and exploitative. This is why Facebook lurches from stupid controversy to stupid controversy, learning the hard way every. single. time.
A video of a man murdering his daughter in Thailand took 24 hours to come down, despite being a clear policy violation. Breastfeeding photos are not allowed, until they are. Child nudity is removed as fast as possible, even when it's a Pulitzer Prize-winning photo and there's value in keeping it up. Human moderators, fallible though they can be, are good at these types of decisions. They can weigh competing moral and philosophical claims and aim at fairness and justice.
Zuckerberg, like many in Silicon Valley, seems to believe that he has enough control and foresight that nothing will come up that his algorithm didn't predict, and he can handle it when it does. That we're just one more algorithm tweak away from internet utopia. But until they learn that they’re wrong, the rest of us are just the lab rats in their social experiment in hubris.
Facebook laid a minefield and is now trying to map a path through it. But maybe they should have thought about that before they strewed mines everywhere.
The company needs to acknowledge that its approach was just fundamentally off. Not "we'll do better" or "we'll try harder" or "we'll update our guidelines." Wrong from the get-go.
If the internet is broken, Facebook helped break it. Now they owe the rest of us for the damage.
Don't bet on it.