Sometimes an emoji is just an emoji. Sometimes it may be a threat.

And with only a few seconds to spare, Facebook moderators have to make the call — even if the text that accompanies the laughing yellow face is in an unfamiliar language.

To help with those decisions, Facebook has created a list of guidelines for what its two billion users should be allowed to say. The rules, which are regularly updated, are then given to its moderators.

For Facebook, the goal is clarity. But for the thousands of moderators across the world, faced with navigating this byzantine maze of rules as they monitor billions of posts per day in over 100 languages, clarity is hard to come by.

Facebook keeps its rulebooks, and even their existence, largely secret. But The New York Times acquired 1,400 pages from these guidelines, and found problems not just in how the rules are drafted but in the way the moderation itself is done.

Here are five takeaways from our story:

Facebook is experimenting on the fly.

The rules are discussed over breakfast every other Tuesday in a conference room in Menlo Park, Calif. — far from the social unrest that Facebook has been accused of accelerating.

Though the company does consult outside groups, the rules are set largely by young lawyers and engineers, most of whom have no experience in the regions of the world they are making decisions about.

The rules they create appear to be written for English speakers who at times rely on Google Translate. That suggests a lack of moderators with local language skills who might better understand local contexts.

Facebook employees say they have not yet figured out, definitively, what sorts of posts can lead to violence or political turmoil. The rulebooks are best guesses.

The rules contain biases, gaps and errors.

Some of the rules given to moderators are inaccurate, outdated or missing critical nuance.

One presentation, for example, refers to the Bosnian war criminal Ratko Mladic as a fugitive, though he was arrested in 2011.

Another appears to contain errors about Indian law, advising moderators that almost any criticism of religion should be flagged as probably illegal. In fact, criticizing religion is only illegal when it is intended to inflame violence, according to a legal scholar.

In Myanmar, a paperwork error allowed a prominent extremist group, accused of fomenting genocide, to stay on the platform for months.

The moderators feel overwhelmed.

Facebook outsources moderation to companies that hire the thousands of workers who enforce the rules. In some of these offices, moderators say they are expected to review many posts within eight to 10 seconds. The work can be so demanding that many moderators only last a few months.

The moderators say they have little incentive to contact Facebook when they run across flaws in the process. For its part, Facebook largely allows the companies that hire the moderators to police themselves.

Facebook is edging into countries’ politics.

Facebook is growing more assertive about barring groups and people, as well as types of speech, that it believes could lead to violence.

In countries where the line between extremism and mainstream politics is blurry, the social network’s power to ban some groups and not others means that it is, in essence, helping pick political winners and losers.

Sometimes it removes political parties, like Golden Dawn in Greece, as well as mainstream religious movements in Asia and the Middle East. This can be akin to Facebook shutting down one side in national debates, one expert argues.

Some interventions are more subtle. During elections in Pakistan, it told moderators to apply extra scrutiny to one party, but called another “benign.”

And its decisions often skew in favor of governments, which can fine or regulate Facebook.

Facebook is taking a bottom-line approach.

Even as Facebook tries to limit dangerous content on its platform, it is working to grow its audience in more countries.

That tension can sometimes be seen in the guidelines.

Moderators reviewing posts about Pakistan, for example, are warned against creating a “PR fire” by doing anything that might “have a negative impact on Facebook’s reputation or even put the company at legal risk.”

And by relying on outsourced workers to do most of the moderation, Facebook can keep costs down even as it sets rules for over two billion users.

By Aodhan Beirne
