
【News Buddy】fb moderation guidelines revealed: suicide and self-harm videos may be allowed to stay

2017-06-05
■ fb's internal moderation guidelines have drawn questions from all sides since they were made public. File photo

【Original】The following is excerpted from a Hong Kong Wen Wei Po (《文匯報》) report published on 23 May 2017:

The British newspaper The Guardian has obtained internal documents from facebook, the social networking site with two billion users worldwide, and published the site's guidelines on how to manage content involving discrimination, sex and violence. Some of the guidelines, however, have been described as contradictory and hard to understand, and how the boundaries are drawn has attracted questions from all sides.

Protecting the powerful; suicide videos "raise awareness"

The documents comprise more than 100 internal training manuals, spreadsheets and flowcharts, with guidelines covering situations as varied as match-fixing and cannibalism. They show that because US President Donald Trump is a head of state he falls into a protected category, so any post on fb saying "shoot Trump" would be deleted. Users may, however, write "break that bad woman's neck" or "get out and die", as such remarks are not regarded as credible threats.

As for videos of violent deaths, including accidents, suicides and murders, although they are disturbing they are not necessarily deleted, because fb believes they can raise awareness of such incidents, for example the problem of suicide triggered by mental illness. Photos of children being beaten or bullied are likewise not necessarily removed unless sexual abuse is involved.

fb: grey areas exist, not enough moderators

fb also allows users to livestream self-harm, because it does not want to censor or punish people in distress, and because experts have advised that as long as those who self-harm keep interacting with viewers online, their safety can be ensured. Since self-harm videos can invite imitation, if the person cannot be talked out of it, the footage will also be removed out of concern for viewers' safety, unless it is newsworthy.

The documents show that moderators escalated 4,531 reports of self-harm within a single two-week period.

fb admits in the documents that some users express their frustration in strong language, believing they do not have to take responsibility for what they say and treating even threats against others as commonplace. In its defence, fb said that with such a huge number of users it is hard to classify their content and grey areas are inevitable, so it will strive to provide a safe online environment, and people who report violating content will be given greater powers.

In addition, moderators have to handle reports on 6.5 million fake accounts every week, leaving them run off their feet, with often only 10 seconds to decide whether content stays or goes. The company currently has 4,500 content moderators and plans to recruit 3,000 more.

fb internal guidelines leaked: suicide clips may pass

【Translation】The internal guidelines on discrimination, sex and violence of facebook, the social media giant with over two billion users, have recently been revealed for the first time by the British daily newspaper The Guardian.

However, some of the rules have been deemed contradictory and confusing, sparking public debate over how appropriate boundaries should be drawn in practice.

"Get out and die" is fine

The related documents include over 100 internal training manuals, spreadsheets and flowcharts, with guidelines covering even content on match-fixing and cannibalism. One of the rules specifies that U.S. President Donald Trump, as a head of state, is put in a protected category, and any remark such as "shoot Trump" should be deleted immediately. Strangely, though, violent language such as "snap a woman's neck" and "get out and die" would not be regarded as threatening.

Videos of violent deaths, whether from accidents, suicide or murder, while disturbing, might not be removed automatically, because they may help raise public awareness of such incidents, drawing attention, for example, to the problem of suicide caused by mental illness. Images of child abuse might not be deleted either, unless evidence of sexual abuse is detected.

fb also allows users to livestream attempts at self-harm because the company "doesn't want to censor or punish people in distress", and because, as professionals have suggested, continued interaction with the audience can help keep those who self-harm safe.

fb: grey areas exist, team short-handed

However, since such videos might invite imitation, a livestream that fails to relieve the person's distress and could endanger viewers would eventually be removed, while newsworthy footage is treated differently. The documents disclose that 4,531 cases of self-injury were reported in just two weeks.

The internal rulebook adds that some users express disagreement and anger by making threats or calling for violence, treating such language as normal and assuming they will not be held responsible for it.

The company said that, given such a huge number of users, it is difficult to categorize their content clearly and grey areas inevitably exist. It would nonetheless continue to invest in keeping people safe online and plans to give greater powers to those who actively report inappropriate content.

Moreover, the company's moderation team has to handle up to 6.5 million reports of fake accounts every week, leaving staff so exhausted that they often have only 10 seconds to decide whether a piece of content should stay or be deleted.

There are currently 4,500 staff members on the team, and the company plans to recruit 3,000 more. ■ 龐嘉儀

Q&A

1. 網絡欺凌 is becoming increasingly common. What is the English term for it?

2. facebook's handling of online content has prompted reflection on 媒體社會責任. What is the English term?

3. Several online media platforms, including facebook, have been involved in leaks of user data. What issue has this raised concerns about?

4. Following on from the previous question, which government body in Hong Kong is responsible for handling such matters?

5. To protect personal data privacy, what new law has the mainland recently implemented?

1. Cyber-bullying

2. Social responsibility of mass media

3. Personal data privacy (個人資料私隱)

4. The Office of the Privacy Commissioner for Personal Data (個人資料私隱專員公署)

5. The Cybersecurity Law (《網絡安全法》)
