What Goes Into Facebook's Massive Task To Limit Inappropriate Content?


Facebook Inc’s (NASDAQ: FB) head of global policy management, Monika Bickert, published a post Tuesday on the company’s newsroom site defending and explaining its approach to reviewing user-generated content.

Facebook often comes under fire for its censorship policies, both from people who demand more safeguards and people who see them as an assault on free speech.

Bickert’s post elaborates on just how massive the task of reviewing content on the site is.

The Essential Challenge

Over one billion people use Facebook each day, posting in dozens of languages and in formats ranging from plain text to live video.

The job of reviewing this content isn’t limited to normal business hours either; users are active at all times in every time zone. These factors make the workload alone a challenge for Facebook employees (millions of reports are filed each week), before even weighing the judgment each post’s content requires.

“The range of issues is broad — from bullying and hate speech to terrorism — and complex,” wrote Bickert. For reviewers, the major hurdle they face is understanding the context of a post.

Facebook has to be conscious of the relevant censorship laws in every country in which it operates. Bickert used the example of criticizing a monarchy: in the United Kingdom, it's generally acceptable, but in some other countries it’s an offense carrying jail time.

Other legal concerns include distinguishing between art and pornography. Even in the United States, this is a legal gray area, perhaps best represented by Supreme Court Justice Potter Stewart’s threshold for obscenity, which remains the prevailing standard today: “I shall not today attempt further to define the kinds of [pornography]... But I know it when I see it.”

Other reports deal with issues not addressed by legal definitions, such as the difference between expressions of anger and calls for a person to be harmed, or when a joke about suicide is actually a call for help.

“Being as objective as possible is the only way we can be consistent across the world. But we still sometimes end up making the wrong call,” wrote Bickert.

The Policy

Facebook purposely doesn't elaborate on the details of its content reviewing policies, as doing so might allow people to find workarounds.

Instead, Facebook encourages its users to read and understand its Community Standards. The company also believes that, although they can still be improved, its current policies reflect a fair balance between censorship and freedom of speech.

“We face criticism from people who want more censorship and people who want less. We see that as a useful signal that we are not leaning too far in any one direction,” wrote Bickert.

The Quest For Objectivity

Last month, Facebook announced it would be hiring 3,000 more reviewers to tackle the problem. These reviewers will likely include former criminal prosecutors — like Bickert herself — social workers, counter-terrorism researchers, teachers, and others from a wide range of fields concerned with content on social media.

Bickert also pointed out that Facebook is “in constant dialogue with experts and local organizations, on everything from child safety to terrorism to human rights.”

Noting that no two people hold identical views of what constitutes an area of concern such as hate speech, Bickert explained that reviewers are trained on intentionally extreme hypothetical situations. Facebook has also written its own clear definitions of these areas.

“We believe the benefits of sharing far outweigh the risks,” Bickert closed. “But we also recognize that society is still figuring out what is acceptable and what is harmful, and that we, at Facebook, can play an important part of that conversation.”
