How Facebook Decides If You See Nudity or Death (HBO)


Facebook employs 4,500 content moderators around the world. Moderators get two weeks of training and a stack of manuals to help them police the site for racism, misogyny, violence, and pornography.

VICE’s partners at The Guardian obtained more than a hundred of these manuals, which offer a first-ever look at the sometimes logical, sometimes inexplicable ways Facebook asks a few thousand people to help patrol its nearly 2 billion users.

This segment is part of the May 23rd VICE News Tonight episode.

Watch VICE News Tonight on HBO Mondays through Thursdays at 7:30 PM ET.

Subscribe to VICE News here: bit.ly/Subscribe-to-VICE-News

Check out VICE News for more: vicenews.com

Follow VICE News here:
Facebook: www.facebook.com/vicenews
Twitter: twitter.com/vicenews
Tumblr: vicenews.tumblr.com/
Instagram: instagram.com/vicenews
More videos from the VICE network: www.fb.com/vicevideo
