A former Facebook moderator says she had to review 8,000 posts a day and it made her numb to child porn

Former Facebook moderator Sarah Katz. Source: Sarah Katz

  • Former Facebook moderator Sarah Katz has described the pressure she was under to review thousands of posts a day from the darkest corners of the platform.
  • Katz told Business Insider that she eventually became desensitized to child porn and bestiality because she saw so much of it.
  • Her experience raised questions about Facebook's efforts to stamp out child exploitation.
  • But the 27-year-old was proud of her work and urged tech firms to do more to make moderation a career.
  • Facebook said it recognized that moderation can be difficult, but said the employees "play a crucial role in helping create a safe environment for the Facebook community."


A former Facebook moderator has said the pressure to churn through a never-ending pile of disturbing material eventually made her desensitized to child pornography and bestiality.

Sarah Katz worked as a content reviewer at Facebook's Menlo Park headquarters through a third-party contractor, Vertisystem, for eight months in 2016. Her job was simple: figure out whether posts reported to Facebook broke the company's detailed community standards.

In practice, this meant eyeballing a new and potentially horrific piece of material every 10 seconds and making a snap decision about whether it needed to be removed. Posts that needed reviewing were called "tickets," and there were around 8,000 every day.

To deal with this onslaught, Facebook had 4,500 moderators like Katz on its books last year, and it announced plans to hire another 3,000 by 2018 in its fight against the darkest corners of its users' output. Facebook is also investing in AI to help police posts that break its rules.

Facebook detailed the scale of its problem with prohibited content in a transparency report in May. It took down 21 million posts containing nudity and sexual activity in the first three months of 2018, and 3.4 million featuring graphic violence. Millions of posts containing hate speech, spam, and terrorist propaganda were also removed.

Reviewers have to sign a waiver document about offensive material

The waiver Sarah Katz signed before joining Facebook. Source: Sarah Katz

Content reviewers begin their Facebook journey by signing a waiver document, which acknowledges that they are prepared to view disturbing material and protects Facebook from potential legal action.

The one Katz signed, shown above, warns that moderators will be exposed to material that "may be offensive to some people," including pornographic images. It adds that staff should "promptly notify" Facebook if they "do not wish to continue."

"Facebook has billions of users and people don't know how to use the platform correctly. So there was a lot of pornography, bestiality, graphic violence," Katz told Business Insider. "There was a lot of content that you might not expect to see shared on Facebook."

She worked in an open-plan office in Menlo Park, where free snacks flowed and there was a reasonable camaraderie among colleagues. Moderators would set to work on their queue of posts for review, and when in full flow, Katz said, she made decisions within seconds.

If ticket targets were not met, there were consequences. Failing to hit a goal "once or twice" would result in a warning, Katz said; more than three times, "you would probably get let go." Katz never witnessed this happen, but said it was informally understood among staff.

"It's kind of a monotonous job after a while. You definitely grow desensitized to some of the graphic material because you see so much of it. A lot of the content tends to recirculate," she said.

A sinister image that kept resurfacing

Katz said there was a particularly sinister photo and video that popped up repeatedly in her queue.

The material featured two children — aged between nine and 12 — standing facing each other, wearing nothing below the waist, and touching each other. It was clear, Katz said, that someone behind the camera was telling them what to do.

"It would go away and come back, it would appear at multiple times of the day. Each time the user location would be different. One day shared from Pakistan, another day the US. It's kinda hard to track down the initial source," she continued.

At the time, Katz said she was not asked to report the accounts sharing the material — a fact that "disturbed" her. "If the user's account was less than 30 days old we would deactivate the account as a fake account. If the account was older than 30 days we would simply remove the content and leave the account active," she said.

Inside Facebook's Menlo Park headquarters. Source: Reuters

Her experience raises questions about the effectiveness of Facebook's efforts to tackle child exploitation.

The company signed a deal with Microsoft in 2011 to use its PhotoDNA technology, which scans all images on Facebook and Instagram, flags known child porn, and prevents it from being uploaded again. Facebook moderators are also trained to recognize and escalate child porn internally when they see it.

The firm told the New York Post in 2012 that it reports "all" instances of child exploitation to America's National Center for Missing and Exploited Children. "We have zero tolerance for child pornography being uploaded onto Facebook and are extremely aggressive in preventing and removing child exploitative content," the company said at the time.

Katz was not aware any of this was in place in 2016. "Facebook might have a policy [now] where you're supposed to report it, but back then they didn't," she said. 

Facebook declined to comment on the discrepancy between Katz's account and its stated policies.

The nuance of human perspective

Katz is now a cybersecurity analyst for cloud computing firm ServiceNow and has written a sci-fi novel, titled "Apex Five," which draws inspiration from her time at Facebook. She is broadly upbeat about her Facebook experience, arguing that the downsides of the job were outweighed by the sense that she was protecting users.

"There needs to be many human eyes to do the job and it cannot all be allocated to artificial intelligence, no matter how much folks say," she said. "We will always be that nuance of human perspective. AI would help to track all that content from billions of users. The human element has to be taken care of, so we keep doing a good job."

Facebook CEO Mark Zuckerberg. Source: Reuters

The 27-year-old urged tech firms to do more to make moderation a career rather than a short-term job. She said: "It behoves not only Facebook but social media platforms in general to hire content moderators on a full-time basis because [it] provides much more incentive. It really incentivises us to do a stellar job and make it something we want to stick with, rather than winging it [while] holding out for something better."

A Facebook spokeswoman said: "Our global team of content reviewers play a crucial role in helping create a safe environment for the Facebook community, which is our number one priority. We recognise that this work can often be difficult, so we have wellness and psychological support in place for all our staff.

"It’s a really big job with more than 2 billion people on Facebook, so we recently added 3,000 people to our existing 4,500 strong community operations team around the world to review the millions of reports we get every week."
