Facebook Moderation Firm Calls it Quits

GEC: Discuss gaming, computers and electronics and venture into the bizarre world of STGODs.

Moderator: Thanas

User avatar
SolarpunkFan
Jedi Knight
Posts: 586
Joined: 2016-02-28 08:15am

Facebook Moderation Firm Calls it Quits

Post by SolarpunkFan »

Futurism
Moderating content for Facebook is traumatic. That’s not an opinion — it’s a fact.

Thousands of people spend their work days deciding whether posts violate Facebook’s content policies. And a growing number have spoken to the media about the terrible toll of seeing countless images and videos depicting violence, sex abuse, child pornography, and torture. In March 2018, one moderator in Tampa, Florida, actually died right at his desk.

That man, Keith Utley, was employed by a firm called Cognizant, which reportedly signed a two-year, $200 million contract with Facebook to keep the platform free of objectionable content — and, in a huge blow to Facebook’s moderation strategy, it just announced it’ll cut ties with the social media company when that contract runs out.

“We have determined that certain content work in our digital operations practice is not in line with our strategic vision for the company, and we intend to exit this work over time,” Cognizant told BBC News. “This work is largely focused on determining whether certain content violates client standards — and can involve objectionable materials.”

“In the meantime, we will honor our existing obligations to the small number of clients affected and will transition, over time, as those commitments begin to wind down,” the firm later added. “In some cases, that may happen over 2020, but some contracts may take longer.”

BBC News wrote that the decision will lead to the loss of an estimated 6,000 jobs and affect both the Tampa moderation site and one in Phoenix, Arizona.

“We respect Cognizant’s decision to exit some of its content review services for social media platforms,” Facebook’s Arun Chandra told BBC News. “Their content reviewers have been invaluable in keeping our platforms safe — and we’ll work with our partners during this transition to ensure there’s no impact on our ability to review content and keep people safe.”

Cognizant wasn’t Facebook’s sole source of content moderators — the company has 20 review sites employing approximately 15,000 people across the globe. But even that army of moderators hasn’t been enough to prevent policy-violating content from slipping through the cracks.

Perhaps most notably, an Australian man used Facebook to livestream an assault on two mosques in New Zealand in March that led to the deaths of 51 people. Not only did the bloody video rack up thousands of views before Facebook’s moderators took it down, but the company struggled to remove copies of the footage from its platform in the aftermath of the slaughter.

If Facebook considers its platform “safe” now, it’s hard to imagine what it could look like if the social network doesn’t quickly replace the Cognizant employees currently comprising more than a third of its moderation work force.

Editor’s Note: This article was updated to correct the citizenship of the man who attacked the New Zealand mosques.
More issues regarding Farcebook. How did a day ending in "y" come so soon again? :roll:
Seeing current events as they are is wrecking me emotionally. So I say 'farewell' to this forum. For anyone who wonders.
User avatar
Ace Pace
Hardware Lover
Posts: 8456
Joined: 2002-07-07 03:04am
Location: Wasting time instead of money

Re: Facebook Moderation Firm Calls it Quits

Post by Ace Pace »

You mean "more issues when running internet-scale communities". Google, Twitter, Reddit, and Facebook, along with the Chinese social media companies, all have the same issues.

When you connect more than 2 billion people, you get horrible people. When you have no firm definition of what counts as bad content, stuff slips through.

A humane computer-based alternative does not exist right now.

So either we dump the assumption that people will be nice online and curate everything, killing every user-content-driven website on earth, or we figure out how to live with the social costs of this activity.
Brotherhood of the Bear | HAB | Mess | SDnet archivist |
User avatar
Xisiqomelir
Jedi Council Member
Posts: 1757
Joined: 2003-01-16 09:27am
Location: Valuetown

Re: Facebook Moderation Firm Calls it Quits

Post by Xisiqomelir »

re: paid moderator mental health and having fewer people die at their desks...

How about an automosaic/blur filter for all of this unfiltered content? On various UNIX-like systems, some hideous offensive raw user content like:

Image

would come in, and the content moderation pipeline could do something like

Code:

convert hideous_racist_guro_pornographic_rawimage.png -blur 0x8 sanitized_SFW_cleanimage.png
yielding this hopefully less mentally traumatic image for the mod drones:

Image

Then, if the blurred version didn't make it clear whether they should reject, they could be given a dial to progressively unblur it in their workstation interfaces.
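A minimal sketch of that dial, assuming ImageMagick's convert is installed; the filenames are made up. The idea is to pre-render one flagged image at several blur strengths, strongest first, so the reviewer interface just swaps between cached files as the dial turns:

```shell
# Pre-render a ladder of blur strengths for one flagged image.
# Filenames are illustrative; "echo" dry-runs each command, which you
# would actually execute in a real pipeline with ImageMagick installed.
src="flagged_raw.png"
for sigma in 8 6 4 2; do
    # -blur 0x<sigma> is a Gaussian blur; smaller sigma = clearer image
    out="review_blur_${sigma}.png"
    echo "convert $src -blur 0x${sigma} $out"
done
```

Strongest-first ordering matters: the reviewer only steps toward the raw image when the heavy blur leaves the call ambiguous.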

Boom: drastic drop in staff depression/death rates, probably a jump in productivity since you can click through blurs without taking a two-minute retching break, and some :wanker: PR you can drop about how your miserable corporate hive cares sooo much about worker mental health!

BRB pitching $500,000 tender to Cognizant!
User avatar
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs

Re: Facebook Moderation Firm Calls it Quits

Post by Starglider »

That has been tried, along with converting content to greyscale. It helps a little.
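For reference, the greyscale pass mentioned here is a one-liner in the same ImageMagick toolchain; the filenames below are made up for illustration:

```shell
# Strip colour information before review; -colorspace Gray is the
# standard ImageMagick greyscale conversion. Dry-run via echo here;
# run the command directly with ImageMagick installed.
cmd='convert flagged_raw.png -colorspace Gray flagged_grey.png'
echo "$cmd"
```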
User avatar
SolarpunkFan
Jedi Knight
Posts: 586
Joined: 2016-02-28 08:15am

Re: Facebook Moderation Firm Calls it Quits

Post by SolarpunkFan »

Xisiqomelir wrote: 2020-01-05 08:31am Snip
From what I gather, the big hurdle is that computer vision is still nowhere near the level of the average human. Progress has been made, but we're still in the "Kitty Hawk" era when what's needed is a Stratolaunch aircraft.
Seeing current events as they are is wrecking me emotionally. So I say 'farewell' to this forum. For anyone who wonders.
User avatar
LadyTevar
White Mage
White Mage
Posts: 23423
Joined: 2003-02-12 10:59pm

Re: Facebook Moderation Firm Calls it Quits

Post by LadyTevar »

I was an AOL Chatroom mod (volunteer/unpaid) back in the day. It was not easy even then, keeping chatrooms clear of disruption, especially the busy ones. You were expected to Announce Yourself, Warn Them to Stop TWICE, and only THEN were you allowed to use the Block/Ban codes. That really didn't work well when said a-hole was typing offensive words as fast as they could (or copy-pasting).

Now, there's photos, there's gifs, there's memes, and they're shared and reposted, and commented on. I'm amazed the company is able to do as much filtering as they have been.
Nitram, slightly high on cough syrup: Do you know you're beautiful?
Me: Nope, that's why I have you around to tell me.
Nitram: You -are- beautiful. Anyone tries to tell you otherwise kill them.

"A life is like a garden. Perfect moments can be had, but not preserved, except in memory. LLAP" -- Leonard Nimoy, last Tweet