Facebook Adding 3,000 Workers To Monitor Crime And Suicide

Facebook CEO Mark Zuckerberg speaks at the company's headquarters in Menlo Park, Calif., where Facebook Inc. announced a partnership called Internet.org on Wednesday, Aug. 21, 2013.

Facebook’s latest response to the public outcry over the recent live streaming of a murder on the social network came moments ago, when Mark Zuckerberg announced on his personal blog that over the next year Facebook “will be adding 3,000 people — on top of the 4,500 we have today — to review the millions of reports we get every week and improve the process for doing it quickly.”

In addition to violence, the crackdown also targets “hate speech and child exploitation.”

Zuckerberg also notes that “we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it — either because they’re about to harm themselves, or because they’re in danger from someone else.”

To all those about to spend the next few years watching videos of crime and suicide all day, every day, our condolences. As for whether Facebook will be able to eradicate violence, hate speech, and aggression, we are surely not the only skeptics, and we wonder when advertisers will begin expressing a similar sentiment.

Zuckerberg’s full blog post:

Over the last few weeks, we’ve seen people hurting themselves and others on Facebook — either live or in video posted later. It’s heartbreaking, and I’ve been reflecting on how we can do better for our community.

If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.

Over the next year, we’ll be adding 3,000 people to our community operations team around the world — on top of the 4,500 we have today — to review the millions of reports we get every week, and improve the process for doing it quickly.

These reviewers will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation. And we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it — either because they’re about to harm themselves, or because they’re in danger from someone else.

In addition to investing in more people, we’re also building better tools to keep our community safe. We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help. As these become available they should help make our community safer.

This is important. Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren’t so fortunate.

No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need.


© ZeroHedge.com


This BBSNews article was syndicated from MintPress News, and written by ZeroHedge.com. Read the original article here.