Tuesday, September 25, 2018

Facebook steps up efforts to curb violent videos

Facebook is adding 3,000 more people to its team to monitor the site for violent videos. (Photo: Facebook)
Eleena Tovar | 04 May, 2017, 16:52

In a Wednesday morning post on the social media platform he created, Zuckerberg promised that the company is working on "do[ing] better for our community".

Two months ago, Facebook announced tools it hoped would prevent people from killing themselves on live video (see "Big Questions Around Facebook's Suicide Prevention Tools"). "And we'll keep working with local community groups and law enforcement who are in the best position to help someone if they need it - either because they're about to harm themselves, or because they're in danger from someone else", he said in the Facebook post.

The move comes after Facebook Live, the company's popular video-streaming service, was recently used to broadcast a series of violent acts to viewers, including a man boasting about the apparently random killing of a Cleveland man and the murder of an infant in Thailand.

Such moves may not be enough, though, to stanch the flow of violent videos posted to the site, whether streamed live via Facebook Live or recorded and then uploaded.

If, as may be the case, the workers Facebook intends to hire for the sole purpose of removing its most upsetting videos are contractors rather than staffers, the company may not be obligated to provide them health insurance, let alone counselling. "Instead of building walls, we can help build bridges", Zuckerberg said, explaining that connecting the world is key to Facebook's future. Now, the company is finally taking more concrete steps to curb the problem.

In January, four African-Americans in Chicago were accused of attacking an 18-year-old disabled man on Facebook Live while making anti-white racial taunts. It never takes very long for a product sitting in front of a billion and a half people to be used to spread graphic violence, and Facebook, in seeking to make broadcasting easy and spontaneous, enables this behavior.

He added: "Just last week, we got a report that someone on [Facebook] Live was considering suicide."

Zuckerberg said Facebook workers review "millions of reports" every week.

In most cases, content is reviewed and possibly removed only if users complain. Zuckerberg acknowledged that the world's largest social network had a role to play in stemming the worrisome trend.

Adding moderators, if we make the generous assumption that they are properly compensated and protected, will help fix the problem.

Despite its efforts, Facebook has become the subject of recent controversy.