Wednesday, October 18, 2017

Facebook to add 3,000 workers to fight live streaming of violence

Manuel Armenta | 04 May, 2017, 17:40

The 3,000 new recruits, to be hired over the coming year, will expand Facebook's community operations team, which currently numbers 4,500, by two-thirds.

Chief executive Mark Zuckerberg said the company was "working to make these videos easier to report so we can take the right action sooner - whether that's responding quickly when someone needs help or taking a post down".

Videos and posts that glorify violence are against Facebook's rules, but it has drawn criticism for responding slowly to such items, including video of a slaying in Cleveland and the live-streamed killing of a baby in Thailand.

Facebook Live, a service that allows any user to broadcast in real time, has been marred since its launch last year by instances of people streaming violence. Separately, reports and internal documents have surfaced suggesting Facebook can use the language in users' own posts to target ads at them, particularly when they express anxiety or other negative emotions.

Critics say the social network has been too slow to react to online violence and have questioned whether Facebook Live, a strategic area of development for the company, should be disabled after several cases in which it was used to broadcast rapes. The Thailand video remained online for 24 hours before it was removed.

Sarah T. Roberts, an assistant professor at UCLA who studies online content moderation, calls the additional hires a "drop in the bucket" given how much content Facebook's 1.94 billion users share. At that scale, critics argue, simply removing violent content after the fact is not enough.

Facebook is due to report quarterly revenue and earnings later on Wednesday, after markets close in New York.

"It's heartbreaking, and I've been reflecting on how we can do better for our community", Zuckerberg wrote on Wednesday about the recent videos.

In January, horrified viewers in Chicago watched as four people, who were later charged with hate crimes, appeared to beat and torture a mentally disabled man on a live stream.

Following the incident, Facebook said "we know we need to do better" in reviewing and removing violent content, and Zuckerberg acknowledged that the world's largest social network had a role to play in stemming the worrisome trend. "We immediately reached out to law enforcement, and they were able to prevent him from hurting himself", said Mr Zuckerberg, citing a case in which a user broadcasting on Facebook Live was considering self-harm.

Andrea Saul, a spokeswoman for Facebook, declined to comment beyond the post.

Zuckerberg said the company would keep working with community groups and law enforcement, and that there have been instances when intervention has helped. Automated systems, however, cannot yet take over the work of human reviewers. "We're just not there yet technologically", said Roberts.