Facebook is set to use its artificial intelligence to detect whether someone is expressing thoughts of suicide in a post or live video. The company has been grappling with how to respond to suicides streamed live on the world’s largest social network.
Family and friends of some suicide victims have criticized Facebook for not pulling down videos depicting self-harm quickly enough to stop them from going viral. Facebook says it is stepping up efforts to prevent people from killing themselves: more workers are reviewing reports of suicide and self-harm, and the company says it is improving how it identifies first responders.
Through the use of AI, Facebook looks for patterns that signal a user might be suicidal; comments such as “Are you OK?” and “Can I help?” can indicate that someone is in danger. This helps the company, which has 2 billion users, decide which reports to prioritize.
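Facebook has not published details of its model, but the idea of scanning comments for concerned phrases and ranking reports for human review can be sketched with a toy heuristic. Everything below (the phrase list, the scoring rule, the report structure) is a hypothetical illustration, not Facebook's actual system.

```python
# Toy sketch: rank reported posts by how many comments contain
# phrases that may signal concern, so reviewers see the most
# urgent reports first. All names and phrases here are illustrative.

CONCERN_PHRASES = ["are you ok", "can i help"]

def concern_score(comments):
    """Count comments that contain any concern phrase (hypothetical heuristic)."""
    score = 0
    for comment in comments:
        text = comment.lower()
        if any(phrase in text for phrase in CONCERN_PHRASES):
            score += 1
    return score

def prioritize(reports):
    """Order reported posts so the highest-scoring ones are reviewed first."""
    return sorted(reports, key=lambda r: concern_score(r["comments"]), reverse=True)

reports = [
    {"post_id": 1, "comments": ["nice photo!"]},
    {"post_id": 2, "comments": ["Are you OK?", "Can I help?"]},
]
print([r["post_id"] for r in prioritize(reports)])  # → [2, 1]
```

A production system would use a trained classifier rather than a fixed phrase list, but the prioritization step, scoring each report and sorting the review queue, is the same shape.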