Facebook is using artificial intelligence to track posts – and the reason could save your life
New 'wellness checks' are part of a fresh objective to identify users who need help.
Sometimes, help comes too late for people who are thinking about harming themselves. A new tool powered by artificial intelligence aims to change that.
With the health and safety of its more than 2 billion users in mind, Facebook announced it is taking a preemptive approach to user tracking. Instead of waiting for a friend or commenter to report troubling behavior, Facebook is launching "proactive detection," meaning its technology can flag language or photos suggesting a user might be at risk of suicide.
“This is about shaving off minutes at every single step of the process, especially in Facebook Live,” says Guy Rosen, Facebook's Israel-born vice president of product management who has helped develop and research the technology. “There have been cases where the first responder has arrived and the person is still broadcasting.”
Rosen acknowledged that Facebook is already a central meeting place for many friends and family, so making sure those friends and family can reach someone in distress – and support them – is an obvious and essential part of the social media giant's capabilities.
"We use signals like the text used in the post and comments (for example, comments like 'Are you ok?' and 'Can I help?' can be strong indicators)," explained Rosen, who studied physics at the Hebrew University of Jerusalem. "In some instances, we have found that the technology has identified videos that may have gone unreported."
Once the most concerning content has been identified, the technology then goes to work helping to speed up response time. For example, Rosen said, Facebook's reviewers can "quickly identify which points within a video receive increased levels of comments, reactions and reports from people on Facebook." The reviewers can then determine whether someone may be in distress and get them help.
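The idea of surfacing the points in a video that draw the most comments, reactions, and reports can be sketched as a simple time-bucketing exercise. The function below is a hypothetical illustration, not Facebook's implementation: it groups event timestamps into fixed-width windows and returns the busiest ones first.

```python
from collections import Counter

def busiest_moments(event_timestamps, bucket_seconds=30, top_n=3):
    """Group event timestamps (seconds into a broadcast) into fixed-width
    buckets and return (start_second, event_count) pairs, busiest first.
    A toy sketch of locating high-activity points in a live video."""
    buckets = Counter(int(t // bucket_seconds) for t in event_timestamps)
    return [(b * bucket_seconds, count) for b, count in buckets.most_common(top_n)]

# A surge of comments and reactions around the 60-90 second mark:
events = [5, 12, 61, 63, 64, 66, 70, 88, 120]
print(busiest_moments(events))  # [(60, 6), (0, 2), (120, 1)]
```

A reviewer looking at output like this would jump straight to the window starting at 60 seconds, where activity spiked, rather than watching the whole broadcast.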
When it comes to identifying suicidal behavior on social media, every second counts. (Photo: Nevodka / Shutterstock)
The network already has a system in place that makes it easy for users to report troubling behavior; it even offers specific language to use when responding, along with a list of resources to suggest. With the new A.I. functionality, Rosen explained, that support goes a step further and could save lives.
"This puts Facebook in a really unique position," he said. "We can help connect people who are in distress connect to friends and to organizations that can help them.”