Facebook's AI suicide prevention tool can save lives, but the company won't say how it works

11/28/2017


For many people who've dedicated their lives to preventing suicide, social media posts can be a precious dataset that contains hints about what people say and do before they attempt suicide.  

In the past few years, researchers have built algorithms to learn which words and emoji are associated with suicidal thoughts. They've even used social media posts to retrospectively predict the suicide deaths of certain Facebook users. 
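The published research described above generally works by mining labeled text for associations. As a rough sketch of that idea, not any specific team's method and certainly not Facebook's undisclosed model, one could rank words and emoji by how much more often they appear in posts tied to suicidal thinking; the labeled corpus here is assumed:

```python
# Hypothetical sketch of word/emoji association mining, in the spirit
# of the published research -- not Facebook's actual system, whose
# details are not public. Assumes two labeled collections of posts.
from collections import Counter

def token_risk_ratios(risk_posts, other_posts, min_count=5):
    """Rank tokens by how much more often they appear in posts
    labeled as expressing suicidal thoughts than in other posts."""
    risk_counts = Counter(t for post in risk_posts for t in post.split())
    other_counts = Counter(t for post in other_posts for t in post.split())
    ratios = {}
    for tok, n in risk_counts.items():
        if n < min_count:        # skip rare tokens; too noisy to rank
            continue
        rate_risk = n / len(risk_posts)
        # Add-one smoothing keeps unseen tokens from dividing by zero.
        rate_other = (other_counts[tok] + 1) / len(other_posts)
        ratios[tok] = rate_risk / rate_other
    return sorted(ratios.items(), key=lambda kv: kv[1], reverse=True)
```

Ranking by a simple rate ratio like this is how researchers surface candidate warning-sign vocabulary before building anything more sophisticated.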

Now Facebook itself has rolled out new artificial intelligence that can proactively identify heightened suicide risk and alert a team of human reviewers who are trained to reach out to a user contemplating fatal self-harm. 

Image: An example of what someone may see if Facebook detects they need help.

The technology, announced Monday, represents an unparalleled opportunity to understand and predict suicide risk. Before the AI tool was even publicly announced, Facebook used it to help dispatch first responders in 100 "wellness checks" to ensure a user's safety. The tool's life-saving potential is huge, but the company won't share many details about how it works or whether it'll broadly share its findings with academics and researchers. 

That is bound to leave some experts in the field confused and concerned. 

Munmun De Choudhury, an assistant professor in the School of Interactive Computing at Georgia Tech, commends the social media company for focusing on suicide prevention, but she would like Facebook to be more transparent about its algorithms. 

"This is not just another AI tool — it tackles a really sensitive issue," she said. "It’s a matter of somebody's life and death." 

"This is not just another AI tool — it tackles a really sensitive issue. It’s a matter of somebody’s life and death." 

Facebook understands the stakes, which is why its VP of product management, Guy Rosen, emphasized in an interview how AI significantly hastens the process of identifying distressed users and getting them resources or help. 

But he declined to talk in depth about the algorithm's factors beyond a few general examples, like worried comments from friends and family, the time of day, and the text in a user's post. Rosen also said the company, which has partnerships with suicide-prevention organizations, wants to learn from researchers, but he wouldn't discuss how, or whether, Facebook might publish or share insights from its use of AI. 
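Rosen's examples suggest a system that weighs several weak signals together rather than reading post text alone. The sketch below is speculative: the feature names, weights, and threshold are invented for illustration and say nothing about how Facebook actually combines its signals.

```python
# Speculative illustration of combining the kinds of signals Rosen
# mentioned (post text, comments from friends, time of day) into one
# score. All names, weights, and the threshold are hypothetical.
from dataclasses import dataclass

@dataclass
class PostSignals:
    text_model_score: float  # e.g., a text classifier's output, 0 to 1
    worried_comments: int    # comments like "are you ok?" from friends
    posted_late_night: bool  # time-of-day signal

def risk_score(s: PostSignals) -> float:
    score = 0.6 * s.text_model_score
    score += 0.3 * min(s.worried_comments, 3) / 3  # cap comment influence
    if s.posted_late_night:
        score += 0.1
    return score

# A post with a risky-looking text score and two worried comments,
# posted late at night, scoring above a (hypothetical) review threshold.
signals = PostSignals(text_model_score=0.7, worried_comments=2,
                      posted_late_night=True)
print(risk_score(signals) > 0.5)  # True -> route to a human reviewer
```

Whatever the real model looks like, the article makes clear that a score alone never triggers a response; a trained human reviewer confirms the risk first.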

"We want to be very open about this," he said. 

While transparency might not be Facebook's strength, in a field like suicide prevention it could help other experts save more lives by revealing behavior or language patterns that emerge prior to suicidal thinking or a suicide attempt. With more than 2 billion users, Facebook arguably has the largest database of such content in the world. 

De Choudhury says transparency is vital when it comes to AI because transparency instills trust, a sentiment that's in short supply as people worry about technology's potential to fundamentally disrupt their professional and personal lives. Without enough trust in the tool, says De Choudhury, at-risk users may decide against sharing emotionally vulnerable or suicidal posts. 

When users receive a message from Facebook, it doesn't indicate that AI identified them as high risk. Instead, they're told that "someone thinks you might need extra support right now and asked us to help." That someone, though, is a human reviewer who followed up on the AI detection of risk.

It's also currently impossible to know how the AI determines that someone is at imminent risk, how accurate the algorithm is, or what kinds of mistakes it makes when looking for clues of suicidal thinking. Since users won't know they were flagged by AI, they have no way of telling Facebook that it wrongly identified them as suicidal. 

De Choudhury's research involves analyzing social media to glean information about people's mental and emotional wellbeing, so she understands the challenges of both developing an effective algorithm and deciding which data to publish. 

She acknowledges that Facebook must strike a delicate balance. Sharing certain aspects of its findings, for example, could lead users to oversimplify suicide risk by focusing on key words or other signals of distress. And it could potentially give people with bad intentions data points they could use to analyze social media posts, identify those with perceived mental health issues, and target them for harassment or discrimination. 

"I think sharing how the algorithm works, even if they don’t reveal every excruciating detail, would be really beneficial." 

Facebook also faces a different set of expectations and pressures as a private company. It may consider its suicide prevention AI tool intellectual property developed for the public good. It might want to use features of that intellectual property to enhance its offerings to marketers and advertisers; after all, pinpointing a user's emotional state is something that could be highly valuable to Facebook's marketplace competitiveness. The company has previously expressed interest in developing that ability. 

Whatever the case, De Choudhury argues that Facebook can still contribute to broader efforts to use social media to understand suicide without compromising people's safety and the company's bottom line. 

"I think academically sharing how the algorithm works, even if they don’t reveal every excruciating detail, would be really beneficial," she says, "...because right now it's really a black box."  

Crisis Text Line, which partnered with Facebook to provide suicide prevention resources and support to users, does use AI to determine people's suicide risk — and shares its findings with researchers and the public. 

"With the scale of data and number of people Facebook has in its system, it could be an incredibly valuable dataset for academics and researchers to understanding suicide risk," said Bob Filbin, ‎chief data scientist for ‎Crisis Text Line. 

Filbin didn't know Facebook was developing AI to predict suicide risk until Monday, but he said that Crisis Text Line is a proud partner and eager to work with the company to prevent suicide. 

Crisis Text Line trains counselors to deescalate texters from "hot to cool" and uses first responders as a last resort. Facebook's human reviewers confirm the AI's detection of risk by examining the user's posts. They provide resources and contact emergency services when necessary, but do not further engage the user. 

Did you know that self-harm is most frequently reported by texters 13 and younger? More insights and ways to help at https://t.co/ixEAAWHENT

— Crisis Text Line (@CrisisTextLine) August 16, 2017

Filbin expects Facebook's AI to pick up on different signals than what surfaces in Crisis Text Line's data. People who contact the line do so looking for help and therefore may be more explicit in how they communicate suicidal thoughts and feelings. 

One simple example is how texters at higher risk of suicide say they "need" to speak to a counselor. That urgency, compared to "want," is just one factor the line's AI uses to make a judgment about risk. Another is the word "ibuprofen," which Crisis Text Line discovered is 16 times more likely than the word "suicide" to predict that the person texting needs emergency services. 
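That "16 times" figure is a ratio of conditional probabilities: how often conversations containing one word end in an emergency response, compared with conversations containing another. With invented counts (the real figures are Crisis Text Line's), the arithmetic looks like this:

```python
# Worked example of the statistic Crisis Text Line describes. The
# counts below are made up purely to show the arithmetic; only the
# 16x ratio itself comes from the article.
def emergency_rate(conversations_with_word, emergencies):
    """P(conversation required emergency services | word appeared)."""
    return emergencies / conversations_with_word

rate_ibuprofen = emergency_rate(conversations_with_word=200,
                                emergencies=80)    # 0.40
rate_suicide = emergency_rate(conversations_with_word=4000,
                              emergencies=100)     # 0.025

print(rate_ibuprofen / rate_suicide)  # 16.0 -> "16 times more likely"
```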

Filbin said that Crisis Text Line's algorithm can identify 80 percent of text conversations that end up requiring an emergency response within the first three messages.  

That is the kind of insight that counselors, therapists, and doctors hope to one day possess. It's clear that Facebook, by virtue of its massive size and commitment to suicide prevention, is now leading the effort to somehow put that knowledge into the hands of people who can save lives. 

Whether or not Facebook accepts that position — and the transparency it requires — is a question the company would rather not answer yet. At some point, though, it won't have any other option.

If you want to talk to someone or are experiencing suicidal thoughts, text the Crisis Text Line at 741-741 or call the National Suicide Prevention Lifeline at 1-800-273-8255. Here is a list of international resources. 






