Facebook Is Calling the Cops on Sad Users: Reason Roundup

Facebook protects itself from risk by putting users in danger. Facebook moderators can’t handle determinations like whether a King Cake baby counts as obscenity. Yet the social media giant has nonetheless appointed itself arbiter of your mental health. And if its bots don’t like what they see, Facebook may report you to police.

As part of “suicide prevention efforts,” Facebook “says it has helped first responders conduct thousands of wellness checks globally, based on reports received through its efforts,” reports CNN. Antigone Davis, Facebook’s global head of safety, told the network: “We are using technology to proactively detect content where someone might be expressing thoughts of suicide.”

Since 2011, Facebook has allowed users to flag potentially suicidal content; reports prompted emails from Facebook urging the poster to call the National Suicide Prevention Lifeline. But starting in 2017, Facebook introduced bots to seek out potentially suicidal content on their own. The bots report suspected cries for help to human moderators, who may then “work with first responders, such as police departments to send help,” says CNN.

That’s right: Facebook might call the cops on you because a bot thought you seemed sad. Facebook executives think that if a user exhibits signs of depression, it’s up to Facebook—not just the user’s friends, family, or community—to intervene.

And why not? By preemptively turning users over to authorities, Facebook saves its own butt. As with so many tech-company precautions framed as serving users’ benefit and public safety, the real protected party here is Facebook itself, which doesn’t want to face criticism and lawsuits if a user who commits suicide could be said to have hinted about it online first.

All it costs is putting people’s lives in danger.

We’ve seen again and again and again how cops called in to deescalate mental health situations wind up hurting and killing those they’ve been called in to help. Cops are not trained psychiatric professionals. Cops are not equipped to talk people out of suicide, nor to assess whether their Facebook posts spell trouble. Cops are not equipped to judge mental health by showing up at someone’s door. And cops are not going to overlook other issues, like drug possession, just because someone is having mental health issues.

Not that Facebook would care. It’s also begun calling police on people in an array of situations, apparently including terms-of-service violations. “We may also supply law enforcement with information to help prevent or respond to fraud and other illegal activity, as well as violations of the Facebook Terms,” the Facebook safety page says.

Mental health researchers are now challenging Facebook’s efforts and raising questions about their ethical implications.

QUICK HITS

  • The national debt is now more than $22 trillion, a record high, having taken a sharp turn upward in the wake of Trump’s tax cuts.
  • Kentucky police killed a kidnapping victim as they were attempting to save her.
  • U.S. Senators will vote on the Green New Deal.
  • Protecting and serving.
  • “These folks need to get a life. And they can FOIA that!”
  • Voiceprints are here.
