BOB: And I’m Bob Garfield. Two weeks ago, in a Washington State high school, a 15-year-old homecoming prince, Jaylen Fryberg, shot four of his schoolmates and himself. Maybe it was over a girl, maybe it was something else. No one can say for sure. But his Twitter feed was filled with rage and despair. “It breaks me,” he wrote. “It actually does. I know I seem to be sweating it off...but I’m not...And I never will be able to.” That was three days before the shooting. Another tweet: “It won’t last. It’ll never last.” That was the night before. Could an app have prevented what happened next? For 61 years the UK suicide prevention charity called Samaritans has been coming to the aid of vulnerable strangers at risk of doing themselves harm. Well, now there’s an app for that: Samaritans Radar, which monitors the Twitter feeds of those the user follows and flags language that may suggest suicidal thoughts. If you download Samaritans Radar and, say, follow me on Twitter, and I tweet ‘I just want to end it all’ or ‘help me,’ you will be alerted - and you can spring into action. David Meyer wrote about the app for the technology website GigaOm, and at first was bullish on the idea of applying digital technology to help prevent tragedy. But others saw all sorts of problems with Samaritans Radar. And now, so does he.
MEYER: Quite a few people pointed out that not everyone who follows somebody is that person's friend, so people could use this if they are so-called trolls - abusive internet people - to find people to pick on, as it were. 'This person's feeling low, I'm going to try to make them feel lower.' Another very commonly expressed criticism was that a lot of people use Twitter just to get stuff off their chest. They're not looking for a response; they're not using it as a cry for help. One of the recurring things that I've seen is that people who are suffering from mental health issues and use Twitter to get stuff off their chest have said that they will leave Twitter because they don't find it a safe place anymore. And some people have claimed already that they have left Twitter because of this app.
BOB: And then of course, there are false positives. Something innocuous in a tweet that the Samaritans Radar identifies as a potential cry for help.
MEYER: Well, certainly a problem. People are prone to exaggeration for comedic effect, let's say. Local store out of your favorite brand of cereal: 'Oh, I could just die,' you know, 'I want to end this all now.' This is something that searches for key phrases, key words. It doesn't recognize sarcasm and irony. The problem there is it may well worry somebody as well. I mean, let's say somebody is following you who does actually care for you; they will get a notification saying that this person is sounding distressed. They may not know the context of the 'I want to end it,' or it might be, let's say, a separate tweet that follows on from a previous tweet that established the context. And one of the key problems here as well is that the person who actually made the tweet doesn't know somebody else has been sent an alert saying that they are sounding possibly suicidal. The only people who opt in are the people who get the alerts about those they follow.
BOB: Another kind of haunting quality to Samaritans Radar is that while it is not Big Brother, it's hundreds of thousands of little brothers who, without your knowledge whatsoever, are suddenly monitoring your emotional state on a tweet-by-tweet basis. That's kind of creepy.
MEYER: It is. It's creepy because of the automation of it. The fact that you've essentially said, 'automated system, keep an eye on this person for me.' I mean, let's face it, Twitter is to a great extent a public medium. But this is kind of an automated stalking mechanism. One with good intentions, but that's sort of what it is anyway.
BOB: What about the legality of Samaritans Radar in Europe, especially where data privacy has been regulated and legislated certainly more vigorously than it has here?
MEYER: It is of dubious legality. A lot of people don't see this mechanism as being in compliance with basic data protection law. You know, even if you pull back from the specific legalities of it, in terms of ethics it really comes down to consent. The Samaritans keep pointing out that people opt in to the system, but the people who opt in are the people who are doing the watching, not the people who are being watched. And that's a problem. It's important to note, I think, that the Samaritans are also already the suicide prevention partners of Twitter and Facebook in the UK and the Republic of Ireland as well. I mean, if you see somebody's tweet and you think that it sounds at-risk, before this system came in you could already report it, as it were, to Twitter or Facebook, who would then pass it on to the Samaritans, who would try to get in touch with that person. But again, it's the automation of this that's the issue.
BOB: Nobody has imputed to the Samaritans any kind of sinister intentions. They've been at this now for more than 60 years. But the backlash has been significant. How has the organization reacted?
MEYER: A lot of people are seeing it as having a little bit of a tin ear on this. You know? It doesn't seem to be taking into account the crux of the complaints. They say that it's legal; they say that they've taken legal advice. They claim that they're not even processors of the data - an absurd statement to make unless somebody else is running the app, which doesn't seem to be the case. This is something that clearly comes, you know, with good intentions. It was worked out with researchers, it's trying to solve a problem, and to a large extent it might solve that problem, but it's actually creating other problems. Which the people who are supposed to be helped here find very distressing, frankly.
BOB: David, I thank you.
MEYER: It was a pleasure.
BOB: David Meyer is Senior Writer for GigaOm covering Europe and issues related to privacy and security.