While I'm at it, let's talk about suicide monitoring software that school districts use: 🧵👇
This software sends an alert when students type suicide-related keywords on their school computers. Whether this is ethical is not something I am qualified to answer. However, let's talk about the more clinical aspects of this practice.
First - flagging these entries is a form of screening. Screening is guaranteed to produce many, many false positives: the whole point is to catch every true case (high sensitivity), so you deliberately accept a large number of false alarms to do it. And because the true base rate is low, most flags will be false positives.
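To make this concrete, here's a minimal sketch of the base-rate math in Python. All the numbers (1% prevalence, 95% sensitivity, 90% specificity) are hypothetical illustrations, not published estimates for this software:

    # A minimal sketch of why screening guarantees false positives.
    # All numbers below are hypothetical illustrations.

    def positive_predictive_value(prevalence, sensitivity, specificity):
        """Bayes' rule: P(true case | positive flag)."""
        true_pos = sensitivity * prevalence            # flagged AND a true case
        false_pos = (1 - specificity) * (1 - prevalence)  # flagged but NOT a true case
        return true_pos / (true_pos + false_pos)

    # Hypothetical: 1% of students are true cases; the keyword filter
    # catches 95% of them (high sensitivity, the goal of screening)
    # and correctly ignores 90% of everyone else.
    ppv = positive_predictive_value(prevalence=0.01, sensitivity=0.95, specificity=0.90)
    print(f"Share of flags that are true cases: {ppv:.1%}")  # ~8.8%

Even with a filter that catches 95% of true cases, roughly 9 out of 10 alerts in this scenario are false positives. That's the trade-off screening deliberately makes.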
Second - keep in mind that estimating risk is not the same as accurately predicting an outcome. We cannot predict who will die by suicide. To what degree typing certain words into a computer indicates increased risk (and what that increase would actually mean) is not yet clear.
Okay - so beyond the ethics of the surveillance itself - the devil is in the details of how to respond to a hit from this software. Schools have very limited resources - especially if a hit comes after hours.
In the treatment literature for suicidality - researchers consistently emphasize that we need to move away from 'invasive' interventions such as involving the police or hospitalizing someone in all but a handful of key circumstances, because the harms likely outweigh the benefits.
However, schools have limited options beyond these measures. Maybe a counselor conducts an appropriate risk screening. But if that screening indicates risk - many schools are back to the same options: police/hospital.
This is especially true given that schools are worried about the liability if they do not send the student to the hospital - even if it's not clinically indicated. And there is truth in the position that schools are neither designed for nor responsible for providing healthcare.
This of course brings larger issues into the picture: a lack of a true mental healthcare system for youth, lack of universal health insurance, lack of evidence-based treatment in the community, highly qualified providers not taking insurance, etc.
So far I've talked about police/hospital not being clinically indicated - and this doesn't even touch the real danger people of color can face in these interactions, whether or not they're indicated.
The point being - false positives carry serious costs for the student, family, & school, in addition to the ethical concerns with surveillance. Until we have a solid system for addressing hits, we have no way to know if software like this does more harm than good.
You can follow @ereinbergs.