But according to new research from the Stanford Internet Observatory, in many cases, platforms have no policies related to discussion of self-harm or suicide at all.
In contrast, Instagram and Reddit have no policies related to suicide in their primary policy documents.
But researchers faulted the company for unclear policies at its Instagram subsidiary; technically, the parent company’s policies all apply to both platforms, but Instagram maintains a separate set of policies that do not explicitly mention posting about suicide, creating some confusion.
Reddit, Parler, and Gab were found to have no public policies related to posts about self-harm, eating disorders, or suicide.
The platforms offer meaningful support in their policies both for people who are recovering from mental health issues and for those who may be considering self-harm, the authors said.
Researchers could not find public policies for suicide or self-harm for NextDoor or Clubhouse.
Two is that policies offer platforms a chance to intervene when their users are considering hurting themselves.
(Many do offer users links to resources that can help them in a time of crisis.) And three is that we can’t develop more effective policies for addressing mental health issues online if we don’t know what the policies are.
And even on the most serious of subjects — how to address content related to self-harm — some platforms haven't even entered the discussion.
The Stanford researchers told me they believe they are the first people to even attempt to catalog self-harm policies among the major platforms and make them public. In the future, I hope these companies collaborate more — learning from one another and adopting policies that make sense for their own platforms.