submitted 1 day ago* (last edited 1 day ago) by [email protected] to c/[email protected]

Reddit currently has a feature titled:

“Someone is considering suicide or serious self-harm”

which allows users to flag posts or comments when they are genuinely concerned about someone’s mental health and safety.

When such a report is submitted, Reddit’s system sends an automated private message to the reported user containing mental health support resources, such as contact information for crisis helplines (e.g., the 988 Suicide & Crisis Lifeline and crisis text and chat services).

In some cases, subreddit moderators are also alerted, although Reddit does not provide a consistent framework for moderator intervention.
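
To make the mechanics concrete, here is a minimal sketch of how an instance-side bot could replicate that report-to-resources flow on Lemmy. To be clear about the assumptions: the endpoint paths and payload shapes reflect my reading of Lemmy's v3 HTTP API and may differ between versions, the `self-harm-concern` reason prefix is a made-up convention, and `lemmy.example` is a placeholder instance.

```python
# Hypothetical sketch: poll an instance's report queue and DM crisis
# resources to the author of any post reported with a "self-harm-concern"
# reason. Endpoint paths and payload shapes are assumptions based on
# Lemmy's v3 HTTP API and may differ between Lemmy versions.
import time
import requests

INSTANCE = "https://lemmy.example"  # placeholder instance
JWT = "..."                         # auth token for a mod/admin bot account
RESOURCES = (
    "Someone was concerned about you and flagged your post. "
    "If you are struggling, help is available: in the US, call or text 988 "
    "(Suicide & Crisis Lifeline); elsewhere, see https://findahelpline.com."
)

already_messaged: set[int] = set()  # avoid repeat DMs to the same user

def poll_once() -> None:
    # List unresolved post reports (a mod/admin-only view in Lemmy).
    resp = requests.get(
        f"{INSTANCE}/api/v3/post/report/list",
        params={"unresolved_only": "true"},
        headers={"Authorization": f"Bearer {JWT}"},
        timeout=10,
    )
    resp.raise_for_status()
    for item in resp.json().get("post_reports", []):
        reason = item["post_report"]["reason"]
        author_id = item["post_creator"]["id"]
        if not reason.startswith("self-harm-concern"):
            continue
        if author_id in already_messaged:
            continue
        # Send the automated support message as a private message.
        requests.post(
            f"{INSTANCE}/api/v3/private_message",
            json={"content": RESOURCES, "recipient_id": author_id},
            headers={"Authorization": f"Bearer {JWT}"},
            timeout=10,
        ).raise_for_status()
        already_messaged.add(author_id)

while True:
    poll_once()
    time.sleep(60)
```

A real deployment would also need locale-aware resources, rate limiting, and a way to opt out, which is exactly where the questions below come in.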


The goal of the feature is to offer timely support to users in distress and reduce the likelihood of harm.

However, there are valid concerns about the feature: false reports can be used to harass users, and moderators get little tooling or guidance for handling these sensitive situations.


Given Lemmy's decentralized, federated structure and commitment to privacy and free expression, would implementing a similar self-harm concern feature be feasible or desirable on Lemmy?


Some specific questions for the community:

- Would this feature be beneficial for Lemmy communities/instances, particularly those dealing with sensitive or personal topics (e.g., mental health, LGBTQ+ support, addiction)?
- How could the feature be designed to minimize misuse or trolling while still reaching people who genuinely need help?
- Should moderation teams be involved in these reports? If so, how should that process be managed given the decentralized nature of Lemmy instances?
- Could this be opt-in at the instance or community level to preserve autonomy?
- Are there existing free, decentralized, or open-source tools/services Lemmy could potentially integrate for providing support resources?


Looking forward to your thoughts, especially from developers, mods, and mental health advocates on the platform.


https://support.reddithelp.com/hc/en-us/articles/360043513931-What-do-I-do-if-someone-talks-about-seriously-hurting-themselves-or-is-considering-suicide

[-] [email protected] 15 points 1 day ago

The existing reporting framework already covers this. Report such posts so that they can be removed ASAP.

Mods/admins should not be expected to be mental health professionals, and internet volunteers shouldn't have to shoulder that burden.
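
For anyone curious, the report itself is already a single API call. A rough sketch, assuming the v3 endpoint path and payload (both may vary by Lemmy version; `lemmy.example` is a placeholder):

```python
# Rough sketch: report a post through Lemmy's existing report framework.
# Endpoint path and payload shape are assumptions based on the v3 HTTP API
# and may differ between Lemmy versions.
import requests

INSTANCE = "https://lemmy.example"
JWT = "..."  # auth token of the reporting account

requests.post(
    f"{INSTANCE}/api/v3/post/report",
    json={"post_id": 12345, "reason": "Poster may be at risk of self-harm"},
    headers={"Authorization": f"Bearer {JWT}"},
    timeout=10,
).raise_for_status()
```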
