this post was submitted on 19 Jun 2023
21 points (88.9% liked)

The Agora

1601 readers
1 users here now

In the spirit of the Ancient Greek Agora, we invite you to join our vibrant community - a contemporary meeting place for the exchange of ideas, inspired by the practices of old. Just as the Agora served as the heart of public life in Ancient Athens, our platform is designed to be the epicenter of meaningful discussion and thought-provoking dialogue.

Here, you are encouraged to speak your mind, share your insights, and engage in stimulating discussions. This is your opportunity to shape and influence our collective journey, just like the free citizens of Athens who gathered at the Agora to make significant decisions that impacted their society.

You're not alone in your quest for knowledge and understanding. In this community, you'll find support from like-minded individuals who, like you, are eager to explore new perspectives, challenge their preconceptions, and grow intellectually.

Remember, every voice matters and your contribution can make a difference. We believe that through open dialogue, mutual respect, and a shared commitment to discovery, we can foster a community that embodies the democratic spirit of the Agora in our modern world.

Community guidelines
New posts should begin with one of the following:

Only moderators may create a [Vote] post.

Voting History & Results

founded 1 year ago
MODERATORS
 

Didn't want to further derail the exploding heads vote thread, so:

What criteria should be applied when determining whether to defederate from an instance? Should there be a specific process to follow, and what level of communication, if any, should there be with the other instance's admins?

For context, it may be useful to look at the history of the Fediblock tag on Mastodon, to see what folks have historically been dealing with, in terms of both obvious and unremarkable bad actors (e.g., spam) and conflicts over the acceptability of types of speech and moderation standards.

(Not saying that folks need to embrace similar standards or practices, but it's useful to know what's been going on all this time, especially for folks who are new to the fediverse.)

For example:

  • Presence of posts that violate this instance's "no bigotry" rule (Does it matter how prolific this type of content is on the target instance?)
  • Instance rules that directly conflict with this instance's rules - for example, if this instance blocks hate speech and the other instance explicitly allows it.
  • Admin non-response or unsatisfactory response to reported posts which violate community rules
    • Not sure if there's a way in Lemmy to track incoming/outgoing reports, but it would be useful for the community to have some idea here. NOT saying to expose the content of all reports, just an idea of volume (a rough sketch of what I mean follows this list).
  • High volume of bad faith reports from the target instance on users here (e.g., if someone talks about racism here and a hostile instance reports it for "white genocide" or some other bs). This may seem obscure, but it's a real issue on Mastodon.
  • Edited to add: Hosting communities whose stated purpose is to share bigoted content
  • Coordinating trolling, harassment, etc.
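On the report-volume point above: here's a rough sketch of what publishing aggregate numbers could look like, without exposing any report contents. This is only illustrative. It assumes Lemmy's moderator-only report-list endpoint (/api/v3/post/report/list) and the field names used by lemmy-js-client (post_reports, creator.actor_id); the exact parameters, auth handling, and response shape may differ between Lemmy versions.

```typescript
// Hypothetical sketch only: tally incoming report volume per originating instance,
// so aggregate numbers could be shared without exposing report contents.
// Endpoint path, paging parameters, auth header, and response field names are
// assumptions based on Lemmy's API and may not match a given instance's version.

interface PostReportView {
  creator: { actor_id: string }; // assumed shape: the user who filed the report
}

interface ListPostReportsResponse {
  post_reports: PostReportView[];
}

async function reportVolumeByInstance(
  instanceUrl: string,
  jwt: string,
): Promise<Map<string, number>> {
  const counts = new Map<string, number>();

  // Page through the moderator-only report list; paging params are assumed.
  for (let page = 1; page <= 20; page++) {
    const res = await fetch(
      `${instanceUrl}/api/v3/post/report/list?page=${page}&limit=50`,
      { headers: { Authorization: `Bearer ${jwt}` } },
    );
    if (!res.ok) break;

    const body = (await res.json()) as ListPostReportsResponse;
    if (!body.post_reports || body.post_reports.length === 0) break;

    for (const report of body.post_reports) {
      // The reporter's home instance is the host portion of their actor_id,
      // e.g. https://other.instance/u/someone -> other.instance
      const host = new URL(report.creator.actor_id).host;
      counts.set(host, (counts.get(host) ?? 0) + 1);
    }
  }
  return counts;
}

// Example usage (requires a moderator/admin JWT for the instance):
// reportVolumeByInstance("https://example-instance.example", "<jwt>")
//   .then((counts) => console.log(Object.fromEntries(counts)));
```

Comment reports would need the same treatment via the comment report list, and only aggregates (e.g., counts per source instance per week) should ever be shared publicly.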

For reference, local rules:

Be respectful. Everyone should feel welcome here.

No bigotry - including racism, sexism, ableism, homophobia, transphobia, or xenophobia.

No Ads / Spamming.

No pornography.

[–] [email protected] 3 points 1 year ago (1 children)

Throwing up your hands and saying "oh well, you're gonna have to personally deal with all the trolls because we don't want to hurt their feelings by enforcing our rules" isn't better.

When the tools are created, they can refederate and use the tools as needed.

[–] [email protected] 1 points 1 year ago (1 children)

Bad faith reports don't imply actual trolls for users to personally deal with.

Performing moderation actions on good faith reports from users is desirable.

Disconnecting your own users from content they find useful because of a volume of reports they can't see or prevent, just because you can't be bothered to do the moderation work, is undesirable.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago) (1 children)

Who decides that the majority are bad faith reports?

Users that want access to that content can, as mentioned a hundred times every time defederating comes up, go migrate or make a second account.

The fact is that there aren't a lot of tools for mods right now, so it's either A) stay federated and let each individual user block trolls, or B) defederate until such mod tools are available, which is apparently something being worked on. Many posters shit on beehaw for defederating, even though their community is predominantly made up of a group that receives intense trolling and has a notably higher suicide rate than baseline, with online harassment a contributing factor. Given that, I don't understand why there is such pushback against defederating to protect the larger community until those tools are available.

But I guess some people who don't have to bear that weight don't appreciate it, and become full-throated defenders of free speech and "just asking questions," despite how that has historically worked out as enabling trolls at all levels.

In addition, these instances are growing fast, and it will be difficult for mods to keep up with their duties even with a full suite of tools. Defederating is just a way to cool things off while assessing the damage versus the potential benefit, and putting the most vulnerable first over users who don't personally mind seeing said content.

[–] [email protected] 1 points 1 year ago (1 children)

Who decides that the majority are bad faith reports?

The moderation team dealing with reports seems to be the only party in a position to judge whether reports are in good or bad faith.

You seem very interested in intentionally mixing bad-faith reports that only the moderation team sees with other types of misbehavior.

An increased level of reporting about a local user account is not a good reason to break the user experience for every local account by defederating.

In addition, these instances are growing fast, and it will be difficult for mods to keep up with their duties even with a full suite of tools. Defederating is just a way to cool things off while assessing the damage versus the potential benefit, and putting the most vulnerable first over users who don't personally mind seeing said content.

It is not a temporary action; it's effectively not reversible, because it breaks links and permanently misses content from the defederated period.

So, no, it's not "a way to cool things off" because it creates more work.

Instances are growing fast, and moderation tools need to get better. However, creating more work for every user on the instance is not an acceptable solution to "the moderation load is increasing, but nothing else bad is going on."

[–] [email protected] 1 points 1 year ago (1 children)

How is "create another account if you want to see their content" more work for the individual user than "onus is on you to block each and every troll from that instance and their communities because you can't block the whole instance yet"? They're both a lot of work.

[–] [email protected] 1 points 1 year ago (1 children)

Why does "people are reporting my account in bad faith" mean there are trolls to be blocked to you?

You are really stuck on ignoring the scope of the reason that started this discussion.

You can't actually see the bad faith reports as a user, so there isn't going to be any reason to block a user or an instance from your perspective.

It is the job of the moderation team to protect you from this mess, not to leave you and every other local user to wonder why your communities are less active and replies seem to come out of nowhere.

Then saying that you can just create another account on a different instance to get back to a functional state is adding insult to injury.

There is no way that asking every user on the instance to move instances is less work than just handling the bad faith reports against one user.

Any admin/moderation team that prioritizes themselves over all of their users can't be trusted any longer.

[–] [email protected] 1 points 1 year ago (1 children)

Why does a user crying that people report them in bad faith mean that the user is implicitly telling the truth? Defederating isn't about removing a single user; it's about blocking a community that allows users to post content that goes against the core instance's values, like no bigotry. You're arguing that because one person on another instance is crying about bad faith reports over their interactions with this instance, we should not look at how that instance is run, and should instead put the onus on users to block what they view as trolling against the core values of a whole community?

Or am I missing something?

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

You seem to be missing a few things:

  1. It's not the user deciding/complaining. They likely don't know it's happening until the moderation team informs them. The moderator has decided the report is a bad faith report about a local user.

  2. The scope is limited to defederation of an instance for being the source of these bad faith reports.

  3. No content from the external instance is relevant for this discussion.

  4. I am not arguing what you claim I am arguing.

I don't know how to explain the concept to you. You seem to be very fixated on misunderstanding. Please stop replying to me.

[–] [email protected] 1 points 1 year ago

Ok so by bad faith reporting you mean something akin to brigading?