this post was submitted on 28 Aug 2023
1030 points (97.2% liked)

Memes

45928 readers
1455 users here now

Rules:

  1. Be civil and nice.
  2. Try not to excessively repost; as a rule of thumb, wait at least 2 months if you have to.

founded 5 years ago
[–] [email protected] 118 points 1 year ago (1 children)

Tell your friend to log the IP address and report it to the authorities. They might need to turn over the entire modlog as well

[–] [email protected] 55 points 1 year ago

This is likely in reference to the federation of such images posted elsewhere

[–] [email protected] 75 points 1 year ago (1 children)

There's always someone who doesn't mind ruining it for everyone else. Probably safest to just delete all the images; that way there's no need to look.

[–] [email protected] 63 points 1 year ago (1 children)

Bad actors will try to nuke the entire platform to maintain a monopoly on this format of communication and community.

[–] [email protected] 35 points 1 year ago (1 children)

Who could you posspezibly be referring to?

[–] [email protected] 3 points 1 year ago

Is it the android? The lone skum? Or someone else entirely?

[–] [email protected] 64 points 1 year ago

Once again reaffirming why I refuse to host an instance. If I ever do, I’m not federating with any of you degenerates lol

[–] [email protected] 20 points 1 year ago (1 children)

I'm glad s/he was able to nuke the CSAM, even if other material was nuked with it. This crap is why I'm not hosting.

Please, call it CSAM (child sexual abuse material) and not CP (child pornography). The children in these photos/videos can't make pornography, they're sexually abused into making this material. CP insinuates that it's legitimate porn with children. CSAM, on the other hand, calls it what it is: sexual abuse of children.

[–] [email protected] 32 points 1 year ago (10 children)

That is needlessly pedantic. I have never heard of anyone using the word pornography to imply legality or moral acceptability. There is no such thing as "legitimate" CP, so there is no need to specify that it's not OK every time it is mentioned. No one in their right mind would presume he's some kind of CP-supporting monster for failing to do so.

[–] [email protected] 12 points 1 year ago

If we spent more time fixing things rather than naming them the world would be a better place.

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago) (2 children)

No one in their right mind would assume that OP is. But the term was created to legitimize the material. So while you're correct that it's picky, it's picky for a reason. Words are powerful. We should fight against legitimizing that term, among other things.

[–] [email protected] 15 points 1 year ago (1 children)
[–] [email protected] 9 points 1 year ago

I know that guy Tobias Fünke, although he's also an analyst. He had some clever abbreviation for that as well!

[–] [email protected] 13 points 1 year ago

Bless you ❤️

[–] [email protected] 12 points 1 year ago (2 children)

I'm not gonna lie, I'm surprised it took this long for some dipshit to try something like this. Lemmy's security has more holes in it than a piece of Swiss cheese and we're fools if we think it's viable enough for it to serve as a long-term home for new social media.

We really, really need a better social structure than federation.

[–] [email protected] 16 points 1 year ago (1 children)

Lemmy’s security has more holes in it than a piece of Swiss cheese

This has very little to do with security. There's nothing inherently "insecure" about posting CSAM, since the accounts and images were likely created and posted just like any others.

What really needs to happen is some sort of detection of that kind of content (which would likely require a large change to the code) or additional moderation tools.
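In practice, that kind of detection usually means checking uploads against a database of hashes of known-bad material (production systems use perceptual hashes like PhotoDNA so re-encoded copies still match). A minimal sketch of the idea, using exact SHA-256 matching and an entirely hypothetical blocklist:

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known-bad files.
# (The single entry here is just the digest of an empty file, so the
# example is runnable; a real list would come from a trusted source.)
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_blocked(image_bytes: bytes) -> bool:
    """Return True if the uploaded bytes match a known-bad hash."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

# An upload hook would call this and reject the file before it ever
# reaches storage or federates out:
assert is_blocked(b"")                 # matches the example digest
assert not is_blocked(b"harmless")     # unknown content passes through
```

Exact hashing only catches byte-identical copies; the reason services license perceptual-hash databases is that attackers trivially defeat exact matching by re-encoding the image.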

[–] [email protected] 5 points 1 year ago (1 children)

The lack of those tools is what I was talking about

[–] [email protected] 11 points 1 year ago (1 children)

Ah, okay. Those aren't generally considered security, but I can understand why you went that route, I suppose.

[–] [email protected] 3 points 1 year ago (1 children)

Does anyone know why they were never put in?

[–] [email protected] 6 points 1 year ago

Software development is a balancing act. You need to pick and choose not only which features to add, but when to add them. Sometimes mistakes are made in the planning and you get a situation like this.

What likely happened is that these kinds of features were deemed less likely to be needed, since the majority of Lemmy users will never run into the need for them, and there is technically a way to handle the situation (nuking your instance's image cache). But you'll likely see a reshuffling of priorities if these kinds of attacks become more prevalent.

[–] [email protected] 9 points 1 year ago (1 children)

Lemmy's security

I think you misspelled moderation tools. A nice quick fix would have been to block posts from new users on X instance and pin a post briefly covering why - they'll eventually run out of instances that don't have open signups, IMO, or just give up.

Another mod-tools option would be rate limiting of posts, e.g. users can only make a new shitpost every 10-15 min, rather than unlimited times per minute.
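The rate-limiting idea above boils down to a per-user cooldown. A rough sketch (class name and the 10-minute default are illustrative, not anything Lemmy actually ships):

```python
import time
from collections import defaultdict
from typing import Optional

class PostRateLimiter:
    """Allow each user at most one post per `interval` seconds."""

    def __init__(self, interval: float = 600.0):
        self.interval = interval
        # Timestamp of each user's last accepted post; -inf means
        # the user has never posted, so the first post always passes.
        self.last_post = defaultdict(lambda: float("-inf"))

    def try_post(self, user: str, now: Optional[float] = None) -> bool:
        """Return True and record the post if the user is allowed."""
        now = time.monotonic() if now is None else now
        if now - self.last_post[user] < self.interval:
            return False  # too soon since this user's last post
        self.last_post[user] = now
        return True

limiter = PostRateLimiter(interval=600)  # 10-minute cooldown
assert limiter.try_post("alice", now=0)       # first post goes through
assert not limiter.try_post("alice", now=60)  # 1 min later: blocked
assert limiter.try_post("alice", now=601)     # cooldown elapsed: allowed
```

A real implementation would also age out idle users and apply a stricter limit to brand-new accounts, which is where it overlaps with the "block posts from new users" suggestion above.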

[–] [email protected] 10 points 1 year ago (1 children)

In the meantime, my YunoHost-based instance, which still hasn't managed to get Pict-RS working and therefore can't store images even if it wanted to, is doing juuuuust fine

[–] [email protected] 6 points 1 year ago

Come to think of it, if you're the only user, it's kinda protecting you, isn't it? (hello, fellow YunoHost user!)
