this post was submitted on 29 Aug 2023
503 points (98.1% liked)

Meta (lemm.ee)


Hey folks!

I made a short post last night explaining why image uploads had been disabled. This was in the middle of the night for me, so I did not have time to go into a lot of detail, but I'm writing a more detailed post now to clear up where we are now and where we plan to go.

What's the problem?

As shared by the lemmy.world team, over the past few days, some people have been spamming one of their communities with CSAM images. Lemmy has been attacked in various ways before, but this is clearly on a whole new level of depravity, as it's first and foremost an attack on actual victims of child abuse, in addition to being an attack on the users and admins on Lemmy.

What's the solution?

I am putting together a plan, both for the short term and for the longer term, to combat and prevent such content from ever reaching lemm.ee servers.

For the immediate future, I am taking the following steps:

1) Image uploads are completely disabled for all users

This is a drastic measure, and I am aware that it's the opposite of what many of our users have been hoping, but at the moment, we simply don't have the necessary tools to safely handle uploaded images.

2) All images which have federated in from other instances will be deleted from our servers, without any exception

At this point, we have millions of such images, and I am planning to just indiscriminately purge all of them. Posts from other instances will not be broken after the deletion; the deleted images will simply be loaded directly from the other instances.

3) I will apply a small patch to the Lemmy backend running on lemm.ee to prevent images from other instances from being downloaded to our servers

Lemmy has always loaded some images directly from other servers, while saving other images locally to serve directly. I am eliminating the second option for the time being, forcing all images uploaded on external instances to always be loaded from those servers. This will somewhat increase the number of servers which users fetch images from when opening lemm.ee, which certainly has downsides, but I believe this is preferable to opening up our servers to potentially illegal content.
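The core of the patch boils down to one decision per image: is this a local upload, or did it federate in from elsewhere? A minimal sketch of that decision (illustrative Python with an assumed `should_cache_locally` helper; the real Lemmy backend is written in Rust and the actual patch will differ):

```python
from urllib.parse import urlparse

def should_cache_locally(image_url: str, local_domain: str) -> bool:
    """Decide whether an image may be stored on our servers.

    With the patch described above, only images uploaded directly to the
    local instance are stored; everything that federated in is always
    served from its originating instance instead of being downloaded.
    """
    return urlparse(image_url).netloc == local_domain
```

So a lemm.ee upload would still be cached, while a lemmy.world image would always be fetched by the user's browser from lemmy.world directly.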

For the longer term, I have some further ideas:

4) Invite-based registrations

I believe that one of the best ways to effectively combat spam and malicious users is to implement an invite system on Lemmy. I have wanted to work on such a system ever since I first set up this instance, but real life and other things have been getting in the way, so I haven't had a chance. However, with the current situation, I believe this feature is more important than ever, and I'm very hopeful I will be able to make time to work on it very soon.

My idea would be to grant our users a few invites, which would replenish every month if used. An invite will be required to sign up on lemm.ee after that point. The system will keep track of the invite hierarchy, and in extreme cases (such as spambot sign-ups), inviters may be held responsible for rule-breaking users they have invited.

While this will certainly create a barrier of entry to signing up on lemm.ee, we are already one of the biggest instances, and I think at this point, such a barrier will do more good than harm.
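As a rough illustration of the bookkeeping such a system might need — a monthly quota per inviter, plus a record of who invited whom so the chain can be walked back when a spambot shows up. All names and the quota of 3 here are assumptions, not the final design:

```python
import datetime

class InviteLedger:
    MONTHLY_QUOTA = 3  # assumed number of invites per user per month

    def __init__(self):
        self.inviter_of = {}  # invitee -> inviter (the invite hierarchy)
        self.used = {}        # (inviter, "YYYY-MM") -> invites spent

    def invite(self, inviter: str, invitee: str, when: datetime.date) -> bool:
        key = (inviter, when.strftime("%Y-%m"))
        if self.used.get(key, 0) >= self.MONTHLY_QUOTA:
            return False  # quota exhausted; replenishes next month
        self.used[key] = self.used.get(key, 0) + 1
        self.inviter_of[invitee] = inviter
        return True

    def chain(self, user: str) -> list:
        """Walk up the hierarchy, e.g. to find who invited a spambot."""
        out = []
        while user in self.inviter_of:
            user = self.inviter_of[user]
            out.append(user)
        return out
```

Holding inviters responsible then amounts to inspecting `chain(bad_user)` and acting on the accounts closest to the abuse.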

5) Account requirements for specific activities

This is something that many admins and mods have been discussing for a while now, and I believe it would be an important feature for lemm.ee as well. Essentially, I would like to limit certain activities to users who meet specific requirements (maybe account age, number of comments, etc). These activities might include things like image uploads, community creation, perhaps even private messages.

This could in theory limit creation of new accounts just to break rules (or laws).
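A sketch of how such a gate might look — the specific thresholds and activity names below are invented for illustration, not announced lemm.ee policy:

```python
from datetime import datetime, timedelta

# Hypothetical per-activity requirements (illustrative values only).
REQUIREMENTS = {
    "upload_image":     {"min_age_days": 30, "min_comments": 50},
    "create_community": {"min_age_days": 14, "min_comments": 20},
    "private_message":  {"min_age_days": 1,  "min_comments": 0},
}

def may_perform(activity, account_created, comment_count, now=None):
    req = REQUIREMENTS.get(activity)
    if req is None:
        return True  # unrestricted activity
    now = now or datetime.now()
    old_enough = now - account_created >= timedelta(days=req["min_age_days"])
    active_enough = comment_count >= req["min_comments"]
    return old_enough and active_enough
```

The point of the design is that a freshly created throwaway fails every gated check, so creating new accounts just to break rules buys the attacker nothing until they have invested real time in the account.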

6) Automated ML based NSFW scanning for all uploaded images

I think it makes sense to apply automatic scanning on all images before we save them on our servers, and if it's flagged as NSFW, then we don't accept the upload. While machine learning is not 100% accurate and will produce false positives, I believe this is a trade-off that we simply need to accept at this point. Not only will this help against any potential CSAM, it will also help us better enforce our "no pornography" rule.

This would potentially also allow us to resume caching images from other instances, which will improve both performance and privacy on lemm.ee.
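The upload flow could look something like this sketch, where `classifier` stands in for whatever ML model ends up being used; the threshold value is an assumption and would need tuning against the false positives mentioned above:

```python
def accept_upload(image_bytes: bytes, classifier, threshold: float = 0.7):
    """Run the NSFW classifier before anything touches disk.

    `classifier` is any callable returning a probability in [0, 1]
    that the image is NSFW; rejected uploads are never stored.
    """
    score = classifier(image_bytes)
    if score >= threshold:
        return (False, f"rejected: NSFW score {score:.2f} >= {threshold}")
    return (True, "accepted")
```

Lowering the threshold catches more bad content at the cost of more false positives, which is exactly the trade-off the post describes accepting.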


With all of the above in place, I believe we will be able to re-enable image uploads with a much higher degree of safety. Of course, most of these ideas come with some significant downsides, but please keep in mind that users posting CSAM present an existential threat to Lemmy (in addition to just being absolutely morally disgusting and actively harmful to the victims of the abuse). If the choice is between having a Lemmy instance with some restrictions, or not having a Lemmy instance at all, then I think the restrictions are the better option.

I also would appreciate your patience in this matter, as all of the long term plans require additional development, and while this is currently a high priority issue for all Lemmy admins, we are all still volunteers and do not have the freedom to dedicate huge amounts of hours to working on new features.


As always, your feedback and thoughts are appreciated, so please feel free to leave a comment if you disagree with any of the plans or if you have any suggestions on how to improve them.

[–] [email protected] 127 points 1 year ago (5 children)

Personally I say just leave hosting of images to dedicated sites for that purpose. Your efforts are better left to dealing with how to render them. That being said, I used to be in charge of managing abuse on a site that has an average of 20 million posts a month (seriously).

The way I essentially defeated these kinds of attacks was with an image scanning service. It scans for anything NSFW and blocks it. Sometimes things would make it through, but once an admin flagged it we could use that to block the user's IP and account. It's not cheap, but the volume is also not huge yet for lemm.ee, so it might not be too bad.

[–] [email protected] 60 points 1 year ago (1 children)

This is my opinion also. Reddit turned to shit around the time they started self-hosting. Imgur only exists because people needed a place to host reddit images.

[–] [email protected] 15 points 1 year ago (3 children)

Is there a fediverse instance of Imgur?

[–] [email protected] 36 points 1 year ago* (last edited 1 year ago) (3 children)

No, but there's nothing stopping you from using direct links from imgur, in traditional fashion.

It's a little bit convoluted, though. You have to post the image, then hover over and select "Get share links", and then pick the option for BB code (forums). This has the [img] tags at the start and finish, but importantly it has the direct link to the image file. If you use this on lemmy then it will load in the instance, rather than directing to imgur itself.
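In other words, the BB-code share text wraps the direct file URL in `[img]` tags, so pulling out the bare link is straightforward (a hypothetical helper, not part of any Lemmy client):

```python
import re

def direct_link_from_bbcode(bbcode: str):
    """Extract the direct image URL from an imgur BB-code share snippet,
    e.g. "[img]https://i.imgur.com/abc123.png[/img]"."""
    match = re.search(r"\[img\](.+?)\[/img\]", bbcode, re.IGNORECASE)
    return match.group(1) if match else None
```

Pasting the extracted URL into a Lemmy post then embeds the image in the instance UI instead of linking out to imgur's page.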

[–] [email protected] 15 points 1 year ago

I've seen people link to uploads on Pixelfed, though this is probably not the intended use case.

[–] [email protected] 18 points 1 year ago

Yeah genuinely we could all be hosting images for free or cheap on several image sites. Even NSFW images and videos! And it would save our instance admins a lot of headaches and probably some cost too.

[–] [email protected] 82 points 1 year ago (3 children)

You forgot getting the authorities involved when somebody does upload csam

[–] [email protected] 31 points 1 year ago

It's a known tactic by trolls to upload cheese pizza and then notify the media/the authorities themselves because context doesn't matter when it comes to CSAM

[–] [email protected] 14 points 1 year ago

The Lemmy.world team is getting some authorities involved already for this particular case. I am definitely in favor of notifying law enforcement or relevant organizations, and if anybody tries to use lemm.ee to spread such things, I will definitely be involving my local authorities as well.

[–] [email protected] 11 points 1 year ago (1 children)

getting the authorities involved

How do you imagine that playing out? This isn't some paedophile ring trading openly, this is people using CSAM as an attack vector. Getting over-enthusiastic police involved is exactly their goal, and will likely do very little to help the victims in the CSAM itself.

Yes, authorities should be notified and the material provided to the relevant agencies for examination. However that isn't truly the focus of what's happening here. There is no immediate threat to children with this attack.

[–] [email protected] 41 points 1 year ago (1 children)

How do you imagine that playing out?

FBI: Whoa that illegal

Admin: Ya

FBI: We're going to look for this guy

Admin: alright

END ACT 1

[–] [email protected] 12 points 1 year ago (2 children)

This isn't something the FBI have much involvement with. The FBI deal with matters across states.

This isn't America, where you have a bunch of separate states unified under one American government. People haven't been posting porn to lemm.ee. People have been posting porn to other instances, which has seeped through to lemm.ee.

Getting the Estonian law enforcement involved is like trying to get the Californian government involved in dealing with a problem from Texas. Estonian law enforcement have no jurisdiction over lemmy.world or any other instance, and giving them an opportunity is only going to lead to locking down lawful association and communication in favour of some vague "think of the children" rhetoric. And, like I say, it won't do anything to curtail the production of CSAM as the purpose of this attack has little to do with the promotion of CSAM.

Frankly, it could easily be more like:

lemm.ee: We've got a problem with illegal content

Estonian law enforcement: Woah that's illegal.

Estonian law enforcement: You've admitted to hosting illegal content. We're going to confiscate all your stuff.

lemm.ee is shut down pending investigation.

Meanwhile, if lemm.ee continues its current course of action, yet someone notifies law enforcement:

Estonian law enforcement: Woah, we've got a report of something dodgy, that's illegal.

lemm.ee: People tried to post illegal content elsewhere that could have come to our site, we blocked and deleted it to the best of our ability.

Estonian law enforcement: Fair enough, we'll see what we can figure out.

It really matters how and when the problem is presented to law enforcement. If you report yourself, they're much more likely to take action against you than if someone else reports you. It doesn't do you any favours to present your transgressions to them, not unless you're absolutely certain you're squeaky clean.

At this stage and in these circumstances, corrective action is more important than reporting.

[–] [email protected] 61 points 1 year ago (3 children)

For step 6 - are you aware of the tooling the admin at dbzero has built to automate the scanning of images in Lemmy instances? It looks pretty promising.

[–] [email protected] 20 points 1 year ago (9 children)

Yep, I've already tested it and it's one of the options I am considering implementing for lemm.ee as well.

[–] [email protected] 15 points 1 year ago (1 children)

It seems promising but also incomplete for US hosts, as our laws do not allow deletion of CSAM; rather, it must be saved and preserved and sent to a central authority, and not deleted until they give the okay. Rofl.

I also wonder if this solution will use PHash or other hashing to filter out known and unaltered CSAM images (without actually comparing the images, rather their metadata).
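For context, perceptual-hash filtering typically works by comparing an upload's hash against a blocklist of known-bad hashes within a small Hamming distance, so slightly altered copies of a known image still match. A toy sketch — the 64-bit hash values and the distance cutoff are illustrative, and real deployments use vetted hash databases:

```python
def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hash values."""
    return bin(a ^ b).count("1")

def matches_blocklist(upload_hash: int, blocklist, max_distance: int = 8) -> bool:
    # A small distance means "perceptually the same image"; the exact
    # cutoff is an assumption and trades recall against false matches.
    return any(hamming(upload_hash, bad) <= max_distance for bad in blocklist)
```

This is also why a perceptual hash differs from a cryptographic one: re-encoding or lightly cropping the image only flips a few bits instead of producing an unrelated value.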

[–] [email protected] 53 points 1 year ago (3 children)

IMO Lemmy shouldn't have media uploading of any kind. Aside from the CSAM risk, it's unsustainable and I think one of the reasons Reddit went to shit is by getting into the whole image/video/gif hosting.

Dozens of media hosts exist out there, and the mobile/web clients should focus instead on showing remote content better.

[–] [email protected] 36 points 1 year ago (1 children)

The flip side of the argument is that if you also host the media you are not at risk of having broken links. I’ve seen a number of long running forums that had post bodies that contained external images that are now broken.

Of course an argument can be made that the only reason that those forums have lived for so long was due to not having costs associated with hosting media.

[–] [email protected] 10 points 1 year ago (2 children)

That's no worse than a reddit link getting borked because it's been cross-posted and someone managed to kill the original link with a DMCA notice.

[–] [email protected] 50 points 1 year ago* (last edited 1 year ago) (4 children)

Please please do not implement an invite system.

The success of a forum like this depends on people being able to join and express their thoughts freely. Reddit and digg would never have gotten where they are if they had a closed system.

I almost didn't join lemmy because the first two instances I heard about (lemmy.ml and beehaw) had closed registration. I think I applied and then forgot about it for 2 weeks. Thankfully I saw a post about lemmy on reddit yet again and finally found an open instance.

Don't let the actions of a few scumbags ruin a good thing for everyone. You'll be giving them exactly what they want.

[–] [email protected] 42 points 1 year ago (1 children)

I agree that users should be able to join Lemmy freely, but I think it makes a lot of sense to try and spread users out more between instances - this spreads out the responsibilities between more admins, spreads out the load between more servers and also reduces the chance of a single point of failure for the whole system.

It's clear that there are seriously vile people out there who want to cause huge amounts of damage to Lemmy, and if we have unlimited growth in a few selected instances, then these people only have to target those specific instances for maximum damage.

In a perfect world, none of this would be necessary, but then again, in a perfect world, we wouldn't need a decentralized platform in the first place.

[–] [email protected] 12 points 1 year ago* (last edited 1 year ago) (2 children)

Thanks for responding!

I agree that it's best for the lemmyverse.net if there are many big instances too.

Unfortunately, the concept of the fediverse isn't as easy to understand. The average newcomer (who mostly just wants to consume content and occasionally ask a question or two) starts off by interacting within their instance, and it takes some time to figure out cross-instance communication (there are still posts about this on the nostupidquestions-type communities). For such users, landing on a small instance means they'll poke around the Local active posts, think that "this forum is dead", and never return.

Like reddit, having a large userbase on lemmyverse is important to keep the conversation interesting (see https://i.imgur.com/4tXHAO0.png). Reddit has provided lemmy with a huge shot at success by injecting a large number of users. But if I'm being honest, the conversation on the lemmyverse isn't as diverse and engaging as it is on reddit yet. This isn't self-sustaining yet. I can point to 2 pieces of evidence to support this:

  1. Using Voat as a (imperfect) proxy - I don't know if there are official stats of Voat, but the best dataset I've seen for Voat (https://ojs.aaai.org/index.php/ICWSM/article/download/19382/19154/23395) has 16.2M comments in 2.3M submissions from 113k users. Voat was shut down for lack of funding, but even in its heyday it wasn't exactly thriving - many people on Voat were united in their toxicity and it never really got going. Compare these numbers to the lemmyverse which has about 100k active users over the last 6 months. If the fediverse is to grow beyond "that niche forum for nerds", this userbase isn't enough.

  2. It's already clear that the number of active users is decreasing - since mid-July, the number of monthly active users has dropped from 70k to 50k. This is expected (bunch of redditors who joined in June, poked around and said hi and left), but it means if the lemmyverse wants to have any chance of succeeding long term, you can't alienate new users now.

The approach I've been advocating since the beginning of lemmy is:

  • if you see a user who's interested in lemmy but isn't really tech savvy, just point them to one of the biggest instances. Don't explain what federation is, leave it as a feature to be discovered once they're engaged.
  • if you see a user who's interested in the concept of a fediverse and wants to know how it works, explain federation and send them to a smaller instance.

The way federation works now, it's still disadvantageous to be on a smaller instance (discoverability of new communities is harder, syncing posts/comments isn't always fast, it's hard to know which community is more active). Many of these can be fixed with changes to activitypub and the lemmy protocol, but in the meantime, sending casual users to small instances means they'll likely never return.

So to sum up, I think there should be an avenue for casual users to join the biggest instances, even as we encourage people to move to smaller ones (either targeting those who are more tech savvy, or those who have already been on Lemmy long enough to know how it works - I myself was on Lemmy.world and switched to this "smaller" instance).

Anyway, you're the admins here and I have no say over what you eventually do. I'm just hoping you'll consider the practical realities of user behavior - everyone wants what's best for the fediverse in the long term.

[–] [email protected] 25 points 1 year ago (1 children)

If I may, lemm.ee is now the second biggest instance. Redirecting people to register on local instances (feddit.country) or generalist ones (reddthat.com, Lemmy.today, discuss.online etc.) could be reasonable to make those ones grow as well.

I agree that there should be a clear list of instances open for registrations, but that probably needs to wait for the dust to settle a bit first.

[–] [email protected] 11 points 1 year ago (2 children)

While I understand your concerns, this instance has gotten a fair bit larger and will start to suffer the same issues that lemmy.world does if registrations aren't curbed. It can't grow infinitely. That just isn't feasible for one server. Having closed registrations on lemm.ee doesn't stop anyone from signing up on different instances. A solution might be to temporarily limit registration here in some way, and for the devs and instance admins to find a better way of helping new users choose an instance. The initial sign up process was confusing, and could be streamlined to make it easier for people to choose an instance. In the long term, enhancing the way federation works so users who do sign up on smaller/newer instances don't need to be lemmy savvy to find content would also help alleviate that type of issue.

[–] [email protected] 32 points 1 year ago (5 children)

All of this seems good to me except 4 - I hate the thought of any instances being invited only. I'd much prefer it was just a verified user approach (even just an email) with a waiting period for doing things like posting images. Maybe even limit newish users after that period to a small number of image posts a day.

Making an instance feel like a club is going to turn off a lot of people. For sure do what you need to do, but I hope you can avoid that one.

[–] [email protected] 27 points 1 year ago* (last edited 1 year ago)

thank you for your work sunaurus, and i'm sorry you had to sort through this

(particularly annoying though, as i never got around to adding a user banner; and i had one in mind as well. i wish there was some way to externally host avatars and banners)

[–] [email protected] 27 points 1 year ago* (last edited 1 year ago) (4 children)

Forums have existed on the internet forever and have already dealt with this thousands of times previously. You don't need to overthink it or reinvent the wheel. It didn't stop forums from existing very comfortably in the past, and it isn't an issue that should be much different to deal with today.

Simply limit image uploads to a certain account age threshold and karma threshold and you will eliminate 99% of the ability to abuse this.

[–] [email protected] 23 points 1 year ago (1 children)

I'm going to be a part of an invite only community?! Of course, given the circumstances, this is pretty fucked. But I feel kinda fancy right now.

Thanks for all you do on lemm.ee

[–] [email protected] 23 points 1 year ago (1 children)

I left Twitter before musk, when the security chief said that they know they have CP but they were doing nothing.

I can forgive a measure that doesn't work as expected or at 100% but not the inactivity.

Therefore I agree with any measure you think can work, despite any inconvenience for me.

Sorry for any misspelled or wrong word, English isn't my main language

Regards and thanks for all your efforts.

[–] [email protected] 18 points 1 year ago

This has been a great instance since day one, and it's good to see you once again being so proactive. Thank you for the update!

There are downsides with all kinds of moderation, but ultimately most of us accept that the internet can't function as a true free-for-all. Absolutely in support of whatever you feel is necessary to keep the server safe, but please watch out for yourself too and make sure you're asking for help where needed.

p.s. anyone reading this who doesn't donate to the server yet, here's a reminder that that's a thing you can do.

[–] [email protected] 17 points 1 year ago (1 children)

Could you post a guide on disabling the local image cache? I compile from scratch so I’m not afraid of making changes in the code, I just don’t really know rust. I shut down my personal instance and this would allow me to turn it back on.

[–] [email protected] 16 points 1 year ago (6 children)

Got to be honest, having an invite based system and locking certain features behind age of accounts, karma, etc seems like the opposite of the freedom everyone promised me the Fediverse represented when we moved over.

I personally don't really care about images and would prefer image uploads just stay deactivated and we operate as a text only forum but with open membership.

[–] [email protected] 16 points 1 year ago (4 children)

This is something that many admins and mods have been discussing for a while now, and I believe it would be an important feature for lemm.ee as well. Essentially, I would like to limit certain activities to users who meet specific requirements (maybe account age, number of comments, etc). These activities might include things like image uploads, community creation, perhaps even private messages.

Sounds like the old karma requirements some reddit subs had. While I'm not against that, it would restrict locally registered users more so than others who are posting on lemm.ee communities when their host instance has no such system in place. I'm aware that if they post images those would be uploaded to their home instance and linked here with the patch you mentioned above, but the downside is that local users might feel inconvenienced more so than others. Not saying it's a bad idea though, if we are thinking from a "protect lemm.ee" angle first and foremost.

Automated ML based NSFW scanning for all uploaded images

You might want to reach out to the dev of Sync for Lemmy, ljdawson on [email protected], he just implemented an anti-NSFW upload feature in the app to do his part. Essentially, Sync users currently can't post any kind of porn. While I don't think that the CP spammers were using his particular app, or any app to begin with, I do think it's a neat feature to have, but would make much more sense to run server-side.

[–] [email protected] 15 points 1 year ago

I like almost everything on this plan, except for the last 2 items. The account requirements for "extra activities" had best be chosen carefully so as not to encourage the good old "karma farming" that we got away from in leaving Reddit.
And the ML thing for recognizing NSFW is also something to be carefully considered. Too strict and it gets annoying with false positives, it can restrict posting actual content, and too lax won't make a difference for the people actually looking to circumvent it. I think a "vetting" system like the previous item could be better in the long run, in only letting "trusted" people upload content.

[–] [email protected] 14 points 1 year ago (3 children)

I hope there is another option besides just deleting images indiscriminately. I run several comic strip communities and it would be a shame to lose all the posts and work I've put in.

What about implementing Imgur or something similar, assuming they scan for CSAM on their end. For example I often use the Lemmy iOS app and I noticed that all my image uploads using the app are through Imgur.

[–] [email protected] 17 points 1 year ago (1 children)

@TWeaK is correct, I am only deleting our copies of images which are already hosted on other instances.

As for imgur (or any other external image host), such images have always worked on lemm.ee. For example, this is hosted on imgur:

[image: pac-man, hosted on imgur]

In addition to using external images in comments, you are also able to submit posts with imgur images, and they will get embedded directly into the Lemmy UI.

[–] [email protected] 10 points 1 year ago (1 children)

You wouldn't lose the posts you've made; rather, they will be hosted on one instance instead of all of them.

You're a lemm.ee user, if you upload to a lemm.ee community nothing will change.

If you upload to another community, then normally your post would be uploaded to lemm.ee. This would then be federated, and users from other instances would load the same content, but it would be delivered by their own instance.

The change refers to things being hosted only in your host instance. Thus, a lemm.ee user may load content from a lemmy.world server more often. Normally, lemm.ee would copy the content to its own servers and direct its users to that, but now everything will go to the host instance.

The only thing I'm not sure about is who is the host instance? My understanding is that the host instance is that which the user belongs to. Thus, if a lemm.ee user posts to a community in lemmy.world, technically the federated host instance is still lemm.ee - it's about the user, not the community. But with all this I'm not sure.

[–] [email protected] 14 points 1 year ago

Thanks for keeping the community updated and for all the work you put into maintaining it!

[–] [email protected] 13 points 1 year ago (1 children)

I prefer a more text based main post experience so this is gonna be good for me. Reddit used to be a fantastic discussion forum until every single post on /all was either an image post or video post. I wish there was a way to completely disable media posts so I could just view discussion posts.

[–] [email protected] 11 points 1 year ago (2 children)

Lemmy admins need to do whatever it is they can to handle CSAM if and when it arises. Users need to be understanding in this because as I’ve argued in other threads, CSAM itself poses a threat to the instance itself, as it poses a threat to the admins if they cannot clean up the material in a timely manner.

This is going to likely get weird for a bit, including but not limited to:

  • instances going offline temporarily
  • communities going offline temporarily
  • image uploads being turned off
  • sign ups being disabled
  • applications and approval processes for sign ups
  • ip or geoip limiting (not sure if this feature currently exists in lemmy, I suspect it doesn’t but this is merely a guess)
  • totally SFW images being flagged as CSAM. Not advocating against use of ML / CV approaches, but historically they aren’t 100% and have gotten legit users incorrectly flagged. Example

I just want folks to know that major sites like reddit and facebook usually have (not very well) paid teams of people whose sole job is to remove this material. Lemmy has overworked volunteers. Please have patience, and if you feel like arguing about why any of the methods I mentioned above are BS or have any questions, reply to this message.

I’m not an admin, but I’m planning on being one and I’m sort of getting a feel for how the community responds to this sort of action. We don’t get to see it a lot in major social media sites because they aren’t as transparent (or as understaffed) as lemmy instances are.

[–] [email protected] 11 points 1 year ago

Great plan, love seeing someone try to combat this!

[–] [email protected] 10 points 1 year ago

Thanks for this.

As others have pointed out, perhaps just sticking to external hosting for images would make the most sense as long as the costs are manageable.

[–] [email protected] 10 points 1 year ago (1 children)

If every instance did the same, Lemmy as a whole wouldn't support images anymore.

Not that I see a better solution though. Do what you've got to do

[–] [email protected] 21 points 1 year ago

To be clear, only image uploads to lemm.ee are disabled, posting images which are hosted externally is still enabled.

[–] [email protected] 9 points 1 year ago

Seems like a good plan. I have been very impressed with your approach to administering lemm.ee.

Regarding the planned invite system, what would be the consequences of inviting a malicious user? I would think it would be hard to enforce any consequences simply because of the open nature of lemmy as an ecosystem.

[–] [email protected] 9 points 1 year ago (2 children)

I will apply a small patch to the Lemmy backend running on lemm.ee to prevent images from other instances from being downloaded to our servers

If possible, could you tell others how to apply this patch to their own server?
