Update: I contacted the current big instance owners, other older friends, and lastly some friends from here. Almost all of them live in the US, so they don't want to / can't host it. So I'll keep hosting without being on the moderation side. @[email protected] will post the details, I guess. @[email protected] is the new top admin.
As you know, it has been 2 weeks since I opened the instance and it has grown quite a lot. Likewise, the time I have to devote to this work has increased a lot.
I'm dealing with lemmynsfw more than my IRL job right now :D This is bothering me. Also, running an NSFW instance instead of a normal one makes things much more difficult. If you remember, I had my biggest fuck-up with the "we allow loli content" post :) That situation wore me out. On top of that, a lot of problems are weighing on me, both on the software side and on the community side.
That's why I'm thinking of transferring the instance and the domain to someone I trust, who can maintain the deployments and knows this stuff. I will also hand over any donations made, excluding the current month's expenses.
I'm sorry if I've upset anyone. That's all from me.
Ok so I don't actually see any major change to the codebase there that helps protect the instance owner. The JS patch doesn't seem to be working right; I don't see those blur/autoload controls in my sidebar on desktop. And on the server side, the only commit is apparently to "show nsfw by default".
I think you would be wise to consider more patching to handle things like auto-purging on certain reports, ensuring that deleted content is actually deleted from the server, etc. I haven't reviewed the full Lemmy codebase, but I'm not convinced it's going to be enough to keep things on the up-and-up in an instance like this.
The JS patch was deployed for a very short time (remember when the NSFW box was showing checked but wasn't? Yeah). Any further development seems to have been done on the Lemmy code itself, under different tags (not branches).
Lol I don't think I was here yet for that, but that makes sense. The other part of the patch was lazy loading content which I think was merged into the core codebase? So it makes sense that most of that was eliminated.
Still, I was hoping to see more adjustments aimed specifically at making content moderation easier, given the inherent difficulties this instance faces.
I was wondering about the "ensuring content is actually deleted" bit, as right now there's a bug causing deleted posts to still show for people. That will definitely need to be resolved. In terms of auto-purging, we'd have to look into it, but I would prefer an alternative solution or a higher bar than just certain types of reports (like suspending the content until reviewed, or auto-purging only after a higher number of reports), so that someone can't take down communities just by reporting everything.
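To make the "higher bar" idea concrete, here's a minimal sketch of the escalation logic being proposed: content stays visible below a report threshold, gets suspended (hidden pending review) at a low threshold, and is only auto-purged at a much higher one. The thresholds, names, and `ReportedPost` structure are all hypothetical, not anything in the Lemmy codebase.

```python
from dataclasses import dataclass

# Hypothetical thresholds -- tuning values, not actual Lemmy config.
SUSPEND_THRESHOLD = 3   # hide content pending mod review
PURGE_THRESHOLD = 10    # auto-purge outright

@dataclass
class ReportedPost:
    post_id: int
    report_count: int = 0
    suspended: bool = False
    purged: bool = False

def handle_report(post: ReportedPost) -> str:
    """Escalate a post through visible -> suspended -> purged as reports pile up."""
    post.report_count += 1
    if post.report_count >= PURGE_THRESHOLD:
        post.purged = True
        return "purged"
    if post.report_count >= SUSPEND_THRESHOLD:
        post.suspended = True
        return "suspended"
    return "visible"
```

The point of the two-tier design is exactly the concern above: a single bad actor mass-reporting a community only gets content temporarily hidden for review, not destroyed, because irreversible purging requires a much higher bar.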
I expect we'll need to patch our own tools; the NSFW angle is something very specific that the core Lemmy devs probably don't want to deal with. Let me know how and where I can assist.