Rule 34 rule (lemmy.world)
submitted 1 week ago* (last edited 1 week ago) by [email protected] to c/[email protected]

Edit: WE DON'T TALK ABOUT NUMBER 11.

[-] [email protected] 45 points 1 week ago

Controversial opinion: If killing npcs in video games is fine and shouldn't land you in prison for murder, because they are fictional and not real people, then porn of "underage" fictional characters is also fine and shouldn't be illegal.

Finding something disgusting is not a proper reason to make something illegal. The only relevant aspect is whether it causes harm to others or not.

  • CSAM harms children -> should be illegal and punished
  • fictional drawings don't harm anyone because no actual people are involved -> should be legal
[-] [email protected] 30 points 1 week ago

I'm not talking about legal judgement (I practically never do) and I'm not even talking about legit PDF files.

It's just disgusting that rule 34 of literal children scores this high.

[-] [email protected] 35 points 1 week ago
[-] [email protected] 17 points 1 week ago

But saying PDF files is funny

[-] [email protected] 23 points 1 week ago

Don't disgrace the good name of my friend, Portable Document Format

[-] [email protected] 11 points 1 week ago

Adobe did that a long time ago.

[-] [email protected] 7 points 1 week ago

Can we start calling them Adobe Lovers?

[-] [email protected] 11 points 1 week ago

Legality and morality don't necessarily align. I would find it very immoral, but as far as I know, not illegal, to get off to drawings of children. Additionally, what's the difference between a photo of a child and a realistic drawing of a "fictional character" that looks like said child? I think getting off to children is wrong, regardless of criminality. If that's something someone desires, they should seek help, not indulge in fantasy.

[-] [email protected] 10 points 1 week ago* (last edited 1 week ago)

to get off to drawings of children

They are not drawings of children, they are drawings of fictional characters that look like children. That is an important distinction here I think. Obviously getting off to a drawing of a real child is wrong.

what's the difference between a photo of a child and a realistic drawing of a "fictional character" that looks like said child?

That's my whole point, it makes all the difference. One is an actual human being who feels emotions and is harmed by the creation and spread of CSAM, while the other literally doesn't exist.

That's why I think it is not actually immoral. (I believe morality and legality should align anyway.) Then again, that's why we watch fictional shows in the first place.

I think your disgust might come from anthropomorphising the fictional character and feeling empathy towards it?
(Of course you are entitled to your feelings)

[-] [email protected] 2 points 1 week ago

I have always felt the "actually she's 1000 years old and just looks like a child" argument is both ridiculous and disingenuous. They're interested because she looks like a child, not because of her character's supposed age. Again, but rephrased: what's the difference if someone makes a character that looks like a real child but is fictional and much older in their characterization? At what point is it morally acceptable? Do you need to use an ambiguous art style? Do you need to include inhuman character traits? I simply cannot take the argument seriously, because clearly the character looking like a child is important. What difference does the story you tell yourself about their age make? Why not just pretend real CSAM is just young-looking aliens that are a million years old? If it looks like a child, I believe it's unequivocally immoral, and there is no line you can draw that would convince me that a childlike drawing that falls on the "OK" side of the line isn't immoral.

[-] [email protected] 1 points 6 days ago

At least based on the actual psychology research on the topic, access to fictional material for masturbation purposes has actually been shown to be the most effective method to prevent abusive urges and relapse.

Though considering how hard it is to find funding and people willing to undergo therapy, it's a struggle to find reliable data.

One of the leading experts on the topic did an ask-me-anything a few years back.

Reddit ended up banning her and nuking the thread due to the subject matter, but it had a lot of research shared in it that is now locked behind paywalls and basically buried.

[-] [email protected] 1 points 5 days ago

I would need to see multiple, peer reviewed sources. It's a very sensitive topic, but regardless, I don't believe that that material should be floating around. Access it through a psychiatrist, or therapist.

[-] [email protected] 1 points 1 week ago

I have always felt the "actually she's 1000 years old and just looks like a child" argument is both ridiculous and disingenuous

I haven't made that argument

[-] [email protected] 8 points 1 week ago

It's the same argument, that the character only looks like a child, but isn't. I chose a hyperbolic example for emphasis, but it's the same argument. It looks like a child. That's the point.

[-] [email protected] 6 points 1 week ago

In the UK it doesn't matter if it's a photo of a real person or not; porn that depicts a child is illegal.

Tbh I think with the use of AI at this point this might be a pretty good law to have. "It was AI generated" is not a defence. Realistically, if you are doing it for yourself, no one will find out, so there is some kind of argument for it being harmless, but when you start doing other things with those images, such as using them for blackmail, the police should be able to use that as sufficient evidence to charge you.

[-] [email protected] 0 points 1 week ago

That's just the UK though. They have their own issues

[-] [email protected] 2 points 1 week ago

We certainly have issues, but intolerance of paedophilia isn't one of them. I'm happy we see it that way.

[-] [email protected] 0 points 1 week ago

That's the thing. Y'all do tolerate it. There's article after article about letting them get away with just a slap on the wrist

[-] [email protected] 1 points 1 week ago* (last edited 1 week ago)

Yeah, definitely far from perfect

...or even good, a lot of the time

[-] [email protected] 3 points 1 week ago

I still find it fucked up that so many people are aroused by sexualising children, even though they are fictional.

[-] [email protected] 2 points 1 week ago

loli haet pizza

[-] [email protected] -5 points 1 week ago

I'm normally an anti-slippery slope person but there's a definite escalatory nature to how paedophiles operate, it's easy to see how images of fictional children can evolve into images of actual children.

[-] [email protected] 27 points 1 week ago

How is this argument different than the "video games cause violence" argument?

[-] [email protected] -3 points 1 week ago

Because as I said, people with an interest in CSAM tend to escalate in their behaviour. Most don't jump straight into child pornography but start with less serious things like jailbait and non-sexualised images of children etc. It doesn't take a huge leap to see the same pathway with images of fictional children.

I'm not suggesting that everyone into this material will go on to abuse children or start to consume actual CSAM but there's a non-zero amount of actual paedophiles who started their journey with this exact material.

[-] [email protected] 9 points 1 week ago

Isn't that the same with violence? People who want to commit acts of violence may start by acting out violence in video games.

[-] [email protected] -3 points 1 week ago* (last edited 1 week ago)

The "video games cause violence" argument is wrong, because the vast majority of gamers don't try to use games as a substitute for violent behaviour.

But there are of course at least some mass murderers and school shooters who played violent games in order to fulfill their violent fantasies, couldn't do so in the long term, and murdered real people instead.

Same goes for pedophiles. They want to fuck a child, use fictional characters to fulfill the fantasy, get used to it, and then escalate to pictures of real children and eventually real children.

[-] [email protected] 5 points 1 week ago

Do you have any real-world or professional experience with people suffering from pedophilia?

I don't, but I doubt that stance, as it seems to unfairly project some idea of how things have to work onto those people.

I for one can jerk off to the weirdest porn fantasy things, from the usual thinly-veiled step-siblings trope to rape play, weird power dynamics or tentacle porn (depictions of children not among them, I feel I have to mention that explicitly here), but I don't have any desire to experience any of that in real life, because I absolutely know it would be wrong.

I don't see why that should necessarily be different for them.

[-] [email protected] 5 points 1 week ago* (last edited 1 week ago)

Do you have any real-world or professional experience with people suffering from pedophilia?

I have been schooled on pedophilia in a professional context, during my undergrad and in a work context. Yet I have not worked with pedophiles directly, nor have I had any contact with one that I would be aware of.

Looking at the science, both of our positions are reflected. As with so many things, the answer is not a simple "yes" or "no".

If you want to take a bit of a deep dive, I recommend this study from 2023.

It looked at both of our positions: FSM (Fantasy Sexual Material) leads to real sexual violence against children vs. FSM reduces the risk of said practice.

Here is their summary of my position (FSM leads to sexual violence against children):

When applying the motivation-facilitation model to the context of FSM use, it can be theorized as to why, for a subset of users, engaging with such material could become problematic and increase the likelihood of committing a child sexual offense (whether that be offline or online), while for others this is not the case. As pedohebephilia is thought to be a motivating factor towards sexual offending in Seto’s model [21], engaging with FSM relating to children could heighten sexual arousal and therefore act as a facilitator to increase offending likelihood. With abstinence from masturbation being self-reported as a risk-management technique by some people who are attracted to children [16], this is a recognized idea by some members of the community. Over time, engaging in CSEM (especially forms such as child-like sex dolls, which offer a more realistic sexual experience) may contribute to the development of offence-supportive beliefs and the adoption of implicit theories about the acceptability of engaging in sexual activity with children (or child-like targets). The combination of enhanced sexual arousal to children (a potential motivator of offending), coupled with the development of permission-giving beliefs (facilitators of offending), may subsequently increase the risk of abuse being committed by somebody with attractions to children.

Here they are summarising your position, as far as I understand it (FSM helps to prevent sexual child abuse):

Alternatively, FSM use could be seen as beneficial by the motivation-facilitation model and instead reduce the likelihood of offending. Rather than heightening arousal, FSM could act as a safe sexual outlet that allows for a feeling of release and sense of catharsis [84], which could reduce a motivation to seek out real children as a sexual partner. Engaging with FSM also avoids the problematic suppression of sexual interests, with such suppression being linked to increases in self-perceived risks for offending among those with attractions to children [15]. In contrast to Stevens and Wood [16], Houtepen and colleagues reported that engagement in masturbatory fantasies was a common coping mechanism used by some people experiencing attractions to children, avoiding the need to access CSEM due to an alternative outlet being identified [3•].

They conclude, basically, that more research is needed:

Given the present lack of understanding of FSM and how they are used, it is important to identify the factors associated with use and whether they are risk-enhancing or risk-reducing (i.e., protective). This knowledge could be beneficial to clinicians in the search for more effective methods to support people who are attracted to children when they are seeking help to manage their sexual interests. Nonetheless, Seto’s motivation-facilitation model provides a theoretical framework for thinking about this topic in a more nuanced way [21].

So, there we are. A long post to say maybe.

I suppose both of our positions might be viable, and it can't yet be said under what circumstances each of us might be right.

[-] [email protected] 3 points 1 week ago

Thanks for putting in the work to craft such a well-rounded response on a topic that is usually easily derailed.
You seriously exceeded my expectations.

I cannot read through that whole review right now, but it was an interesting read for as far as I got, taking into account a lot of nuance that one doesn't usually find in public debates. (Or at least that I wouldn't expect to find on a taboo topic like this. It's not like it's something I regularly discuss, just something I had thoughts about when reading comments under a random meme.)

I guess I have nothing to add. You clearly understood my meaning very well, and as the paper states, there's probably more research needed. I'd hypothesize that engaging with such material can lead to both outcomes, just as it can for depictions of other illegal activities. The case might be warped, however, because people who are exclusively attracted to children have no way of acting on that with a real person without it being illegal and immoral, while people who are (also) attracted to adults have a plethora of possibilities. (And can live out at least some of their sexuality, even if they have trouble finding a partner who consensually takes part in their most extreme kinks.)

So yeah, complex issue. Thanks for your valuable contribution.

[-] [email protected] 1 points 1 week ago

So yeah, complex issue. Thanks for your valuable contribution.

On the contrary, I was happy to engage with a position that wasn't along the lines of "all pedophiles should commit suicide".

It is, as you say, a complex issue. That complexity and nuance is usually absent from discussions of the topic on the Internet. Thanks for adding it here.

this post was submitted on 24 May 2025
396 points (96.9% liked)
