World News
A community for discussing events around the World
Rules:
-
Rule 1: posts have the following requirements:
- Post news articles only
- Video links are NOT articles and will be removed.
- Title must match the article headline
- Not United States Internal News
- Recent (Past 30 Days)
- Screenshots/links to other social media sites (Twitter/X/Facebook/Youtube/reddit, etc.) are explicitly forbidden, as are link shorteners.
-
Rule 2: Do not copy the entire article into your post. Summarizing the key points in 1-2 paragraphs is allowed (even encouraged!), but large segments of articles posted in the body will result in the post being removed. If you have to stop and think "Is this fair use?", it probably isn't. Archive links, especially the ones created on link submission, are absolutely allowed, but those that avoid paywalls are not.
-
Rule 3: Opinion articles, or articles based on misinformation/propaganda, may be removed. Sources that have a Low or Very Low factual reporting rating or MBFC Credibility Rating may be removed.
-
Rule 4: Posts or comments that are homophobic, transphobic, racist, sexist, anti-religious, or ableist will be removed. “Ironic” prejudice is just prejudiced.
-
Posts and comments must abide by the lemmy.world terms of service UPDATED AS OF 10/19
-
Rule 5: Keep it civil. It's OK to say the subject of an article is behaving like a (pejorative, pejorative). It's NOT OK to say another USER is (pejorative). Strong language is fine, just not directed at other members. Engage in good faith and with respect! Accusing another user of being a bot or paid actor also falls under this rule. Trolling is uncivil and is grounds for removal and/or a community ban.
Similarly, if you see posts along these lines, do not engage. Report them, block them, and live a happier life than they do. We see too many slapfights that boil down to "Mom! He's bugging me!" and "I'm not touching you!" Going forward, slapfights will result in removed comments and temp bans to cool off.
-
Rule 6: Memes, spam, other low-effort posting, reposts, misinformation, advocacy of violence, off-topic content, trolling, offensive content, or content regarding the moderators or meta may be removed at any time.
-
Rule 7: We didn't USE to need a rule about how many posts one could make in a day, then someone posted NINETEEN articles in a single day. Not comments, FULL ARTICLES. If you're posting more than, say, 10 or so, consider going outside and touching grass. We reserve the right to limit over-posting so a single user does not dominate the front page.
We ask that users report any comment or post that violates the rules, and that they use critical thinking when reading, posting, or commenting. Users who post off-topic spam, advocate violence, have multiple comments or posts removed, weaponize reports, or violate the code of conduct will be banned.
All posts and comments will be reviewed on a case-by-case basis. This means that some content that violates the rules may be allowed, while other content that does not violate the rules may be removed. The moderators retain the right to remove any content and ban users.
Lemmy World Partners
News [email protected]
Politics [email protected]
World Politics [email protected]
Recommendations
For Firefox users, there is a Media Bias/Fact Check plugin:
https://addons.mozilla.org/en-US/firefox/addon/media-bias-fact-check/
- Consider including the article’s mediabiasfactcheck.com/ link
So this does bring up an interesting point that I haven't thought about - is it the depiction that matters, or is it the actual potential for victims that matters?
Consider the Catholic schoolgirl trope - if someone of legal age is depicted as being much younger, should that be treated in the same way as this case? This case is arguing that the depiction is what matters, instead of who is actually harmed.
Every country has different rules, going by Wikipedia.
Personally, I feel that if making completely fictitious depictions of child porn, where no one is harmed (think AI-generated, or consenting adults portraying minors), were legal, it might actually prevent the real, harmful ones from being made, thus preventing harm.
At the same time, an argument could be made that increasing the availability of such a thing could put it in front of a person who otherwise wouldn't have seen it in the first place, and problems could develop.
It could normalize something absurd and create more risks.
I’m no expert and I’d rather leave it to people who thoroughly understand such behaviors to determine what is and isn’t ultimately more or less detrimental to the health of society.
I just know how (anecdotally) pornography desensitizes a person until it makes more extreme things less bizarre and unnatural. I can’t help but imagine a teenager who would have otherwise developed a more healthy sexuality stumbling on images like that and becoming desensitized.
It’s definitely something that needs some serious thought.
"I’m no expert and I’d rather leave it to people who thoroughly understand such behaviors to determine what is and isn’t ultimately more or less detrimental to the health of society."
One of the big problems with addressing this problem is that NOBODY thoroughly understands these behaviors. They are so stigmatized that essentially nobody voluntarily admits to having pedophilic urges and scientists can only study those who actually act on them and harm children. They are almost certainly not a representative sample of the entire population of pedophiles, and this severely limits our ability to study the psychology of the population as a whole and what differentiates the rapists among them from the non-rapists.
I think Japan would make a really good case study. Childlike aesthetics and behaviors are strongly sexualized in Japan. They also produce the most simulated CSAM per capita with few laws restricting production. Actual child pornography wasn't made illegal until 1999. They still sell photo books of tweens in swimsuits and stuff in Japan. That, and lolicon, which is basically hentai with kids in it.
There isn't the same stigma against attraction to children, and we see that some 15-20% of the Japanese male population holds some aesthetic preferences that most westerners would consider pedophilic.
I think we'd probably see similar numbers in America if we could cut through the stigma, which some people would panic over, but if anything we should be relieved that, despite such numbers, actual sexual abuse of children is very rare.
I mean, the writing is on the wall already. Nothing in the West is more sexualized than youth; we just like to pretend that 18 is some magical age where you looked completely different the day before your birthday or something, and ignore that puberty comes a lot earlier than that.
What really matters is the social norms surrounding these things. We shouldn't care if a 40 year old man thinks a 15 year old girl is attractive, we should care if he tries to do anything about that attraction, because the latter is a conscious choice that does harm, while the former is a more complex matter of human sexual response.
Most of what you're repeating about porn "normalizing" things and "desensitizing" viewers is straight out of the puritan handbook. There is evidence that men who overconsume porn and don't have a healthy sex life can fall into self-destructive patterns, but porn consumption doesn't work like a drug. It's not the case that the more you consume, the more hardcore the content you desire, or that being exposed to certain types of porn will create new preferences that you wouldn't otherwise have had. This is just long-standing anti-sex-work propaganda that tries to liken pornography to narcotics.
People who consume CSAM are already into that kind of thing. Seeing CSAM isn't going to turn anyone into a pedophile just as playing GTA isn't going to turn anyone into a hardened street criminal. The goal should be to protect children, not to censor any content that sexualizes youth, because that really is a slippery slope. More on that here: https://nypost.com/2010/04/24/a-trial-star-is-porn/
Yeah, valid points, but it's not gonna be easy to tell, in practice. Doing a proper scientific test is likely going to be unethical for obvious reasons, so we're left to wonder if the cons outweigh the pros or not.
Thanks for sharing that link. I hated reading through it, but it answered the question haha...
I don't really have strong feelings about it but I do think I lean towards agreeing with you.
How I see it: creating fake child porn makes it harder for authorities to find the real ones.
That's a good point. On the flip side, I remember there was a big deal about trying to flood the rhino horn market with fakes a few years ago. I can't find anything on how that went, but I wonder if it could have that effect as well.
Also makes it harder for offenders to find the real ones!
In America at least, people often confuse child pornography laws with obscenity laws, and they do end up missing the point. Obscenity laws are a violation of free speech, but that's not what a CSAM ban is about. It's about criminalizing the abuse of children as thoroughly as possible. Being in porn requires consent, and children can't consent, so even the distribution or basic retention of this content violates a child's rights.
Which is why the courts have thrown out lolicon bans on First Amendment grounds every time it's been attempted. Simulated CSAM lacks a child whose rights could be violated, and it generally meets all the definitions of art, which would be protected expression no matter how offensive.
It's a sensitive subject that most people don't see nuance in. It's hard to admit that pedophilia isn't a criminal act by itself, but only when an actual child is made a victim, or a conspiracy to victimize children is uncovered.
With that said, we don't have much of a description of the South Korean man's offenses, and South Korea iirc has similar laws to the US on this matter. It is very possible that he was modifying real pictures of children with infill, or training models on pictures of a real child to generate fake porn of that child. This would introduce a real child as the victim, which is my theory of what this guy was doing. Probably on a public image generator service that flagged his uploads.
The intent is to get off on fucking children; how you make that happen shouldn't matter.
So would that include written stories?
If we decide that nothing else matters but protecting children, then protecting children will be the only thing that matters anymore. That's not a reasonable outcome.