Ask Lemmy
A Fediverse community for open-ended, thought provoking questions
Rules:
1) Be nice and have fun
Doxxing, trolling, sealioning, racism, and toxicity are not welcome in AskLemmy. Remember what your mother said: if you can't say something nice, don't say anything at all. In addition, the site-wide Lemmy.world terms of service also apply here. Please familiarize yourself with them.
2) All posts must end with a '?'
This is sort of like Jeopardy. Please phrase all post titles in the form of a proper question ending with ?
3) No spam
Please do not flood the community with nonsense. Actual suspected spammers will be banned on sight. No astroturfing.
4) NSFW is okay, within reason
Just remember to tag posts with either a content warning or a [NSFW] tag. Overtly sexual posts are not allowed, please direct them to either !asklemmyafterdark@lemmy.world or !asklemmynsfw@lemmynsfw.com.
NSFW comments should be restricted to posts tagged [NSFW].
5) This is not a support community.
It is not a place for 'how do I?' type questions.
If you have any questions regarding the site itself or would like to report a community, please direct them to Lemmy.world Support or email info@lemmy.world. For other questions check our partnered communities list, or use the search function.
6) No US Politics.
Please don't post about current US Politics. If you need to do this, try !politicaldiscussion@lemmy.world or !askusa@discuss.online
Reminder: The terms of service apply here too.
Logo design credit goes to: tubbadu
By not allowing parents to outsource the responsibilities of being a parent.
I'll reply to this random comment with this statement: there's no winning move as a parent.
The problem is being locked out. If your kid is the only one not on social media and all the other kids are, your kid will be socially left out.
All the kids are on a chat platform you don't support. What do you do? Disallow it and give them a social handicap that might scar them, or allow it and take the risk?
The same goes for allowing images on other platforms. Since GDPR, schools seem to care. Yet if it's a recording that will be put on social media, you can explain to your 4-year-old why they weren't allowed to participate... It sucks.
I don't know what the right way forward is. I don't think this is it. Something is needed though. We should at least signal what we find acceptable as a society. Bog-stupid rules which are trivial to circumvent might be good enough, or perhaps some ad campaigns like we did with smoking (hehe, if it's for something we support, then ads are good?).
Regardless, the current situation clearly doesn't work. It would be great if we could find and promote the least invasive solutions.
I feel that communicating your concerns with other parents and with the school can help. I feel it can make sense to have some forms of socialization when they are in middle school or high school, but even then you'd want a pretty locked-down system, imo.
I feel that not every parent is going to let their kids use technology to talk to their friends, especially not all the time. That's not how I grew up and I was fine, developmentally speaking. As a parent you can also seek out other parents locally who live by a similar philosophy, for your kids to have as friends.
You'd be surprised by what parents let their kids do. My little anecdotal sample consists mostly of highly educated people, but most of them don't place any restrictions on their kids' screen time. They claim they've talked to their kids, who have assured them they don't look at anything they're not supposed to, but that's just not what happens in reality.
What really happens is that the kids with no restrictions engage with all the predatory bullshit on these platforms, nonstop. I can see this with my own eyes when my kid brings their friends over.
Communication is key, but unfortunately the business model of these platforms is based on addiction, children are not equipped to deal with it, and parental controls are an essential component.
I believe the parent post nicely sketches out what a "best" move is. I haven't seen a better approach myself. At the same time I see what you see: the best approach isn't all that great. If you're lucky and find the right people it could work. There's a lot of luck involved.
That's why I do think there should be some regulation indicating what is tolerated. It seems to me the parent poster may agree (and thus also with your take).
Since GDPR, you can tell the school you don't want pictures on platforms you disagree with. You may miss out on seeing the photos, and you might come across as crazy, but you can (and you should). We were given a choice, at the cost of extra paperwork and some limitations.
Even without the addiction problem of these platforms, we should nurture and seek out a good social environment around us. It's a valid take to try and find like-minded people.
I don't think that's the end of it, though. Given the state we're in, the network effect, and the fragile egos of developing kids, I suppose we need a stronger push.
AI-enforced age verification, or logins that allow you to be followed everywhere, is not the solution in my current opinion; that's a different problem. The real problems are the addictive and steering nature of the platforms, which seems hard to describe in clear legal terms.
I wonder how "these platforms" should be defined and what minimum set of limitations would give us and the children the necessary breathing space.
Wholeheartedly agree that the problem is the addictive and predatory nature of these platforms. I don't see how that would change under the perpetual-growth economy we all live under.
The minimum would be transparency for the algorithm. If users could see exactly what a social media algorithm is doing with their content feed, they would always have a way to identify and escape dark patterns of addiction.
But even this minimum would require powers to compel tech companies to give up what they would describe as intellectual property, which would probably require a digital bill of rights?
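To make "transparency" a bit more concrete, here's a purely hypothetical sketch: the `RankedItem` structure, the signal names, and the weights below are invented for illustration and don't come from any real platform. The idea is that if every feed item carried a breakdown of the signals that boosted it, a user or a parent could see at a glance when something like an outrage-engagement signal is what put a post in front of a kid.

```python
from dataclasses import dataclass

# Hypothetical illustration only: signal names and weights are made up to show
# what a per-item "why was this recommended?" breakdown could look like.

@dataclass
class RankedItem:
    item_id: str
    signals: dict[str, float]  # signal name -> contribution to the final score

    @property
    def score(self) -> float:
        return sum(self.signals.values())

    def explain(self) -> str:
        """Return a human-readable breakdown of why this item was ranked."""
        lines = [f"item {self.item_id}: score {self.score:+.2f}"]
        # List the strongest contributions first, so the dominant signal is obvious.
        for name, value in sorted(self.signals.items(), key=lambda kv: -abs(kv[1])):
            lines.append(f"  {name:<24} {value:+.2f}")
        return "\n".join(lines)

feed = [
    RankedItem("cute-cat-video", {"topic_match": 0.4, "friend_shared": 0.3}),
    RankedItem("rage-bait-post", {"predicted_watch_time": 0.9, "outrage_engagement": 0.7}),
]

# Print the feed in ranked order, with each item's explanation attached.
for item in sorted(feed, key=lambda i: i.score, reverse=True):
    print(item.explain())
```

Even a breakdown like that is only useful if platforms can be compelled to publish it honestly, which loops back to the digital-bill-of-rights point.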
The most practical option would be to just ask your kids directly about the kinds of content they've been consuming and why. Dinner-table conversations can probably reveal those dark patterns just as well.
Well said