this post was submitted on 08 Jan 2025

Politics

top 7 comments
[–] [email protected] 2 points 4 hours ago (1 children)

There's no such thing as a brainwashing machine. Brainwashing is a very specific process that doesn't last long outside of the torture or specific scenario that creates it.

What they mean is a conditioning machine.

[–] [email protected] 3 points 3 hours ago* (last edited 3 hours ago) (1 children)

You're right, but I think they're using "brainwashing" in a colloquial sense. There's a perception that misinformation on the internet is persuading people into more extreme views. What the author of this article argues is happening more often is that online misinformation lets people easily justify beliefs they have already formed, and quickly dispel the cognitive dissonance that comes from encountering information that contradicts those beliefs. People have always done this, but it's become so easy on the modern internet that more and more people are embracing fringe worldviews who might previously have been unable to cognitively sustain those views.

It's a small difference in the way we think about misinformation online, but I think it's important that we understand what is likely happening. It's not so much that misinformation is changing people's beliefs as that it's letting people hang onto beliefs that contradict reality more easily.

[–] [email protected] 2 points 3 hours ago (1 children)

Excellent reply.

Honestly, I think it's both. But you may be correct that, sadly, it's mostly about being able to hang on to existing beliefs.

[–] [email protected] 2 points 3 hours ago

I actually think it's probably both too, to a degree; that's just not what the author of the article is arguing. There's probably a certain amount of persuasion pulling people deeper into a belief system they're only partially invested in at first, and then they're sucked into ecosystems that reinforce those beliefs and pull them further in. I don't have anything but vibes and a lot of half-remembered reading about online radicalization, though.

[–] [email protected] 11 points 1 day ago (2 children)

Confronted with information that could shake their worldviews, people can now search for confirming evidence and mainline conspiracist feeds or decontextualized videos. They can ask AI and their favorite influencers to tell them why they are right. They can build tailored feeds and watch as algorithms deliver what they’re looking for. And they will be overwhelmed with data.

This is, precisely, what morons do. How do you fix stupid?

[–] [email protected] 3 points 3 hours ago

Alter the incentive structures of these systems to make it unappealing to peddle sensationalism. Change how content-promotion algorithms work so they don't reward rage bait. Don't let bad behavior be rewarded with attention, especially not when attention is money.

[–] [email protected] 4 points 20 hours ago

Re-wash their brain. Get it clean.