this post was submitted on 02 Sep 2024
425 points (96.7% liked)
Technology
I'm genuinely surprised the UK hasn't already
Under what law?
The UK currently holds the people who post things liable for their own words. X, the platform, just relays what is said. Same as Lemmy. Same as Mastodon.
If you ban X I don't see why those other platforms wouldn't be next.
Now, should people/organisations/companies leave X? Absolutely! Evacuate like the house is on fire. Should it be shut down by legal means? No.
An argument being made in another social media case (involving TikTok) is that algorithmic feeds of other users' content are effectively new content, created by the platform. So if Twitter does anything other than a chronological sorting, it could be considered to be making its own, deliberately-produced content, since they're now in control of what you see and when you see it. Depending on how the TikTok argument gets interpreted in the courts, it could possibly affect how Twitter can operate in the future.
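The distinction being drawn here can be made concrete. Below is a minimal, hypothetical sketch (the `Post` fields, function names, and the `like_weight` parameter are all invented for illustration, not any platform's actual code) contrasting a pure chronological relay with a platform-chosen ranking that blends recency and engagement:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: int  # seconds since epoch
    likes: int

def chronological_feed(posts):
    # Pure relay: newest first, no platform judgment involved.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def ranked_feed(posts, like_weight=0.5):
    # Platform-chosen ordering: blends engagement with recency.
    # The weight itself is an editorial decision made by the platform.
    newest = max(p.timestamp for p in posts)
    def score(p):
        age_hours = (newest - p.timestamp) / 3600
        return like_weight * p.likes - (1 - like_weight) * age_hours
    return sorted(posts, key=score, reverse=True)
```

In the first function the platform exercises no judgment; in the second, the choice of `like_weight` determines which posts surface first, which is exactly the kind of control the TikTok argument treats as content creation by the platform.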
It's certainly arguable that the algorithm constitutes an editorial process and so that opens them up to libel laws and to liability.
Fair point.
That argument is being made in the USA, not the UK.
https://www.motherjones.com/politics/2024/08/federal-court-tiktok-230-liable-blackout-challenge-nylah-anderson-death/
Let’s say this goes through: how is a company going to prove it is not using an “algorithmic feed” unless it open-sources its code and/or provides some public interface to test and validate feed content?
Plus, even without an “algorithmic feed”, couldn’t some third party using bots control a simple chronological or upvote/like-based feed? Those third parties, via contracts and agreements, would then be manipulating the content rather than the social media owner itself.
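The bot concern above can be illustrated with an equally minimal sketch (the field and function names are hypothetical): even a fixed, "neutral" most-liked-first rule is steerable by whoever controls enough accounts, without the platform's ranking code changing at all.

```python
def upvote_feed(posts):
    # The platform applies a fixed, neutral rule: most-liked first.
    return sorted(posts, key=lambda p: p["likes"], reverse=True)

def bot_campaign(posts, target_text, n_bots):
    # A third party controls the outcome without touching the rule:
    # coordinated accounts simply upvote the chosen post.
    for p in posts:
        if p["text"] == target_text:
            p["likes"] += n_bots
```

A post with little organic support jumps to the top of the "neutral" feed once enough coordinated accounts like it, which is why transparency about the ranking rule alone may not be enough.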
This honestly seems like a good idea. I think one of the ways to mitigate the harm of algorithmically driven content feeds is openness and transparency.
Well for the end users and any regulators it’s a great idea. But the companies aren’t going to go along with this.
Then they must be held liable for what they allow to spread on their platforms.
Twitter (or rather Musk) chooses what it "relays" or boosts. Unlike Lemmy, unlike Mastodon.
The Australian Government issued a bunch of takedown notices to Twitter, and Musk said no.
https://www.abc.net.au/news/2024-04-23/what-can-the-government-do-about-x/103752600
Musk decided to block the posts in Australia only, which didn't satisfy the Australian Government.
The regulator took X to court, but the court sided with Twitter (X).
https://variety.com/2024/digital/news/australian-court-elon-musk-x-freedom-of-speech-row-1236000561/
Here's the thing about nation state governments. They can pass laws. It's kind of the main thing they do.
They retain authority by having some air of legitimacy. They can't just change laws at will; there has to be due process. Changing laws without a process is literally a dictatorship.
I agree. It would set a terrible precedent, even if it’s terribly tempting. I’d say it’s better to ask people to leave instead.
I'm sure they would like to but they don't really have the authority.