Stallman Was Right

1347 readers
13 users here now

Nobody listens to him. But he was right all along.

founded 2 years ago

cross-posted from: https://programming.dev/post/13465911

Hi,

I'm confused about the mandatory legal notices that governments impose on websites.

Before going further, I invite you to read:
A Declaration of the Independence of Cyberspace
and
Discourse on Voluntary Servitude[^1]

Of all the articles[^2] that I read about the mandatory notices to display on a website, none of them reference the URL of their claim / of the legal text!! WTF[^links]

The internet is by essence worldwide, and when reading all those legal requirements it seems that you should display notices for EVERY country!

It also seems that if you own a private website, just for your own or family use, for example a web file hosting service (NextCloud etc.), you should comply with the same requirements that are asked of companies! Again... WTF!

Also, I don't understand why those notices are made mandatory... (besides the scam (money), which I'll come back to below).

  • If you want to buy something off a website, and the latter does not mention any legal address, contact info and so on, the responsibility to buy or not should be yours alone. (For example, would you buy a yogurt at the supermarket if there were no brand or contact info on the packaging or receipt?)
  • If the state wants to ~~censor~~ "regulate" a website on the old internet[^OI], there are plenty of ways to find out who the author is, or at the very least where it is hosted.
  • If a website uses/distributes copyrighted© elements, the rights holder can check/contact, in the following order (see the sketch below):
    • check the website for contact info (if any)
    • check the DNS records
    • check the hosting provider
    • contact the owner of the IP (IP ranges are leased by companies / ISPs)
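
Purely as an illustration (not part of the original post), here is a minimal Python sketch of the DNS and IP-owner lookup steps, assuming the dnspython package and the system whois tool are installed:

```python
# Rough sketch of the lookup chain above (hypothetical example).
# Assumes `pip install dnspython` and a system `whois` binary.
import subprocess
import dns.resolver

domain = "example.org"  # placeholder domain

# DNS records: which IP addresses answer for the domain?
ips = [r.to_text() for r in dns.resolver.resolve(domain, "A")]
print("A records:", ips)

# Hosting / IP owner: a WHOIS query on each IP usually names the
# hosting company or ISP that leases the address block.
for ip in ips:
    result = subprocess.run(["whois", ip], capture_output=True, text=True)
    print(result.stdout[:400])  # first part of the WHOIS record
```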

So it makes no sense to put that extra heavy burden on everyone. The only ones who benefit are law firms (and those cookie-consent firms) that make a profit out of it. I have heard of people around me who paid thousands of $$ to have those texts generated, kept up to date, etc., even for small websites.

  • If you think those legal notices are a good thing, please do not hesitate to explain why.
  • If you have any good links about it, feel free to share.
  • What do you do yourself on customers' websites and/or on your private websites?
  • If you know a Lemmy community where this post would be worth sharing, step forward.

Thanks...

[^1]: https://en.wikipedia.org/wiki/Discourse_on_Voluntary_Servitude (archived copy: https://archive.org/details/0000-00-00-00-etienne-de-la-boetie-00_202201/1548-00-00_Discourse%20on%20Voluntary%20Servitude_1942_org/mode/2up)

[^2]: https://www.websitepolicies.com/blog/legal-requirements-for-websites

[^OI]: The one that you are using now, with the domain scam. A future internet might use Tor or the GNU Name System.

[^links]: If you have those links, feel free to share!


cross-posted from: https://mbin.grits.dev/m/[email protected]/t/95555

Edit: Guys, I didn't write the headline; the subtitle is the part I added, and I've fixed it now.

Edit: Also, the information about there being no escape is out of date -- here's a quick guide to how to fix the problem in the modern day


Didn't the GNU project start because of a printer?


Users notice that Zoom has changed its privacy policy to expressly allow them to use your private Zoom chats, video calls and other services to train A.I., and there IS NO OPT OUT.


cross-posted from: https://lemmy.ml/post/2437896

also on r/privacy


also on r/programming


cross-posted from: https://lemmy.world/post/1381019

The article about the "subscription" HP ink made me realise something.

Subscriptions aren't a new idea at all. You could subscribe to paper magazines. And you got to keep them.

I'm just clearing up my old house and it's filled with tons of old tech magazines. Lots of useful knowledge here. Wanna know how Windows and Mac compared in 1993? It's in here. All the forgotten technologies? Old games, old phones, whatever? You'll find it.

Now, granted. You'd only get one magazine a month. Not a whole library of movies or games or comic books.

But still, the very definition of subscription has shifted. Now, the common meaning is "you only get to use these things as long as you're paying". Nobody even thinks it could mean anything else.

Besides, it doesn't only apply to services that offer entire libraries. Online magazines still exist in a form similar to the paper ones, but you only get to access them while your "subscription" is active. Even the stuff you had while you were paying.

BTW I'm not throwing my old magazines away. I won't have the space, but a friend is taking them all. If they weren't, I'd give them to a library or let someone take them. The online and streaming stuff of today and tomorrow? In 30 years it'll be gone, forgotten and inaccessible.


also from r/StallmanWasRight


cross-posted from: https://lemmy.ml/post/1874795

cross-posted from: https://lemmy.ml/post/1874605

A 17-year-old from Nebraska and her mother are facing criminal charges including performing an illegal abortion and concealing a dead body after police obtained the pair’s private chat history from Facebook, court documents published by Motherboard show.


For the past 20 years UK Post Office employees have been dealing with a piece of software called Horizon, which had a fatal flaw: bugs that made it look like employees stole tens of thousands of British pounds. This led to some local postmasters being convicted of crimes, even being sent to prison, because the Post Office doggedly insisted the software could be trusted. After fighting for decades, 39 people are finally having their convictions overturned, after what is reportedly the largest miscarriage of justice that the UK has ever seen.


Louis Rossmann bringing us a good summary again


cross-posted from: https://lemmy.ml/post/1438631

yay we are all terrorists
direct Odysee link


I would like to build a bust of Stallman from concrete. The plan is to 3D print him and then make a mold. Then I would bolt that bust in place in front of shitty software companies with some of his quotes. So how do I get a 3D model of him to start?


cross-posted from: https://lemmy.ml/post/1366662

Richard Stallman was right since the very beginning. Every warning, every prophecy realised. And, worst of all, he had the solution since the start. The problem is not Richard Stallman or the Free Software Foundation. The problem is us. The problem is that we didn’t listen.


The algorithm used for the cash relief program is broken, a Human Rights Watch report found.

A program spearheaded by the World Bank that uses algorithmic decision-making to means-test poverty relief money is failing the very people it’s intended to protect, according to a new report by Human Rights Watch. The anti-poverty program in question, known as the Unified Cash Transfer Program, was put in place by the Jordanian government.

Having software systems make important choices is often billed as a means of making those choices more rational, fair, and effective. In the case of the poverty relief program, however, the Human Rights Watch investigation found the algorithm relies on stereotypes and faulty assumptions about poverty.

“Its formula also flattens the economic complexity of people’s lives into a crude ranking.”

“The problem is not merely that the algorithm relies on inaccurate and unreliable data about people’s finances,” the report found. “Its formula also flattens the economic complexity of people’s lives into a crude ranking that pits one household against another, fueling social tension and perceptions of unfairness.”

The program, known in Jordan as Takaful, is meant to solve a real problem: The World Bank provided the Jordanian state with a multibillion-dollar poverty relief loan, but it’s impossible for the loan to cover all of Jordan’s needs.

Without enough cash to cut every needy Jordanian a check, Takaful works by analyzing the household income and expenses of every applicant, along with nearly 60 socioeconomic factors like electricity use, car ownership, business licenses, employment history, illness, and gender. These responses are then ranked — using a secret algorithm — to automatically determine who are the poorest and most deserving of relief. The idea is that such a sorting algorithm would direct cash to the most vulnerable Jordanians who are in most dire need of it. According to Human Rights Watch, the algorithm is broken.
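
The exact indicators, weights, and formula are secret, so the following is purely a hypothetical sketch of how this kind of weighted ranking can go wrong; every name and number in it is invented, not Takaful's:

```python
# Purely illustrative sketch of algorithmic means-testing; Takaful's real
# indicators, weights, and formula are secret, and every value here is invented.
from dataclasses import dataclass

@dataclass
class Household:
    name: str
    monthly_income: float    # self-reported monthly income
    electricity_kwh: float   # monthly electricity use
    owns_car: bool

def welfare_score(h: Household) -> float:
    """Higher score = ranked 'less poor' = less likely to receive aid."""
    score = 0.5 * h.monthly_income
    score += 0.2 * h.electricity_kwh   # penalizes high utility use...
    if h.owns_car:
        score += 100.0                 # ...and car ownership, regardless of why
    return score

households = [
    Household("A", monthly_income=150, electricity_kwh=400, owns_car=True),
    Household("B", monthly_income=300, electricity_kwh=120, owns_car=False),
]

# Rank from "poorest" to "least poor"; a limited budget pays only the top of the list.
for h in sorted(households, key=welfare_score):
    print(h.name, round(welfare_score(h), 1))
```

In this toy ranking, household A ends up classified as "less poor" than household B despite having half the income, purely because of its car and its higher electricity bill: exactly the kind of distortion the report describes.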

The rights group’s investigation found that car ownership seems to be a disqualifying factor for many Takaful applicants, even if they are too poor to buy gas to drive the car.

Similarly, applicants are penalized for using electricity and water based on the presumption that their ability to afford utility payments is evidence that they are not as destitute as those who can’t. The Human Rights Watch report, however, explains that sometimes electricity usage is high precisely for poverty-related reasons. “For example, a 2020 study of housing sustainability in Amman found that almost 75 percent of low-to-middle income households surveyed lived in apartments with poor thermal insulation, making them more expensive to heat.”

In other cases, one Jordanian household may be using more electricity than their neighbors because they are stuck with old, energy-inefficient home appliances.

Beyond the technical problems with Takaful itself are the knock-on effects of digital means-testing. The report notes that many people in dire need of relief money lack the internet access to even apply for it, requiring them to find, or pay for, a ride to an internet café, where they are subject to further fees and charges to get online.

“Who needs money?” asked one 29-year-old Jordanian Takaful recipient who spoke to Human Rights Watch. “The people who really don’t know how [to apply] or don’t have internet or computer access.”

Human Rights Watch also faulted Takaful’s insistence that applicants’ self-reported income match up exactly with their self-reported household expenses, which “fails to recognize how people struggle to make ends meet, or their reliance on credit, support from family, and other ad hoc measures to bridge the gap.”
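
As a hypothetical illustration of that rigidity (the actual application logic is not public), a strict balance check like the following rejects any honest answer in which income and expenses don't match exactly:

```python
# Hypothetical illustration of the rigid balance check the report criticizes;
# the real application logic is not public.
def application_processed(reported_income: float, reported_expenses: float) -> bool:
    # Proceeds only if declared income and expenses match exactly, ignoring
    # credit, family support, and other ad hoc ways people bridge the gap.
    return reported_income == reported_expenses

print(application_processed(220.0, 310.0))  # False: an honest mismatch is rejected
print(application_processed(310.0, 310.0))  # True: only "fudged" numbers get through
```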

The report found that the rigidity of this step forced people to simply fudge the numbers so that their applications would even be processed, undermining the algorithm’s illusion of objectivity. “Forcing people to mold their hardships to fit the algorithm’s calculus of need,” the report said, “undermines Takaful’s targeting accuracy, and claims by the government and the World Bank that this is the most effective way to maximize limited resources.”

The report, based on 70 interviews with Takaful applicants, Jordanian government workers, and World Bank personnel, emphasizes that the system is part of a broader trend by the World Bank to popularize algorithmically means-tested social benefits over universal programs throughout the developing economies in the so-called Global South.

Compounding the dysfunction of an algorithmic program like Takaful is the increasingly held naïve assumption that automated decision-making software is so sophisticated that its results are less likely to be faulty. Just as dazzled ChatGPT users often accept nonsense outputs from the chatbot because the concept of a convincing chatbot is so inherently impressive, artificial intelligence ethicists warn the veneer of automated intelligence surrounding automated welfare distribution leads to a similar myopia.

The Jordanian government’s official statement to Human Rights Watch defending Takaful’s underlying technology provides a perfect example: “The methodology categorizes poor households to 10 layers, starting from the poorest to the least poor, then each layer includes 100 sub-layers, using statistical analysis. Thus, resulting in 1,000 readings that differentiate amongst households’ unique welfare status and needs.”

“These are technical words that don’t make any sense together.”

When Human Rights Watch asked the Distributed AI Research Institute to review these remarks, Alex Hanna, the group’s director of research, concluded, “These are technical words that don’t make any sense together.” DAIR senior researcher Nyalleng Moorosi added, “I think they are using this language as technical obfuscation.”

As is the case with virtually all automated decision-making systems, while the people who designed Takaful insist on its fairness and functionality, they refuse to let anyone look under the hood. Though it’s known Takaful uses 57 different criteria to rank poorness, the report notes that the Jordanian National Aid Fund, which administers the system, “declined to disclose the full list of indicators and the specific weights assigned, saying that these were for internal purposes only and ‘constantly changing.’”

While fantastical visions of “Terminator”-like artificial intelligences have come to dominate public fears around automated decision-making, other technologists argue civil society ought to focus on real, current harms caused by systems like Takaful, not nightmare scenarios drawn from science fiction.

So long as the functionality of Takaful and its ilk remain government and corporate secrets, the extent of those risks will remain unknown.
