this post was submitted on 25 Apr 2024
Linux
On X11, apps can scan and read whatever they want. That's not a great design either, but it means developers don't really need to implement accessibility: they just have to make all their text scannable.
That is, if it's a screen reader you're talking about.
On Wayland, no app can simply scan everything, so apps have to send the reader the specific text that should be read, like push notifications. And that has to be implemented explicitly.
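To make the difference concrete, here is a purely conceptual sketch (toy classes, not any real screen-reader or Wayland API) contrasting the X11-style "pull" model, where the reader scans every window itself, with the Wayland-style "push" model, where only apps that opt in are heard:

```python
# Hypothetical toy model; Window, ScreenReader, etc. are made up for illustration.

class Window:
    def __init__(self, title, text):
        self.title = title
        self.text = text

# --- X11-style "pull": one global window tree the reader can walk uninvited ---
window_tree = [Window("Editor", "hello"), Window("Mail", "2 new messages")]

def reader_scan(tree):
    """The screen reader pulls text from every window it can see."""
    return [w.text for w in tree]

# --- Wayland-style "push": no global view; apps must send text explicitly ---
class ScreenReader:
    def __init__(self):
        self.queue = []
    def announce(self, text):
        # Called *by the app*, e.g. for a notification; apps that
        # never call this are simply invisible to the reader.
        self.queue.append(text)

reader = ScreenReader()
reader.announce("2 new messages")  # only the Mail app opted in

print(reader_scan(window_tree))  # ['hello', '2 new messages']
print(reader.queue)              # ['2 new messages']
```

In the pull model the Editor's text is read even though its developer did nothing; in the push model it is silently lost.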
So rather than having one single app that deals with screen reading, it's now down to every individual application to make accessibility a priority.
Huge retrograde step.
We can all agree that authors should all value accessibility, but we also all know that they won't.
GUI frameworks should implement this, just as any app built on GTK, Qt, Iced, or possibly other toolkits gets native Wayland support for free.
But yes, I agree this is not a good situation. There should be something like the "accessibility permission" on Android, where an app the user has approved can basically read anything.
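The two ideas in that comment can be sketched together in a toy model (hypothetical names throughout; this is not how any real compositor or toolkit works): the toolkit, not each app, pushes text to listeners, and the compositor only forwards text to a reader the user has granted an "accessibility" permission.

```python
# Conceptual sketch only; Compositor, Reader, and ToolkitLabel are invented.

class Compositor:
    def __init__(self):
        self.listeners = []  # clients holding the accessibility permission

    def grant(self, client):
        """User grants the accessibility permission to this client."""
        self.listeners.append(client)

    def broadcast(self, text):
        # Text only reaches clients that were explicitly granted access.
        for client in self.listeners:
            client.receive(text)

class Reader:
    def __init__(self):
        self.heard = []
    def receive(self, text):
        self.heard.append(text)

class ToolkitLabel:
    """Toolkit widget: the announce call lives in the toolkit, so app
    authors get accessibility 'for free' just by using the toolkit."""
    def __init__(self, compositor, text):
        self.text = text
        compositor.broadcast(text)  # toolkit pushes the text automatically

comp = Compositor()
reader = Reader()
comp.grant(reader)                   # user approves the screen reader
ToolkitLabel(comp, "Save complete")  # app code: just create a widget
print(reader.heard)                  # ['Save complete']
```

The point of the design: app developers write no accessibility code at all, and unprivileged apps still can't eavesdrop, because only granted clients receive the broadcast.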
So because they won't, those who need accessibility will require X.org... forever?
That's one of the huge problems with Wayland. The core protocol is super minimalistic, so it falls to each and every individual app to (re)implement everything: accessibility, clipboard, keyboard, mouse, compositing, and so on.
The fact that this was done in the name of security is a solution looking for a problem. Inter-window communication was never a pressing security issue on Linux.
It's like advising people to wear helmets in everyday life. Sure, in theory it's a great idea and would greatly benefit those who slip and fall, have a flower pot land on their head, or get into a car accident. But in practice it would be a huge inconvenience 99.99% of the time.
The vast majority of Linux apps out there will never get around to (re)implementing all this basic functionality just to address a 0.01% chance of a security issue. Wherever convenience fights security, convenience wins. Wayland will either come around or become a bubble in which 99% of the Linux userland doesn't work.
I haven't read this much nonsense packed into a single sentence in a while. No, apps don't implement any of these things themselves. How the fuck would apps simultaneously "implement compositing themselves" while having access to neither the "framebuffer" (which isn't even the case on Xorg!) nor information about other windows on the screen?
Please, don't rant about things you clearly don't know anything about.