What the heck...? My CPU is none of their business.
Google chooses codecs based on what it guesses your hardware will decode (iPhones get HEVC, Android gets VP9, etc.). They just didn't put much thought into ARM-based home devices outside of a specific few like the Shield.
Why wouldn't it be my browser asking for the codecs it prefers, instead of the website trying to guess my computer's hardware?
Lots of hardware lies about its useful capabilities.
Can you run 4K? Of course. But can you run more than 4 frames a second?
The browser can lie all it wants; at the end of the day the user has the final word if they want to change things.
My by now rather ancient RK3399 board can hardware-decode both at 4K 60Hz. Which has nothing to do with the fact that it's aarch64, but with the fact that Rockchip included a beast of a VPU (it was originally designed for set-top boxes).
How about, dunno, asking the browser what kind of media it would prefer?
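There's actually an API for exactly that: `navigator.mediaCapabilities.decodingInfo()` lets a site ask the browser not just whether it supports a codec, but whether playback is likely to be smooth and power-efficient. A minimal sketch in TypeScript (the VP9 codec string and bitrate are illustrative values, not whatever YouTube actually queries):

```typescript
// Ask the browser itself what it can decode, instead of
// guessing from the User-Agent string.
async function canPlay4kVp9(): Promise<boolean> {
  const info = await navigator.mediaCapabilities.decodingInfo({
    type: "media-source",
    video: {
      contentType: 'video/webm; codecs="vp09.00.50.08"', // VP9, profile 0, level 5.0, 8-bit
      width: 3840,
      height: 2160,
      bitrate: 20_000_000, // illustrative 4K bitrate in bits/s
      framerate: 60,
    },
  });
  // "supported" alone isn't enough: a software decoder may manage
  // only a few frames per second. "smooth" covers that case.
  return info.supported && info.smooth;
}
```

That also answers the "can you run more than 4 frames a second" problem above: `smooth` and `powerEfficient` are reported per configuration, so the site never has to guess from the CPU architecture.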
If you use any Google service, everything of yours is their business. You are their product, voluntarily.
This prolly wasn't a bad decision early on... why push something to a population that can't utilize it... but shit changes fast, Google.
It seems somewhat damning that Google’s own browser had a workaround for this, though
Was it ignorance or malicious intent?
If it was a person, I would try to assume ignorance... I'm not sure Google the company deserves such respect.
Or it's a company so fuckoff huge that one department (Chrome on Android) couldn't get a bug report escalated in another department (YouTube). Eventually they just put in a UA workaround while the bug rots in a backlog somewhere. Common enterprise bullshit.
Or the Chrome on Android team didn't even bother reporting the issue to YouTube and just threw in a cheap workaround. Also common enterprise bullshit.
Bingo. When I was a Chrome developer working on video stuff, we mostly treated YouTube like a separate company. Getting our stuff to work with theirs was a priority, but no more than, say, Netflix. We pretty much treated them as a black box that consumed the same API we provided for everyone.
The weirder thing is Firefox on ARM being detected as a HiSense TV. I did a cursory search to see if HiSense ever used Firefox OS on the TV and it doesn't seem like it. Panasonic seemed to be the only manufacturer using it.
Could be that the developers for the HiSense TV just copy-pasted whatever UA into their browser codebase and called it a day.
YouTube has had a lot of totally-not-anticompetitive "bugs" these past couple of weeks.
UA sniffing again? Whatever happened to feature detection and whatnot?
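For what it's worth, basic feature detection for codecs has existed for ages; `MediaSource.isTypeSupported()` is the usual synchronous check (the codec strings below are illustrative):

```typescript
// Plain feature detection: does this browser claim support for the codec?
// No User-Agent guessing involved.
const vp9Ok = MediaSource.isTypeSupported('video/webm; codecs="vp9"');
const hevcOk = MediaSource.isTypeSupported('video/mp4; codecs="hvc1.1.6.L93.B0"');
console.log({ vp9Ok, hevcOk });
```

The limitation, and presumably the excuse for heuristics, is that it only says whether decoding is possible at all, not whether it will be performant; that's the gap `decodingInfo()` was meant to close.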
Does this include Apple Silicon Macs? That would be a bold move.
This issue was detected when running Firefox on Linux on Apple silicon. Firefox on Mac just identifies as x64.
It's probably not on purpose by YouTube. It's stupid that they gate resolution on heuristics to begin with, but maybe it's because otherwise people would think YouTube isn't loading properly, when it's really the software decoding on an underpowered ARM PC that can't handle the resolution.
Nope, my work Mac has 1080p/4K playback no problem.
Repeat after me, kids. It's not an "oversight", or "mistake", or "bug", or "misunderstanding"...
IF
IT
KEEPS
HAPPENING
Seems like my Samsung TV app is being hit by stuff too. I had 5 unskippable ads and can't seem to get stable 1080p at 60fps any more, despite gigabit fibre and Cat6. Meanwhile I'm getting 4K on the YouTube app on Android over WiFi.
Go figure.
YouTube is so desperate to fight this war that they're harming legitimate watchers; meanwhile my Rock Pi running Android TV seems to keep running sTube just fine.
The NICs on TVs tend to be awful. I can barely break 100Mbps on my LG, wired or wireless.
100Mbps should be enough for a few 4K streams, and I imagine you're not streaming more than one thing to your TV at any given time.
4K yes, but 4K HDR is where it becomes limiting... from what I've read.
Perhaps, and I’ll readily admit my ignorance on this.
That said, I doubt HDR adds much overhead over the equivalent baseline SDR content.
If my intuition is right, depending on other factors like compression you could still fit at least 2 streams on that bandwidth.
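Rough numbers, as an assumption-laden sanity check: YouTube's 4K VP9 streams are commonly cited in the 15-25 Mbps range, and HDR usually just means 10-bit video, which adds on the order of tens of percent rather than multiples. Even taking 25 Mbps plus a generous 30% HDR overhead gives about 33 Mbps per stream, so two simultaneous streams land around 66 Mbps, still under a real 100 Mbps link. The catch is that TVs often don't sustain their nominal link rate, so the practical headroom can be much smaller.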
Enshittification intensifies!
Does this apply to Windows on ARM as well, or is it just Linux specifically for some reason?
It's down to the processor, not the OS.
That's what I figured, but every article I've seen on this calls out Linux specifically. I'll have to give it a try on my Surface Pro X when I get home.
Sounds like a Raspberry Pi thing.