I found people online saying it's because it's 24 frames (the standard film frame rate) higher than 120, meaning it can be used to watch movies with integer scaling (a clean 6:1 ratio of refresh rate to frame rate, rather than something strange like 5.5:1). Take that with a massive grain of salt though, because lots of people say there are other reasons.
If consuming media with integer scaling is the main concern, then 120Hz would actually be better than 144Hz: it divides by 5 to give 24Hz (for movies) and by 4 or 2 to give 30/60Hz (for TV shows).
144Hz only divides cleanly into 24Hz (by dividing by 6). To get 60Hz you'd need to divide by 2.4, which isn't an integer.
And with either refresh rate, 25/50Hz PAL content still doesn't fit a nice round integer ratio (120/25 = 4.8, 144/25 = 5.76).
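If anyone wants to sanity-check the divisibility claims above, here's a quick Python sketch (the list of content frame rates is just my pick of the common film/NTSC/PAL ones):

```python
# Check which common content frame rates divide evenly
# into a 120Hz vs 144Hz display refresh rate.
content_fps = [24, 25, 30, 50, 60]  # film, PAL film/TV, NTSC TV, PAL TV, NTSC

for refresh in (120, 144):
    for fps in content_fps:
        ratio = refresh / fps
        verdict = "integer" if ratio == int(ratio) else "NOT integer"
        print(f"{refresh}Hz / {fps}fps = {ratio:.2f} ({verdict})")
```

Running it shows 120Hz lands on an integer for 24/30/60fps while 144Hz only does for 24fps, and neither handles 25/50fps PAL cleanly.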
Yeah, as I said, take what I said with a massive grain of salt; some people are saying it's because of an HDMI bandwidth limit, so it could be that.
Oh man, the maths didn't click with me at first. Of course, it's just another 24 frames on top of 120.
Me neither, to be honest; 24 is kind of a weird number when you're adding it up.