this post was submitted on 12 May 2024
470 points (85.7% liked)

    [–] [email protected] 1 points 5 months ago

Not necessarily. The base machines aren't that expensive, and this chip is also used in iPads. They support high-resolution HDR output. The more monitors you drive, and the higher their resolution, bit depth, and refresh rate, the more bandwidth display output requires, and the more complex and expensive the framebuffers become. Another system might support 3 or 4 monitors but not 5K output like the MacBooks do. I've seen Intel systems that struggled to drive even a single 4K 60 FPS display until I added another RAM stick to make the memory dual-channel. Apple's machines do 5K. Sure, other systems might technically support more monitors in theory, but in practice you'll run into limitations if those monitors need too much bandwidth.
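To put rough numbers on that scaling, here's a back-of-the-envelope sketch in Python. The pixel packing formats are illustrative assumptions (different OSes and compositors store framebuffers differently), but the linear scaling with each factor is the point:

```python
# Rough scanout bandwidth per display: width * height * bytes-per-pixel * refresh rate.
# Scanout is the continuous read of the framebuffer by the display engine each refresh.
def scanout_gbps(width: int, height: int, bytes_per_pixel: int, hz: int) -> float:
    """Bytes the display engine must read per second, in GB/s."""
    return width * height * bytes_per_pixel * hz / 1e9

# 4K SDR at 60 Hz, 8-bit color packed as 4 bytes/pixel (assumed XRGB8888 layout):
print(scanout_gbps(3840, 2160, 4, 60))  # ~2.0 GB/s
# 5K HDR at 60 Hz, 10-bit color packed as 4 bytes/pixel (assumed XRGB2101010 layout):
print(scanout_gbps(5120, 2880, 4, 60))  # ~3.5 GB/s
# Same 5K display if the compositor stores 16-bit float per channel (8 bytes/pixel):
print(scanout_gbps(5120, 2880, 8, 60))  # ~7.1 GB/s
```

So one 5K HDR panel can cost 2-3x the bandwidth of a 4K SDR one before the GPU renders a single frame.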

Oh yeah, and these systems also need to share bandwidth between the framebuffers, CPU, and GPU. It's no wonder they didn't put 3 or more very-high-resolution framebuffers into the lower-end chips, which have less memory bandwidth than the higher-end ones. Even if it did work, the performance impact probably isn't worth it for the small number of users who'd benefit.
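A sketch of why that sharing matters on a unified-memory chip: sum the scanout reads for several high-resolution displays and compare against a total memory-bandwidth budget. The ~70 GB/s figure below is an assumed, illustrative number for a base chip, not a quoted spec, and the three-display setup is hypothetical:

```python
# Every framebuffer lives in the same unified memory the CPU and GPU use,
# so display scanout reads come straight off the shared bandwidth budget.
TOTAL_BANDWIDTH_GBPS = 70.0  # assumed illustrative figure for a base chip

def scanout_gbps(width: int, height: int, bytes_per_pixel: int, hz: int) -> float:
    return width * height * bytes_per_pixel * hz / 1e9

# Hypothetical setup: three 5K displays at 60 Hz, 4 bytes/pixel.
displays = [(5120, 2880, 4, 60)] * 3
used = sum(scanout_gbps(*d) for d in displays)
print(f"{used:.1f} GB/s of {TOTAL_BANDWIDTH_GBPS:.0f} GB/s "
      f"({100 * used / TOTAL_BANDWIDTH_GBPS:.0f}%) consumed by scanout alone")
# -> ~10.6 GB/s, roughly 15% of the budget gone before the CPU or GPU do anything,
#    and compositor writes to those framebuffers cost bandwidth on top of that.
```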