this post was submitted on 07 Feb 2024

Technology

[–] [email protected] 21 points 9 months ago* (last edited 9 months ago) (2 children)

Analog cameras don't have the dynamic range of human vision, fall quite short in the gamut area, use various grain sizes, and can take vastly different photos depending on aperture shape (bokeh), F stop, shutter speed, particular lens, focal plane alignment, and so on.

More basically, human eyes can change focus and aperture when looking at different parts of a scene, which photos don't allow.

To take a "real photo", one would have to capture an HDR light field, then present it in a way an eye could focus on and adjust to any point of it. There used to be a light field digital camera, but its resolution was horrible, and it had no HDR.

https://en.m.wikipedia.org/wiki/Light_field_camera
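To illustrate the light field idea: once you have captured many viewpoints of the same scene, focus becomes a software choice rather than an optical one. Below is a minimal, hypothetical shift-and-add refocus sketch (a common textbook approach, not any specific camera's pipeline); the array shapes and the `refocus` helper are illustrative assumptions.

```python
import numpy as np

def refocus(light_field: np.ndarray, shift_per_view: float) -> np.ndarray:
    """Toy synthetic refocus by shift-and-add.

    light_field: (U, V, H, W) array of sub-aperture views (one per viewpoint).
    shift_per_view: pixel shift per unit of angular offset; choosing a
    different value brings a different depth plane into focus.
    """
    U, V, H, W = light_field.shape
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            du = int(round((u - U // 2) * shift_per_view))
            dv = int(round((v - V // 2) * shift_per_view))
            # np.roll stands in for a proper sub-pixel shift
            out += np.roll(light_field[u, v], (du, dv), axis=(0, 1))
    return out / (U * V)

# 3x3 grid of 8x8 synthetic views (random stand-in data)
lf = np.random.rand(3, 3, 8, 8)
img = refocus(lf, shift_per_view=1.0)
print(img.shape)  # (8, 8)
```

Pairing something like this with eye tracking (pick `shift_per_view` from wherever the eye is looking) is essentially the idea discussed further down the thread.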

Everything else is subject to more or less interpretation... and phone cameras in particular have to correct for some crazy diffraction effects because of the tiny sensors they use.

[–] [email protected] 4 points 9 months ago (3 children)

It seems like Vision Pro allows selective focusing.

[–] [email protected] 11 points 9 months ago (1 children)

But then you'd have to use the Vision Pro...

[–] [email protected] 3 points 9 months ago

Wouldn't mind getting a second hand "like new" one with a scratched front ~~glass~~ plastic... for the right price, as long as the inner plastic lenses aren't scratched.

(I know, there's about no chance of that ever happening)

[–] [email protected] 4 points 9 months ago

But not on a static image. They use eye tracking to figure out what you're looking at and refocus the external cameras based on that.

[–] [email protected] 2 points 9 months ago* (last edited 9 months ago) (1 children)

It's actually a great idea: an up-to-date light field camera combined with eye tracking to adjust focus. It could work right now in some VR headsets, and presumably the same presentation could work without VR via a front-facing two-camera smartphone array (maybe even a single camera with good calibration).

[–] [email protected] 2 points 9 months ago

Yup, I was seriously considering getting the Lytro, just to mess around. The main problem is the resolution drop from needing multiple sensor pixels per "image pixel", while still having to store them all. If you wanted a 10 Mpx output image, you might need a 100 Mpx sensor, and have to shuffle around 100 Mpx of data... just for the result to look like 10 Mpx.
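The trade-off works out as a simple multiplication: with an N x N grid of angular samples behind each micro-lens, spatial resolution drops by a factor of N squared. A back-of-the-envelope check (the numbers are illustrative, not Lytro specs):

```python
# Assumed figures for illustration only.
angular_samples = 3            # 3x3 viewing directions per micro-lens
target_output_mpx = 10         # desired output image resolution

# Each output pixel consumes angular_samples^2 sensor pixels.
sensor_mpx_needed = target_output_mpx * angular_samples ** 2
print(sensor_mpx_needed)  # 90 -> roughly the "100 Mpx sensor for 10 Mpx" above
```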

If we aim at 4K (8 Mpx) displays, it might still take some time for the sensors, and for the data processing capability on both ends, to catch up. Aiming at something like an immersive 360° capture would take even longer. Adding HDR and 60 fps video recording would push things way beyond current hardware capabilities.
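A rough data-rate estimate shows why the video case is so far out of reach. All of these numbers are assumptions for the sake of the estimate, not measured specs:

```python
# Assumed figures: a 100 Mpx light-field sensor (per the 10x example above),
# 12-bit raw samples for some HDR headroom, at 60 fps.
sensor_mpx = 100
bits_per_px = 12
fps = 60

gbits_per_s = sensor_mpx * 1e6 * bits_per_px * fps / 1e9
print(round(gbits_per_s))  # 72 Gbit/s of raw sensor data, before any compression
```

That is an order of magnitude beyond what current consumer capture pipelines move around, which is the point being made above.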