Michael Tsai – Blog – HDR in Forthcoming Halide 3


Ben Sandofsky (tweet, Hacker News):

Then the algorithms combined everything into a single “photo” that matches human vision… a photo that was useless, since computer screens couldn’t display HDR. So these researchers also came up with algorithms to squeeze HDR values onto an SDR screen, which they called “Tone Mapping.”

[…]

HDR and Deep Fusion require that the iPhone camera capture a burst of photos and stitch them together to preserve the best parts. Sometimes it goofs. Even when the algorithms behave, they come with tradeoffs.

[…]

After considerable research, experimentation, trial and error, we’ve arrived on a tone mapper that feels true to the dodging and burning of analog photography. What makes it unique? For starters, it’s derived from a single capture, as opposed to the multi-exposure approaches that sacrifice detail. While a single capture can’t reach the dynamic range of human vision, good sensors have dynamic range approaching film.

However, the best feature is that this tone mapping is off by default. If you come across a photo that feels like it could use a little highlight or shadow recovery, you can now hop into Halide’s updated Image Lab.
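The "Tone Mapping" the quote describes — compressing HDR luminance values into the range an SDR screen can display — can be illustrated with the classic Reinhard global operator. This is a generic textbook sketch, not Halide's actual algorithm (which Lux hasn't published):

```python
def reinhard_tone_map(luminance: float, white_point: float = 4.0) -> float:
    """Compress a linear HDR luminance value into [0, 1] for an SDR display.

    Extended Reinhard operator: L_out = L * (1 + L / Lw^2) / (1 + L),
    where Lw (white_point) is the luminance that maps exactly to 1.0.
    Values above Lw are clipped to pure white.
    """
    l_out = luminance * (1.0 + luminance / white_point ** 2) / (1.0 + luminance)
    return min(max(l_out, 0.0), 1.0)

# Shadows are left nearly linear; bright highlights are rolled off smoothly
# instead of clipping abruptly, which is what preserves highlight detail.
for l in (0.1, 1.0, 4.0, 16.0):
    print(f"{l:5.1f} -> {reinhard_tone_map(l):.3f}")
```

The `white_point` parameter is the knob a "highlight recovery" slider effectively turns: raising it compresses highlights more gently at the cost of overall contrast.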

I’m still disappointed by the options that recent versions of the iOS Camera app offer here. With iPhone 13 and later, there seems to be no way to turn off Smart HDR. Enabling ProRAW may give more control for post-processing in Lightroom, but it still combines multiple captures into one.

I like that Lux is trying to address this with Halide, but I’m not sure this is the right solution for me. First, is the sensor really good enough to get all the needed data with one capture? Second, I don’t want to go through all the photos on my iPhone; I prefer to process them on my Mac.
