Filmmaker says iPhone XS sensor and image processing makes low-light video ‘voodoo’ good

A filmmaker and colorist who put the iPhone XS low-light video capabilities to the test says that the results are so good that it may be better to simply let the camera app do its own thing than to use manual controls in something like FiLMiC Pro …


Richard Lackey said that the camera improvements should not be underestimated.

[Apple,] you’ve got my head spinning trying to figure out what kind of voodoo you’re pulling off in this phone […] The iPhone XS Max, XS and XR camera system seems to generate images that are beyond the sum of its parts.

While this may “just” have been an “S” year, the significance and impact of Apple’s clear direction towards sophisticated real time computational image processing should not be underestimated […]

With the iPhone XS Max I was able to capture clean video in very low light conditions. Not only was it clean, it had more color information in dark parts of the image than any previous generation iPhone I have shot with.

Low-light image processing

He said that the dynamic tone mapping employed appears to be particularly sophisticated.

Judging from the behavior of the camera in bright and normal lighting conditions, it appears that the luminance values recorded are not purely determined by a fixed gain value (ISO), or fixed gamma transform as would be the case with a “traditional” (I’ll call it “dumb”) camera.

Something else is at play, and it is dynamic, changing according to some combination of variables linked to a real time analysis of the scene. It may even be making separate localised adjustments to different parts of the image, which would be extremely impressive if true.
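The distinction Lackey is drawing can be illustrated with a toy sketch: a “dumb” pipeline applies one global gain and one fixed gamma curve to every pixel, while a locally adaptive tone mapper picks a different gain for each region of the frame based on its own luminance. This is purely an illustrative guess at the kind of processing he describes, not Apple’s actual pipeline; the tile size, mid-grey target and functions here are all invented for the example.

```python
import numpy as np

def fixed_gamma(image, gain=1.0, gamma=2.2):
    """A 'traditional' camera transform: one global gain, one fixed gamma curve."""
    return np.clip(gain * image, 0.0, 1.0) ** (1.0 / gamma)

def local_tone_map(image, tile=8, target=0.5):
    """Toy locally adaptive tone mapping: each tile gets its own gain,
    chosen so that the tile's mean luminance moves toward a mid-grey target."""
    out = image.copy()
    h, w = image.shape
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            block = image[y:y + tile, x:x + tile]
            gain = target / max(block.mean(), 1e-4)  # lift dark tiles, tame bright ones
            out[y:y + tile, x:x + tile] = np.clip(block * gain, 0.0, 1.0)
    return out

# A frame with a very dark left half and a bright right half
frame = np.concatenate([np.full((8, 8), 0.05), np.full((8, 8), 0.8)], axis=1)
globally_mapped = fixed_gamma(frame)
locally_mapped = local_tone_map(frame)
```

Under the fixed curve, the dark half stays well below mid-grey; the local mapper brings each half independently toward it, which is the “separate localised adjustments” behaviour he speculates about.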

He says the same of the noise reduction: he can normally spot the artefacts it leaves behind, but couldn’t in this case.

Whatever combination of spatial and temporal analysis is at work, it’s very very good, and probably applied quite early in the signal chain. I don’t think it’s being applied globally to the whole image either. It seems like it could be localised to just the areas of the image that will benefit […]

What surprises me the most is areas of the image in low light that I would normally expect to see a lack of detail and texture, have decent detail and texture. Go figure. Not quite sure how they are pulling this off.
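The “spatial and temporal analysis” he refers to can also be sketched in miniature. A simple temporal denoiser averages the same pixel across several consecutive frames (for static content, noise falls roughly with the square root of the frame count), and a “localised” version applies that average only in dark areas, leaving clean, bright regions untouched. Again, this is a hypothetical toy, not the XS pipeline; the threshold and frame-stack setup are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def temporal_denoise(frames):
    """Temporal analysis: average each pixel across consecutive frames."""
    return np.mean(frames, axis=0)

def shadow_selective_denoise(frame, stack, threshold=0.25):
    """Toy localised denoise: blend in the temporal average only where the
    image is dark, the region where noise is most visible."""
    averaged = temporal_denoise(stack)
    out = frame.copy()
    mask = frame < threshold          # dark pixels only
    out[mask] = averaged[mask]
    return out

clean = np.full((16, 16), 0.1)        # a dark, static scene
stack = np.stack([clean + rng.normal(0, 0.05, clean.shape) for _ in range(8)])
noisy = stack[0]
denoised = shadow_selective_denoise(noisy, stack)
```

Averaging eight frames cuts the noise in the shadows substantially while spending no processing on parts of the image that don’t need it, consistent with his guess that the denoise “could be localised to just the areas of the image that will benefit.”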

The bottom line

But most tellingly of all, a professional filmmaker actually thinks that the camera might do a better job of choosing the settings than he could himself.

I don’t see much reason to shoot with the flat or log gamma profiles. In fact I think the best results could very well come from letting the iPhone do its own thing. It may be doing a better job of maximizing recorded dynamic range through its own intelligent tone mapping than can be achieved manually.

If I didn’t still use a standalone camera for any planned shooting, this might be enough to change my mind about not upgrading to the iPhone XS. It certainly seems that the improvements are more dramatic than they first appeared.

Check out the video below to judge for yourself. It begins with a graded version, followed by the straight-from-camera footage.
