What is Smart HDR and how do I use it? (updated)
Smart HDR is one of the most exciting new features in the all-new iPhone XS, iPhone XS Max and iPhone XR, because it helps you take much better images using your smartphone.
What is Smart HDR?
At its simplest, Smart HDR helps you capture images with better colors, shadows and highlights. It's a progression from HDR; some are calling it "HDR on steroids".
On an iPhone, HDR is the technology that captures a series of images when you take a picture and then splices them together to create a better shot.
How does Smart HDR work?
Smart HDR takes this further. The camera continuously shoots a four-frame buffer, and also captures secondary frames at different exposures in between those buffer shots. It then combines all of these frames into one optimized image, which means that the colors, shadows and highlights will (usually) look a whole lot better.
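Apple's Smart HDR pipeline itself is private, but the core idea of capturing one scene at several exposures is something any app can try with AVFoundation's public bracketing API. Here is a minimal Swift sketch; the `photoOutput` and `delegate` values are assumed to already exist as part of a configured, running capture session:

```swift
import AVFoundation

// A minimal sketch, not Apple's Smart HDR: capture the same scene at
// three exposure biases using the public bracketing API.
func captureExposureBracket(with photoOutput: AVCapturePhotoOutput,
                            delegate: AVCapturePhotoCaptureDelegate) {
    // Under-, normally and over-exposed frames (bias in EV stops).
    let biases: [Float] = [-2.0, 0.0, 2.0]
    let bracketedSettings = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings
            .autoExposureSettings(exposureTargetBias: $0)
    }

    // Note: the count must not exceed photoOutput.maxBracketedCapturePhotoCount.
    let settings = AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0,  // 0 means no RAW, processed output only
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc],
        bracketedSettings: bracketedSettings)

    // The delegate receives one photo per exposure; a third-party app would
    // have to merge them itself, whereas Smart HDR fuses frames on-device.
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```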
Yes, but how does it work?
To achieve these results, Smart HDR uses a lot of machine intelligence, harnessing the Neural Engine built into the powerful A12 Bionic chip Apple has packed inside its new iPhones. This AI is also what makes it possible to adjust the depth of field, both in real-time preview and after capture, to create striking portraits with a beautiful bokeh effect.
The #iPhoneXR – the slider option in the pictures app enables you to adjust the depth of field 📷👌
Awesome.#AppleEvent pic.twitter.com/imiz8nP55V
— Marc Allera (@MarcAllera) September 12, 2018
The above Tweet shows another powerful feature — adjustable depth of field.
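Apple hasn't published how the Photos slider works, but you can approximate post-capture depth of field with Core Image, assuming you already have the photo and a grayscale mask derived from its depth map (AVDepthData can supply one) as CIImages. A rough sketch:

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Not Apple's Portrait pipeline, just Core Image's masked variable blur:
// pixels are blurred in proportion to the brightness of a grayscale mask.
func simulateBokeh(photo: CIImage, depthMask: CIImage, strength: Float) -> CIImage? {
    let filter = CIFilter.maskedVariableBlur()
    filter.inputImage = photo
    filter.mask = depthMask   // brighter mask pixels are blurred more
    filter.radius = strength  // this plays the role of the slider
    return filter.outputImage
}
```

Driving `strength` from a slider gives you a crude version of the adjustable effect shown in the Tweet.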
What happens when you capture an image (actually a sequence of frames that are synthesized into a single photo) is that the image signal processor works with the AI to figure out what you are trying to photograph and then deliver the best possible image as a result.
Perhaps an example might help…
Apple tried to explain the feature during its keynote. When you shoot a moving subject, Smart HDR has the camera constantly shooting into a buffer, much like Live Photos, to reduce the lag once you press the shutter. The tech also captures additional frames in between the four buffer shots. These are taken at different exposures, which helps your smart little iPhone figure out things like shadows, highlights and background objects.
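To picture that constant buffer, here is a deliberately simplified Swift sketch of the zero-shutter-lag idea; the `Frame` type and the four-frame capacity are illustrative stand-ins, not Apple's implementation:

```swift
// Toy model: the camera keeps the most recent frames in a small ring
// buffer at all times, so when you press the shutter the moment you
// wanted has already been captured.
struct Frame {
    let timestamp: Double
    let pixels: [UInt8]
}

struct FrameRingBuffer {
    private var frames: [Frame] = []
    private let capacity = 4  // mirrors the four-frame buffer described above

    // Called for every frame coming off the sensor while the camera is live.
    mutating func push(_ frame: Frame) {
        frames.append(frame)
        if frames.count > capacity { frames.removeFirst() }
    }

    // On shutter press, hand the already-buffered frames to the fusion
    // stage instead of starting a capture from scratch.
    func framesAtShutterPress() -> [Frame] { frames }
}
```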
That's when the magic happens. Apple's Neural Engine takes all of these components (the frame you caught, the buffer frames and all those inter-frame shots) and brings them together, choosing the best parts of each to synthesize the final photo. You get to take amazingly high-quality pictures without needing to think too much about it. Like these:
LOOK: Photos taken with the new Smart HDR. #AppleEvent pic.twitter.com/5nm8w6YDmA
— Rappler (@rapplerdotcom) September 12, 2018
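If you're wondering what "choosing the best parts" could mean at the pixel level, here is a toy Swift illustration of exposure fusion. It is emphatically not Apple's algorithm, just the classic trick of weighting each sample by how well exposed it is:

```swift
import Foundation

// Given the same pixel's luminance (0...1) from several differently
// exposed frames, weight each sample by how close it is to mid-gray
// and blend. Crushed shadows and blown highlights barely contribute.
func fusePixel(samples: [Double]) -> Double {
    // Gaussian "well-exposedness" weight centered on 0.5.
    let weights = samples.map { exp(-pow($0 - 0.5, 2) / (2 * 0.2 * 0.2)) }
    let totalWeight = weights.reduce(0, +)
    guard totalWeight > 0 else { return samples.first ?? 0 }
    return zip(samples, weights).map(*).reduce(0, +) / totalWeight
}

// The same pixel from under-, normally and over-exposed frames:
// the blown-out 0.98 sample barely counts.
print(fusePixel(samples: [0.12, 0.45, 0.98]))  // ≈ 0.43
```

Smart HDR does something far more sophisticated, per region and guided by the Neural Engine, but the intuition is the same.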
What does this mean?
In Apple's world, 12 megapixels is plenty, so long as you have the software, machine intelligence and technologies to optimize the results. Why would anyone want to take a lousy image on a bigger sensor? Doing so is a waste of precious storage space, after all.
How do I use it?
It's simple. Smart HDR is switched on by default on these iPhones; you can check (or toggle) it under Settings > Camera. Then choose your subject and take a picture. Your iPhone does the heavy lifting.
Announcing the feature, Apple's senior vice president of Worldwide Marketing, Phil Schiller, said: "Increasingly, what makes incredible photos possible aren't just the sensor and the lens but the chip and the software that runs on it."
Shot on iPhone
The proof of the pudding is in the eating, as they say. In this case, Apple is putting the word out that once you get hold of one of its new iPhones, you'll be able to share the pictures you take with the rest of the world on Instagram: "Have a picture the world should see? Upload it to Instagram with the hashtag #ShotoniPhone."
https://youtu.be/xL8piHkl3X8