Apple iPhone 11 Pro rear camera (Image: Apple)

What is Apple Deep Fusion? New iPhone 11 and 11 Pro AI camera tech explained

Artificial intelligence-driven camera feature arrives with iOS 13.2 update


When Apple announced the iPhone 11 and iPhone 11 Pro last month, the company spoke about a new camera feature called Deep Fusion, which would be launched after the phones went on sale.

Now that day has arrived: Deep Fusion is part of the iOS 13.2 beta, which is available to developers now and will roll out to everyone else soon.


Taking an artificial intelligence-driven approach to photography, Deep Fusion aims to reduce noise and graininess in low- and medium-light photos. It does this by capturing several frames before you press the shutter button and one more when you do. These images are then analyzed and merged to produce the best possible result, with the most detail and the least noise and grain.

Apple demonstrated Deep Fusion at the iPhone 11 launch by showing photos of knitted sweaters, then explaining how the AI system helps to boost sharpness in the woven fabric. Deep Fusion is intended to improve indoor photography, where light levels are often lower than ideal.


Deep Fusion works automatically whenever the iPhone 11's processor decides it is needed. There is no on-screen indication that it is running, and the metadata of the photos it produces shows no trace of Deep Fusion either. The only tell is to open a photo immediately after taking it: if Deep Fusion was used, the image will flick from its initial quality to the Deep Fusion version a moment later, as the processor finishes its work.

Apple says the standard lens of the iPhone 11 uses the regular Smart HDR feature most of the time. Deep Fusion then takes over in medium lighting, before Night Mode takes the reins in darker environments.

The telephoto lens of the iPhone 11 uses Deep Fusion for most photos, with Smart HDR only used in brightly lit scenes. Finally, the ultrawide lens always uses Smart HDR and does not use Deep Fusion or Night Mode.
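
To make that lens-by-lens behavior concrete, here is a minimal Swift sketch of the decision logic as Apple describes it. The enums, brightness thresholds and function name are illustrative assumptions for this article, not anything drawn from Apple's actual code.

```swift
// Hypothetical sketch of the per-lens capture-mode logic described above.
// Enums, thresholds, and names are assumptions, not Apple's implementation.

enum Lens { case ultraWide, wide, telephoto }
enum CaptureMode { case smartHDR, deepFusion, nightMode }

/// Picks a capture mode from the lens in use and an estimated scene
/// brightness (0.0 = pitch black, 1.0 = very bright).
func captureMode(for lens: Lens, sceneBrightness: Double) -> CaptureMode {
    switch lens {
    case .ultraWide:
        // The ultrawide camera always uses Smart HDR.
        return .smartHDR
    case .wide:
        // Bright scenes use Smart HDR, medium light uses Deep Fusion,
        // and dark scenes fall back to Night Mode.
        if sceneBrightness > 0.6 { return .smartHDR }
        if sceneBrightness > 0.2 { return .deepFusion }
        return .nightMode
    case .telephoto:
        // The telephoto camera prefers Deep Fusion except in bright light.
        return sceneBrightness > 0.8 ? .smartHDR : .deepFusion
    }
}

// Example: an indoor shot on the standard wide camera lands on Deep Fusion.
print(captureMode(for: .wide, sceneBrightness: 0.4))  // deepFusion
```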

The iPhone 11 camera app: Deep Fusion is automatically used in medium-light environments (Image: Apple)

What does Deep Fusion do?

Apple explained at the iPhone 11 reveal how Deep Fusion works, and how it starts taking photos before you press the shutter button. In fact, the system shoots eight frames before you tell it to take a photo — four at a higher shutter speed and four at regular speed.

Then, when you actually press the shutter button the iPhone 11 takes a single long-exposure shot. The system then scans each image — a total of 24 million pixels — picks out the short-exposure image with the most detail, then works to improve texture and detail, and reduce noise.
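
As a rough illustration of that capture sequence, the sketch below models the frame buffer in plain Swift. The Frame type and its sharpness score are invented stand-ins; Apple has not published how it measures detail or performs the actual fusion.

```swift
// Illustrative model of the frame-selection step described above.
// The Frame type and sharpness metric are invented for this example.

struct Frame {
    enum Exposure { case short, standard, long }
    let exposure: Exposure
    let sharpness: Double   // stand-in for a real detail/contrast measure
}

// Eight frames are buffered before the shutter press:
// four short exposures and four standard exposures.
var buffer: [Frame] =
    (0..<4).map { _ in Frame(exposure: .short, sharpness: .random(in: 0...1)) }
    + (0..<4).map { _ in Frame(exposure: .standard, sharpness: .random(in: 0...1)) }

// Pressing the shutter adds one long exposure to the set.
buffer.append(Frame(exposure: .long, sharpness: 0.5))

// The short exposure with the most detail becomes the reference frame
// that the remaining frames are fused against.
let reference = buffer
    .filter { $0.exposure == .short }
    .max(by: { $0.sharpness < $1.sharpness })!

print("Fusing \(buffer.count) frames around the sharpest short exposure (\(reference.sharpness))")
```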

When working out how best to improve details, the iPhone uses artificial intelligence to understand the subject of the photograph.

The sky and low-detail objects like plain, flat walls are deemed of least importance, while skin, hair and fabrics (like those sweaters Apple is so proud of) are ranked more highly. Deep Fusion works to pull out as much detail from these areas as possible, rather than merely boosting detail in the shadows and highlights of the entire photo at once, as HDR does. Deep Fusion also takes tone, color and luminance into account.
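
That prioritization can be pictured as a simple weighting table, as in the hedged Swift sketch below. The region categories and weight values are illustrative guesses only, chosen to mirror the ranking Apple describes.

```swift
// Hedged sketch of the region-weighting idea described above: some subject
// types get more aggressive detail recovery than others. The categories and
// weights are illustrative, not Apple's numbers.

enum Region { case sky, flatWall, skin, hair, fabric }

/// Relative priority for detail enhancement per region type.
let detailWeight: [Region: Double] = [
    .sky: 0.1,
    .flatWall: 0.2,
    .skin: 0.8,
    .hair: 0.9,
    .fabric: 1.0,
]

/// Scales a raw detail-boost amount by how important the region is,
/// so a sweater gets far more texture recovery than a blank wall.
func detailBoost(for region: Region, base: Double) -> Double {
    base * (detailWeight[region] ?? 0.5)
}

print(detailBoost(for: .fabric, base: 1.0))   // 1.0
print(detailBoost(for: .flatWall, base: 1.0)) // 0.2
```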

Apple used this photo to demonstrate the detail retention of Deep Fusion (Image: Apple)

The final image is then produced and dropped into your camera roll. Apple says this entire process takes around one second, and while that is certainly fast for something so complex, it means Deep Fusion is not available when the iPhone 11 or 11 Pro camera is set to shoot bursts of images.

And, as we said earlier, the iPhone 11 and iOS 13 don't alert you when Deep Fusion is being used. Apple wants the system to disappear into the background and only kick in when the processor deems it necessary.

All of this sounds mighty impressive, and it certainly moves Apple's photography game forward from the iPhone XS of 2018. But the real test will come when Google announces the Pixel 4 on October 15. Google has built a reputation in recent years as a market leader in computational photography, beating all other smartphones with software and a single lens.

The Pixel 4 is expected to have a second lens, plus what will no doubt be improved computational photography over the Pixel 3. All of this means the iPhone 11 and Deep Fusion will have a serious fight on their hands later this month.


