"Deep Fusion" explained: First look at Apple's most innovative camera feature

May 2024 · 2 minute read
The new iPhone 11 family comes with new and better cameras, but one ground-breaking camera feature will not be available on these iPhones from the very start: it will instead arrive later as a software update.

Apple calls this special feature "Deep Fusion". It is a brand-new way of taking pictures in which the Neural Engine inside the Apple A13 chip uses machine learning to generate the output image.

The result is a photo with a stunning amount of detail, great dynamic range, and very low noise. Deep Fusion works best in low to medium light.

Phil Schiller, Apple's head of marketing and self-described camera enthusiast, demonstrated the feature with a single teaser picture and explained how it works.

How "Deep Fusion" works:

1. Before you even press the shutter button, the camera has already captured four short frames and four standard frames.
2. When you press the shutter, the camera takes one additional long-exposure frame.
3. In about one second, the Neural Engine analyzes the set of frames, selects the best among them, and fuses them pixel by pixel, going through all 24 million pixels to optimize for detail and low noise.
This is truly the arrival of computational photography. Apple claims it is the "first time a neural engine is responsible for generating the output image", and, in typical Apple fashion, jokingly calls it "computational photography mad science". Whichever description you prefer, we can't wait to see this new era of photography on the latest iPhones this fall (if you are wondering exactly when, there are no specifics, but past practice suggests the end of October).
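To get an intuition for what "fusing images pixel by pixel" can mean, here is a toy sketch in Python. It is emphatically not Apple's pipeline (which runs a trained neural network on the A13's Neural Engine); it is a minimal, hand-rolled stand-in that weights each frame per pixel by local gradient magnitude, a crude proxy for "detail", and then blends the stack. All function names and parameters are our own illustrative choices.

```python
import numpy as np

def fuse_frames(frames):
    """Toy multi-frame fusion, loosely in the spirit of Deep Fusion.

    Each frame is weighted per pixel by its local gradient magnitude
    (a rough stand-in for 'amount of detail'), then the stack is
    blended pixel by pixel. `frames` is a list of same-shaped 2-D
    float arrays (grayscale, values in 0..1).
    """
    stack = np.stack(frames)                     # shape (n, H, W)
    weights = []
    for f in stack:
        gy, gx = np.gradient(f)                  # per-pixel gradients
        weights.append(np.hypot(gx, gy) + 1e-6)  # avoid all-zero weights
    w = np.stack(weights)
    w = w / w.sum(axis=0, keepdims=True)         # normalize per pixel
    return (w * stack).sum(axis=0)

# Demo: a noisy flat frame vs. a frame with a sharp edge.
# Near the edge, the fused result leans toward the detailed frame.
rng = np.random.default_rng(0)
flat = np.full((8, 8), 0.5) + rng.normal(0, 0.01, (8, 8))
edge = np.zeros((8, 8))
edge[:, 4:] = 1.0
fused = fuse_frames([flat, edge])
print(fused.shape)
```

The real feature selects and merges nine exposures with a learned model rather than a fixed heuristic, but the basic idea is the same: decide, pixel by pixel, which frame contributes the most useful information.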
