Ben Sandofsky created one of the most popular camera apps for iOS: Halide. It's an app with a clean UI, RAW support, and depth detection features. It's that last feature that stood out to Sandofsky: as the photo above shows, the sharpness of Apple's portrait mode has improved substantially between iOS 11 and iOS 12.
The iPhone usually struggles to distinguish fine details like individual hairs or clear glasses, but Apple appears to have dramatically improved that detail through software rather than hardware. Google already manages depth detection with a single sensor on the Pixel 2, so Apple's dual-sensor setup should be more than capable of matching or beating it.
iOS 12 will release this fall. Its main improvements include a faster, more stable operating system, grouped notifications, group FaceTime, and Screen Time to help you limit overuse of your phone.
Source: Killian Bell, Cult of Mac