
iPhone SE Has the iPhone 8 Camera, With a Few Tricks


An iFixit teardown confirmed that the iPhone SE does not have the iPhone 11’s camera. It doesn’t have a new or unique camera either. Instead, it has the same camera that was in the iPhone 8. That’s remarkable, because the iPhone 8 didn’t even share a camera with the iPhone XR, yet the iPhone SE seems to have abilities the iPhone 8 camera lacked. The iPhone 8 camera wasn’t capable of the high quality HDR photos and depth effects on portraits that the iPhone SE and iPhone XR can produce. The iPhone XR camera had hardware features to enable this, which the iPhone SE sensor lacks.

So what’s going on?

The iPhone SE is a perfect example of how computing can solve what photographic hardware alone cannot. The iPhone SE is a better camera than the iPhone 8 because it has a better brain.

iPhone SE Hardware

(Teardown image via iFixit)

iFixit’s teardown of the iPhone SE found that it shares more than just its looks with the iPhone 8. A number of components come directly from the iPhone 8, including, it would seem, the camera module. The lens and sensor from the iPhone 8 live on in the iPhone SE. That means the camera in the iPhone SE is now three years old. Yet reviewers have said it’s still a decent camera. Why would Apple use old hardware for a new device?

Did you guess price? Yeah, it’s price. But how does this help keep the cost of the iPhone down?

This comes down to Apple’s philosophy for the iPhone SE. Rather than design a new phone from scratch, they use existing components and a simplified manufacturing and repair process to keep costs low. Apple no longer has to make white faceplates for the 4.7-inch iPhone, because they’re all black now, and that cuts down on repair costs. They can re-use the manufacturing equipment from the iPhone 8. Many of the components carry over from the iPhone 8, and the supply chains for them are well established. The processor comes from the existing iPhone 11 lineup. By pulling from their “parts bin,” Apple is able to make a low-cost iPhone with premium parts.

Depth Effects and More

When I was in high school, I discovered a photography technique that has been around longer than I’ve been alive: solarization. I flashed a light on my print as it developed, causing the dark areas to lighten and take on a halo-like glow. It was the first time I realized that so many effects in photo editing software came from real-world techniques. Even physical photos could be manipulated. Slowly, digital caught up to film, replicating many of these capabilities. Now, computational photography has surpassed what traditional cameras were capable of.

Photography used to be nothing more than playing with light. Different lenses and different techniques would paint your print with light. Now, however, our photos aren’t limited by the light reaching our sensors. They’re not even limited by reality, really. Modern cell phone cameras are taking amazing photos even when barely any light hits their sensors.

With AI, Apple is able to add missing textures, identify subjects and scenes, and layer multiple captures together in real time to produce a photo better than any single exposure of light hitting a sensor could ever be. It’s how night mode can produce so much detail, despite the fact that no single frame it captured contains that detail. As a result, even an old camera, like the one in the iPhone 8, can perform better than it ever did in the past thanks to a new processor. And the iPhone SE has just that.
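
To make that concrete, here’s a minimal sketch of the multi-frame stacking idea behind night-mode-style photography. This is not Apple’s actual pipeline (which is proprietary); it’s an illustration using OpenCV: align a burst of noisy short exposures to the first frame, then average them, so random sensor noise cancels out while real scene detail reinforces itself. The burst file names are hypothetical.

```python
import cv2
import numpy as np

def stack_frames(frames):
    """Align a burst of frames to the first one, then average them."""
    base = frames[0]
    base_gray = cv2.cvtColor(base, cv2.COLOR_BGR2GRAY)
    acc = base.astype(np.float64)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 50, 1e-6)
    for frame in frames[1:]:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Estimate the small motion between captures (handheld shake)
        warp = np.eye(2, 3, dtype=np.float32)
        _, warp = cv2.findTransformECC(base_gray, gray, warp,
                                       cv2.MOTION_EUCLIDEAN, criteria)
        aligned = cv2.warpAffine(frame, warp,
                                 (base.shape[1], base.shape[0]))
        acc += aligned.astype(np.float64)
    # Averaging N frames cuts random noise while keeping shared detail
    return (acc / len(frames)).astype(np.uint8)

# Usage: shoot a burst, then stack it.
frames = [cv2.imread(f"burst_{i}.jpg") for i in range(8)]  # hypothetical files
cv2.imwrite("stacked.jpg", stack_frames(frames))
```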

A13 Power

The hardware may be the iPhone 8’s camera, but thanks to the A13 processor, the photo that comes out of the iPhone SE isn’t the same as one coming from the iPhone 8. It uses multiple exposures and AI to compile a photo with detail that other cellphone cameras lack. This includes powerful HDR photography, revealing details that couldn’t be captured before. It also means producing 3D depth effects without the benefit of focus pixels, the hardware the iPhone XR uses to create depth-of-field effects with only one sensor.

In fact, you can test this. While both the iPhone XR and iPhone SE can blur the background of photos with human subjects, only the iPhone SE can do so without any depth information about the subject. The iPhone SE uses software, and only software, to figure out what the subject of a photo is and blur the background. That means the iPhone SE can blur the background in a photo of a photo, while the iPhone XR will recognize that the photo is flat. This won’t make your SE photos better than the XR’s; in fact, it may hurt the accuracy of some of your portrait mode shots. But it’s interesting to see that the less capable sensor in the SE doesn’t stop it from matching the photo quality of the iPhone XR.
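
Here’s a rough sketch of what that software-only approach looks like. The person-segmentation model is the genuinely hard part and is out of scope here; `person_mask` below is a hypothetical stand-in for the output of any segmentation network, not an Apple API. The point is that once software has a subject mask, no depth data is needed to fake the bokeh:

```python
import cv2
import numpy as np

def fake_bokeh(image, person_mask, blur_strength=31):
    """Blur everything outside the subject mask.

    image:       HxWx3 uint8 BGR photo
    person_mask: HxW mask, 1.0 = subject, 0.0 = background
    """
    # Blur the whole frame; we'll composite the sharp subject back in
    blurred = cv2.GaussianBlur(image, (blur_strength, blur_strength), 0)
    # Feather the mask edge so the subject doesn't look cut out
    mask = cv2.GaussianBlur(person_mask.astype(np.float32), (21, 21), 0)
    mask = mask[..., np.newaxis]  # broadcast over the color channels
    out = image * mask + blurred * (1.0 - mask)
    return out.astype(np.uint8)
```

Because the mask comes purely from what the subject looks like, this approach happily “blurs the background” of a flat photo of a photo, exactly the behavior described above.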

Mind over Matter

Large sensors and lenses can take a great photo, but even your best DSLR needs multiple shots to create high dynamic range (HDR) photos. In a scene with both very dark and very bright areas, it’s better to take two photos, one with a longer exposure to capture shadow detail and one with a shorter exposure to preserve the highlights, and then combine them. Even the best camera hardware needs a little nudge to take the photos people have come to expect from their cameras.
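
For a sense of how that combining step works, here’s a classic, non-Apple version of the idea: Mertens exposure fusion, as implemented in OpenCV. The bracketed file names are hypothetical shots of the same scene.

```python
import cv2

# Hypothetical bracketed shots: underexposed, normal, overexposed
exposures = [cv2.imread(name) for name in
             ("under.jpg", "normal.jpg", "over.jpg")]

# Exposure fusion weighs each pixel by contrast, saturation, and
# well-exposedness, then blends the stack into one image; no tone
# mapping step is needed.
merge = cv2.createMergeMertens()
fused = merge.process(exposures)  # float32 image in roughly [0, 1]
cv2.imwrite("hdr_fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```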

The iPhone SE is a surprising piece of tech from Apple. It’s just $399, yet it has Apple’s fastest mobile processor, the A13. While it doesn’t have the best camera in Apple’s lineup, its powerful processor and clever software allow it to take impossibly good photos, especially at this price point.

