
Are Samsung’s Photos of the Moon Still Photos?

Samsung’s generated moon image, with the French text “Ceci n’est pas la luna” (“This is not the moon”) below it.

Moon photo via ibreakphotos

I love Cyberpunk 2077. It might actually be my favorite game. I love how gorgeous the game is. It’s a fantastic recreation of the cyberpunk genre, bouncing back and forth between sublime beauty and grotesque brutality. There are times I’ve been driving down a highway, heading to my next gig, and had to quickly stop to admire the way light scatters between buildings and highlights a provocative ad just right.

Cyberpunk 2077 has a photo mode, like many games do these days. It lets you pause, move the camera around, play with exposure, filters, focus, and other “camera” settings, and take a screenshot. It feels like lining up a shot with a real camera. But it’s not. It’s just a rendering. They’re not “photos” from Night City, they’re screenshots of a game. And yet? Sometimes they look real enough to fool you. And setting up a photo and taking the perfect shot takes skill. But it’s not “photography,” not really.

Was it my photography skills or good developers that produced this image?

The cyberpunk genre has a darkly beautiful way of predicting the future, from corporations running governments to everything being for sale. A common theme is questioning your own reality. Deckard wonders if he’s a replicant, Neo wakes up in a war with machines, and V questions whether the glitching graffiti is something wrong with their eye implants or the new cyberware in their head. Today, you can point a Samsung phone at the moon, zoom in, and see something as good as, if not better than, what you can see with your eyes. And it is not real. It’s an image processed through AI to generate what you’re seeing on the screen. So if you push the shutter button, is it a photograph, or a screenshot?

Space “Zoom”™

The Samsung Galaxy S23 Ultra has four cameras: a 200-megapixel (MP) wide-angle main camera, a 12MP ultra-wide, and two 10MP telephoto cameras. One of the telephoto pair is supposedly capable of “Super Resolution Zoom” up to 100x, and that’s where Samsung’s “Space Zoom” feature comes into play. You’ll see it featured in Samsung’s marketing material, and it’s a great way to show up your friends with their sad iPhone photos. Is that a glowing orb? A glitch? Oh, the moon? Well, check this out! Boom, the Samsung Galaxy S23 Ultra snaps a shot of the moon in seconds. It’s so detailed, it looks like you took a photo of the moon with a telescope and simply edited it into the frame.

No, that’s not what happened. Not exactly, anyway. There were accusations that another smartphone manufacturer, Huawei, was simply replacing the moon in photos with a more detailed shot, but Samsung is doing something a bit different. They’re using AI to detect the moon in the frame, keep it centered as you zoom in, and then tweak the image. The AI “knows” what the moon looks like. Because one side of the moon always faces us, with only a little wobble, the AI can easily detect it. That also means the AI knows what the moon should look like. It can use this, in combination with the images the cameras take, to “pull” the image toward what it’s “supposed” to be. The phone can “know” that one area should be lighter, another darker. It can remove noise and enhance detail because it knows what the moon looks like.
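How that “pull” actually works is Samsung’s secret sauce, but the general idea can be sketched in a few lines. Here’s a toy Python snippet of my own, not Samsung’s pipeline; the blending weight and the reference image are assumptions standing in for whatever learned model they really use:

```python
import numpy as np
from PIL import Image

def enhance_moon(capture_path, reference_path, strength=0.6):
    """Toy 'moon enhancement': blend a blurry capture toward a known
    reference image of the moon. Assumes both images are already
    cropped and aligned to the lunar disk."""
    capture_img = Image.open(capture_path).convert("L")
    reference_img = Image.open(reference_path).convert("L").resize(capture_img.size)

    capture = np.asarray(capture_img, dtype=np.float32)
    reference = np.asarray(reference_img, dtype=np.float32)

    # Pull the capture toward what the moon is "supposed" to look like.
    # strength=0 keeps the raw capture; strength=1 is pure reference.
    blended = (1 - strength) * capture + strength * reference
    return Image.fromarray(np.clip(blended, 0, 255).astype(np.uint8))
```

The real system is almost certainly a trained neural network rather than a simple blend, but the spirit is the same: the extra detail flows from what the model already knows about the moon, not from the sensor.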

Basically? It’s a clever AI-based beauty filter for the moon. You know how some beauty filters just look so real? The ones that actually process and tweak the image of a face, rather than just adding a makeup overlay, are uncanny. Your skin moves with the filter; it’s unreal. Those filters know what “beautiful” is supposed to look like: smooth skin, highlighted cheekbones, freckles, symmetry, and other superficial features. With that “in mind,” the AI can tweak your image toward what you could look like, if you looked more like that standardized ideal of beauty. This is basically that, but for the moon.

I wonder if ol’ Luna will get as self-conscious about this as these filters make us?

How’d They Get Caught?

Samsung added details to a photo of a photo of the moon, proving camera trickery. Photo via u/ibreakphotos

I love a good experiment, and this was a fun one. A Redditor, u/ibreakphotos, set up a simple one. If you point your Samsung phone at the moon, you’ll get an amazing image of the moon. But what if you point it at something that only mostly looks like the moon? Well, the Samsung Galaxy S23 Ultra is smart, but it’s not that smart. It can’t tell the difference between the real moon blurred by motion and atmospheric conditions and an image of the moon that has had its details removed and blur applied before being displayed on a computer monitor. It’ll add details all the same. Those details may be present on the moon, but they’re definitely not present in the photo before the phone adds them.
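You can recreate the setup half of that experiment yourself. A rough Python sketch (my own, not ibreakphotos’ exact script; the file names, downscale size, and blur radius are placeholder assumptions):

```python
from PIL import Image, ImageFilter

# Degrade a sharp moon photo: shrink it so the fine detail is thrown
# away for good, then blur whatever detail is left.
moon = Image.open("moon_sharp.jpg")                       # placeholder filename
small = moon.resize((170, 170))                           # downscale: craters become mush
blurred = small.filter(ImageFilter.GaussianBlur(radius=3))
blurred.save("moon_degraded.png")

# Display moon_degraded.png full-screen on a monitor, point the phone at
# it, and zoom in. Any crisp crater detail in the result came from the
# phone's model of the moon, not from the pixels on the screen.
```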

User ibreakphotos was able to prove that the Samsung phone was adding details where there couldn’t be any. They had stripped the detail out of the source image, yet despite the blur and washed-out highlights, the Samsung phone generated detail where there was none. That’s how they figured it out: the photos weren’t reality.

But does that mean they’re fake?

What’s A Photo?

Oh no, this one’s a real photo. No, it’s not; it’s a Horizon Forbidden West screenshot.


I love Sci-Fi. It examines philosophical questions beyond our current technology. For better or for worse, it often comes true. The imaginations of today are the inventions of tomorrow. Today, we see AI adding details to the moon. But why stop there?

Have you ever seen the Flatiron Building? It’s a lovely, unique building in New York City. Built at the sharp intersection of 5th Ave and Broadway, it has a distinctive triangular shape. I used to see it frequently when I worked nearby. Tourists come to NYC for a variety of reasons: the Empire State Building, Times Square, the Statue of Liberty. The Flatiron is a tourist spot too. A few years ago it was under construction, and I felt bad for the tourists. Sure, they were in my way because they, for some reason, don’t realize that it’s called a sidewalk, not a sidestand, but I still felt bad for them. They were taking photos of the building covered in an old NYC tradition: scaffolding.

I couldn’t help but think, “It’s a shame their vacation photos are ruined.” Many people can only take one vacation a year, if that, so it’s a shame their trip would be marred by a bad view of such a landmark. But what if their photos didn’t have to be as ruined as their view was? I can open Apple Maps or Google Maps and plant my virtual butt right down on 5th Ave. Apple has 3D maps of the building taken from satellite imagery and on-the-ground mapping. Google has clear photos of the detail on its sides. They could easily find and buy the rights to high-quality photos of the Flatiron taken from every angle, then map them onto a “perfect” 3D model of the building. Now, combine that with information from your camera lens, GPS, compass direction, time of day, and altitude from the onboard barometer and accelerometer, and you can figure out exactly where the phone is pointing. Hell, you might not even need the lens. When you press the shutter button, would it be a photo, or a screenshot of a generated 3D model?
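That last step isn’t science fiction; it’s mostly trigonometry. Here’s a minimal sketch of my own (a toy function, not any phone vendor’s actual API) turning a compass heading and pitch from the phone’s sensors into the direction the camera is looking:

```python
import math

def view_direction(heading_deg, pitch_deg):
    """Unit vector (east, north, up) the camera points along, given a
    compass heading (degrees clockwise from north) and a pitch
    (degrees above the horizon) from the phone's sensors."""
    h, p = math.radians(heading_deg), math.radians(pitch_deg)
    return (math.cos(p) * math.sin(h),  # east
            math.cos(p) * math.cos(h),  # north
            math.sin(p))                # up

# Combine this ray with a GPS fix, an altitude, and a 3D model of the
# city, and the phone knows exactly which face of the Flatiron it's
# aimed at -- no lens required.
```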

You Haven’t Taken “Photos” in a While

You’ve actually been doing this for a while now. Well, something like it, anyway.

Samsung’s smart scene detection isn’t just for the moon. Taking a photo of food? It’ll flatten the lighting and boost the saturation to make it look mouth-watering. A beautiful landscape? You’ll swear the photo captured more color in the rocks of the Grand Canyon than you could see with your own eyeballs. Your iPhone does similar processing; most devices do.
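A crude illustration of that kind of scene-tuned processing (a sketch with made-up adjustment values, not Samsung’s or Apple’s actual pipeline):

```python
from PIL import Image, ImageEnhance

def scene_tune(img: Image.Image, scene: str) -> Image.Image:
    """Apply a scene-dependent tweak, the way phone cameras do once
    they've classified what they're looking at."""
    if scene == "food":
        img = ImageEnhance.Contrast(img).enhance(0.9)  # flatten harsh lighting
        img = ImageEnhance.Color(img).enhance(1.4)     # juicier saturation
    elif scene == "landscape":
        img = ImageEnhance.Color(img).enhance(1.3)     # punchier color than you saw
        img = ImageEnhance.Contrast(img).enhance(1.1)
    return img
```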

Those processes aren’t quite as extensive as Samsung’s little gimmick here. This is certainly a step beyond actual photography, and even computational photography, into what can only be described as fakery. These are generated images with details that were never present on the sensor; that’s a far cry from tweaking color balance and exposure on a raw photo that already contains that data. It’s also being done without a human touch, taking the art out of an art form.

But will it matter? Does it matter to the person who just wanted a picture of the moon behind some clouds and got exactly that?

What’s Reality, Anyway?

Nope, that’s not my keyboard. Not exactly, anyway.

Apple’s working on an AR headset. That’s augmented reality. Reality, but better. Heads-up displays, live directions, giant-screen TVs, all projected onto your eyeballs without existing anywhere in space. But again I ask, why stop there?

Why not remove the scaffolding from the Flatiron Building in real time? Why even show someone’s eyes that eyesore in the way of an architectural marvel? Too much light pollution to see Orion? It’s not like the stars are going anywhere quickly. Those stars have been mapped out, and they’ve sat in roughly the same places since humanity first drew the constellations in the sand. Just show that. Show a photo of the Milky Way where it should be, so no one has to see the gross reddish-gray glow of the modern night sky. Can’t get to the front of the rock show? Zoom in. Better yet, see right through the people in front of you by tapping into a recreation made from cameras around the room.

Reality is what we make of it.

So maybe it’s not the moon. Maybe it’s not how the Flatiron looked on that day. Maybe the stars have long since disappeared because we refuse to curb light pollution.

Will it matter?
