The Moon can’t ever catch a break.
For over 50 years, our lunar rock has been burdened with one conspiracy theory after another. Despite the overwhelming amount of evidence that the U.S. did indeed put Neil Armstrong and Buzz Aldrin on the Moon, there still exist legions of non-believers who think the whole moon landing was a hoax set up by NASA and the U.S. government to beat the Soviet Union.
The Moon’s latest eye roll involves Samsung’s Galaxy S21 Ultra. In my review of the $1,200 phone, I heaped praise on Samsung for focusing on the tentpoles. I was most impressed by the S21 Ultra’s quad-camera system and especially the 100x “space zoom.”
I took the below photo of the Moon on January 19 with the S21 Ultra’s 100x zoom — default camera settings with the Scene Optimizer turned on and Zoom Lock enabled for stabilization — and was stunned by the details in the image. You can see the craters on the surface. The iPhone 12 Pro Max produced a blown-out orb, which I joked could have been a picture of a blurry potato or a clove of garlic.
The comparison photo took the internet by storm and further rejuvenated the ugly feud between iPhone and Android users. The stark difference in clarity seemed too good to be true for a smartphone camera. Naturally, some people called bullshit, providing sketchy “proof” that Samsung was faking the 100x Moon photos coming out of the S21 Ultra.
So began my long, unexpected investigation into whether or not Samsung was pulling a fast one on everyone with some clever behind-the-scenes AI.
First, let’s take a look at the accusation. Several readers responded to our comparison photos calling the S21 Ultra photo an “AI trick” of sorts. YouTuber/influencer Alexi Bexi cautioned us not to “fall for AI tricks.” Bexi then presented the below video as “evidence” that Samsung was slyly overlaying textures on top of 100x zoom photos of the moon.
“Easy to Show. Tried it just now. Here: Even if you take an unsharp/messy photo of something that looks like a moon it snaps a moon(ish) texture on it. :) simple as that,” wrote Bexi. “No magic hardware just clever software (combination), huawei did same year ago.”
I remember reading about Huawei’s so-called “fake moon” photos back in 2019, but had forgotten about the controversy. The company’s flagship P30 Pro’s “moon mode” was alleged to be adding non-existent details to images of the moon taken at 50x hybrid zoom. One tester, Wang Yue, accused Huawei of not only enhancing details of the Moon, but compositing craters and moon textures on top of the phone’s 50x shot. Yue’s conclusion: the P30 Pro’s moon mode created a photo of the Moon with visible details, sure, but not a real one.
Many bloggers couldn’t replicate Yue’s testing. Huawei refuted the claims of image augmentation. It gave this statement to Android Authority:
Moon Mode operates on the same principle as other master AI modes, in that it recognizes and optimizes details within an image to help individuals take better photos. It does not in any way replace the image – that would require an unrealistic amount of storage space since AI mode recognizes over 1,300 scenarios. Based on machine learning principles, the camera recognizes a scenario and helps to optimize focus and exposure to enhance the details such as shapes, colors, and highlights/lowlights. This feature can be turned on or off easily while taking a photo. While there is a Moon Mode, the shot can still be taken without AI mode because of the periscope lens.
In other words: moon mode is merely using AI to enhance moon photos with detail and color sharpening. With no corroboration of Yue’s tests, it seemed the controversy was much ado about nothing.
Yet the accusation persists. Tech influencers like Bexi still believe Huawei phones fake their moon photos. Why? I think I know: because Huawei cheats. The company has been caught several times lying about the authenticity of photos it claimed were snapped with its phones. In 2018, Huawei Egypt tried passing off selfies taken with a DSLR as shots from its Nova 3i. Then, in 2019, Huawei was again caught pretending sample photos taken with a DSLR were from its P30. Taken together, it’s no wonder people don’t trust Huawei and, by extension, suspect Samsung of the same cheating.
Back to Bexi’s “evidence” of the S21 Ultra adding a texture to moon-shaped objects detected by the camera. I admit, the video immediately raised suspicion. So I tried to replicate it. I snapped some 100x photos of a clove of garlic on a black background. Nothing. My S21 Ultra review unit didn’t overlay any textures on top, despite the chunk of garlic looking like a moon. I opened up a photo of the Moon taken from Google Images on my computer monitor and then tried again. Nothing again.
I grew annoyed (but also somewhat relieved) that I couldn’t reproduce the test in the video. So I enlisted help from my friend Michael Fisher, whom you might know as MrMobile. Unbeknownst to me, he had taken an almost identical photo of the Moon with the S21 Ultra’s 100x zoom a day before me. The top reply to his tweeted photos was from YouTuber Danny Winget, who suggested the photos weren’t genuine. “Take this same shot without scene detection,” he wrote. I asked Fisher if he thought our Moon photos were secretly ‘shopped during image processing after we had pressed the shutter button.
He said he asked Samsung and was told there was no image manipulation beyond using AI to enhance details. “Given everything I've seen and heard, I don't suspect Samsung of doing anything more than enhancing native image components that are already there.” I explained my suspicion and shared the seemingly damning video. So Fisher tried to trick the S21 Ultra’s camera, too.
He placed a ping pong ball on a black blanket, but couldn’t fool the S21 Ultra’s camera into adding any textures. Even adding more luminance by setting it on top of a Galaxy Z Fold 2’s LED flash didn’t work.
We pored over why it wasn’t working. Perhaps the S21 Ultra was more intelligent in the background than we thought. Could it be measuring the angle of the camera (aimed at the sky) and/or using GPS coordinates to confirm that a user actually has their phone aimed at the Moon? It seemed unlikely, but other tech YouTubers I spoke to, like Dave Lee, aka Dave2D, didn’t rule out the possibility.
What exactly was going on? Is it widely accepted that the photos of the Moon taken at 100x on the S21 Ultra are “fake”?
Real or fake?
“Yeah it’s a AI overlay I believe like Huawei did in the past,” Winget told me when I asked him for his thoughts on my and Fisher’s Moon photos. “If it was real it would work without scene detection. Did people actually think this is real though?”
The score so far: Real: 1 vs. Fake: 2
I asked a few more tech and camera-obsessed reviewer friends. Nobody seemed certain as to whether the Moon photos were real or not.
Android Police’s Max Weinbach, who you may know for his copious leaks on Samsung devices, didn’t think Samsung was pulling a Huawei. He shared a more technical reason why: he couldn’t find any incriminating evidence suggesting overlaid textures within the APK for Samsung’s camera app. If Samsung is really compositing textures onto moon-like shapes/objects as skeptics suggest, the files for them would be in the camera APK.
“In the Scene Optimizer libs (libraries), I found this "_ZN15superResolution7arcsoft2v125ArcSuperResolutionWrapper13SetMoonWeightEPvi," Weinbach told me. “It's ArcSoft SuperResolution for the Moon. That's about all I can find. I have a feeling it's a ML (machine learning) algorithm trained specifically for the Moon, which is probably a less intentional version of what Huawei did. From what I see in the APK, they don't have the backend for it. I can't find any hard coded overlays.”
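For the curious, that cryptic string is just a mangled C++ symbol, and its meaning can be recovered mechanically. The standard tool is binutils’ c++filt; as an illustration, here is a toy Python demangler that handles only the narrow slice of the Itanium ABI this particular symbol uses:

```python
# Toy demangler for the specific Itanium-ABI symbol Weinbach found.
# It handles only "_ZN" + length-prefixed names + "E" + simple parameter
# codes, which is enough for this one symbol; real work uses c++filt.
def demangle(sym: str) -> str:
    assert sym.startswith("_ZN")
    i, parts = 3, []
    while sym[i] != "E":            # read length-prefixed name components
        n = 0
        while sym[i].isdigit():
            n = n * 10 + int(sym[i]); i += 1
        parts.append(sym[i:i + n]); i += n
    i += 1                          # skip the terminating 'E'
    simple = {"v": "void", "i": "int"}
    params, ptr = [], False
    while i < len(sym):             # decode parameter type codes
        c = sym[i]; i += 1
        if c == "P":
            ptr = True              # 'P' marks pointer-to-<next type>
        else:
            params.append(simple[c] + ("*" if ptr else "")); ptr = False
    return "::".join(parts) + "(" + ", ".join(params) + ")"

print(demangle("_ZN15superResolution7arcsoft2v125ArcSuperResolutionWrapper13SetMoonWeightEPvi"))
# superResolution::arcsoft::v1::ArcSuperResolutionWrapper::SetMoonWeight(void*, int)
```

Decoded, it’s a wrapper around an ArcSoft super-resolution routine with a “moon weight” parameter, which is consistent with Weinbach’s read: a tuning knob for a Moon-specific enhancement model, not an overlay.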
The score: Real: 2 vs. Fake: 2
Dave Lee believed the Moon pics to be fake or at the very least somewhat optically enhanced beyond reality. “My guess is..... it’s fake.”
“I think if it is fake, they’re doing some higher level stuff than Huawei,” Lee said. “Optically speaking, it’s VERY unlikely they’re pulling it off without some trickery going on. Thing is, there’s a ton of software happening on that shot already. Even without a moon. So it wouldn’t surprise me if they just added in the moon stuff.”
Lee touched on an interesting point later in our conversation as the conspiracy theory got more absurd. “I doubt it would be just an overlay (if it actually is fake). My guess is that it has a map of the Moon in the software. It knows all the craters and mountains. So if you snap a pic, it knows how to... accentuate features it can see in your actual photo. And accentuate it in a way that makes it much cleaner but at the same time keeps it ‘real’ to what you actually shot on the cam. (This is entirely a guess.)”
His hypothesis didn’t line up with Weinbach’s digital sleuthing. Weinbach confirmed to me there are no Moon maps, textures, or crater overlays inside the Samsung camera APK. Any such AI-based, machine-learned enhancements would likely be performed by the S21 Ultra’s Super Resolution algorithms.
New score: Real: 2 vs. Fake: 3
It wasn’t looking good for Samsung. I turned to veteran tech expert and YouTuber Brian Tong to see which way the score would swing. “I don’t think the S21 Ultra is making up textures that it adds on top of the Moon. Adding textures seems way too aggressive. I could understand AI maybe filling in the blanks to a certain degree.”
Final score: Real: 3 vs. Fake: 3
Not satisfied with the tie, I realized there was only one way to get closer to the truth: Shoot the Moon with a mirrorless camera and zoom lens and then compare it with a shot taken with the S21 Ultra’s 100x zoom. If the craters and positioning all line up, then Samsung’s name would be cleared. If they didn’t, then Samsung would have some serious explaining and another potential PR disaster on its hands.
Sony A7R III vs. S21 Ultra
New York City weather has not been ideal recently. It snowed earlier in the week and the sky has been so cloudy it’s been impossible to see the Moon. But when it finally cleared up, I busted out the full-frame Sony A7R III and $2,000 200-600mm rental lens to take the lunar rock’s portrait.
Here are the two Moon shots, both shot with auto settings for a level playing field. The Sony camera image was cropped to the same proportions as the S21 Ultra shot. While my objective was not to compare image quality between the $4,800 Sony camera and lens versus the $1,200 S21 Ultra, I was shocked by the images. I expected the Sony camera to destroy the S21 Ultra on account of its significantly larger full-frame sensor that takes 42.4-megapixel images. Coupled with a 600mm focal length, I didn’t think the S21 Ultra stood a chance. Boy, was I wrong.
Even mounted to a tripod, with in-body stabilization and Optical SteadyShot built into the telephoto lens, the A7R III struggled to nail a tack-sharp photo; fighting the frigid night, the camera would lock autofocus but then often lose it the split-second I pressed the shutter button. I later switched to manual focusing, but the sharpness was even worse than with autofocus.
Also surprising: I was able to get better photos with the Moon in the center of the frame shooting with the S21 Ultra handheld than with a tripod. When mounted to a tripod, the S21 Ultra’s lens would keep drifting in all directions, which made it more difficult to keep the Moon framed in the center. Earlier, I said I shot with the S21 Ultra on automatic. That means with the default settings that come out of the box: 12-megapixel nona-binned JPEGs, automatic night mode, Scene Optimizer turned on, and Zoom Lock (for locking focus on the Moon at 100x zoom).
White balance difference aside, the S21 Ultra’s Moon photo is sharper with more detail in the craters. In comparison, the A7R III photo is disappointingly soft straight out of the camera. I imported the image into Lightroom and brought back some of the highlights, boosted sharpness, and dialed up the texture and clarity just a little bit to give the craters more definition to closer match the S21 Ultra’s Moon image. Here’s what that edited photo taken with the A7R III looks like and how it compares to the S21 Ultra:
Overlaid on top of one another, the surface of the Moon matches up nearly perfectly in both photos. If the S21 Ultra were compositing an image of the Moon’s surface from some kind of database of Moon maps stored within the software, it’d be very hard to get the two to match this closely. The phone would have to be collecting extremely precise measurements to get the angle of the craters just right. That seems like an awful lot of work to go through just to fake moon photos; not even Samsung would bother wasting resources (and phone processing power) just to sell people on a 100x zoom.
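That eyeball overlay test can also be done numerically: slide one image across the other and look for an offset where the pixels essentially match. Here is a minimal sketch with synthetic stand-in images (real shots would be loaded from disk and scaled to the same size first):

```python
import numpy as np

# Rough sketch of the overlay test: brute-force the shift of one "moon"
# image that best lines up with the other. Two synthetic crater maps
# stand in for the A7R III and S21 Ultra shots.
rng = np.random.default_rng(0)
moon = rng.random((64, 64))                  # stand-in for the A7R III shot
phone = np.roll(moon, (3, -2), axis=(0, 1))  # same surface, slightly shifted

def best_offset(a, b, max_shift=5):
    """Return the (dy, dx) shift of b that minimizes mean squared error vs a."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = np.mean((a - np.roll(b, (dy, dx), axis=(0, 1))) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best, best_err

offset, err = best_offset(moon, phone)
print(offset, err)  # a near-zero error means the craters genuinely line up
```

If the phone were pasting in a stock Moon texture, no shift would drive the error toward zero against an independently captured photo; the fact that real comparisons line up is exactly the evidence above.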
AI is the secret weapon
Now that we’ve established that the Moon photos from the S21 Ultra are most definitely not fake, how is Samsung pulling off the seemingly impossible? How is the S21 Ultra’s 100x zoom taking a photo that bests even a $4,800 camera setup? Simple: AI.
Samsung hasn’t hidden this fact, either. In an in-depth look at the S21 Ultra’s camera technologies, Samsung says AI Super Resolution is responsible for producing sharper-than-the-naked-eye-can-see photos at 10x to 100x zoom.
From Samsung (emphasis ours):
From 10x to 100x zoom, image quality is boosted by powerful Super Resolution AI. At one push of the shutter, up to 20 frames are captured and processed at instantaneous speeds. Advanced AI then evaluates and corrects thousands of fine details to produce detailed images even at high magnification levels. And when shooting at high magnifications, Zoom Lock uses intelligent software to set the image in place so you can shoot with minimal shake.
Skeptics might read this and say, Samsung openly admits to using AI to enhance a photo. Yes, they do, but enhancement AI (i.e. sharpening details, correcting and/or improving exposure, and adjusting color, etc.) is no different than the computational photography that is available on other phones from the iPhone to the Pixel. Good HDR photography on smartphones would not be possible without AI image processing and all respectable camera phones now use it to produce their photos.
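To make the multi-frame idea concrete, here is a toy version of the trick Samsung describes: averaging many aligned noisy exposures of the same scene cuts random sensor noise by roughly the square root of the frame count. This is a simplification (real pipelines also pick a reference frame, align, and weight frames), but it shows why 20 captures beat one:

```python
import numpy as np

# Toy multi-frame merge: average N noisy exposures of the same scene.
# Assumes frames are already aligned; random noise falls off ~ sqrt(N).
rng = np.random.default_rng(42)
scene = np.linspace(0, 1, 256).reshape(16, 16)            # "true" scene
frames = [scene + rng.normal(0, 0.1, scene.shape) for _ in range(20)]

single_noise = np.std(frames[0] - scene)   # noise in one exposure
merged = np.mean(frames, axis=0)           # naive 20-frame merge
merged_noise = np.std(merged - scene)      # noise after merging
print(f"one frame: {single_noise:.3f}, 20-frame merge: {merged_noise:.3f}")
# merged noise should land roughly around 0.1 / sqrt(20) ≈ 0.022
```

None of this requires pasting in foreign imagery: every pixel in the merged result comes from light the sensor actually captured.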
The key difference between a fake photo and an enhanced or adjusted photo is the addition of separate imagery. Using an algorithm to edit a photo for clarity to compensate for the deficiencies of a phone’s image sensor size and limited optics isn’t the same as producing a new image with details from another photo taken from elsewhere. There’s a major difference between correction and addition. At least, that’s where I draw the line. Other photographers and creators may feel differently.
For completeness, I asked Samsung for clarity on capturing the Moon with the S21 Ultra’s 100x zoom and after several rounds of back and forth got some very good technical answers, which I’m including in full below.
First, I asked Samsung why a detailed photo of the Moon could only be taken with the Scene Optimizer turned on. If you turn it off, you get an overexposed white ball that looks like this instead of the above shots:
Here’s what a Samsung spokesperson told me:
The yellow moon icon is from Scene Optimizer to signal that it identified a nighttime scene. In dark/low lighting environment – users may also get a pop-up suggesting to use Night Mode. Once the camera detects an object (in this case the Moon) and its brightness level, it will then control the exposure level according to the Moon’s brightness (which can be very intense). Under this scenario, the camera will use deep AI leading algorithm to enhance texture details and restoration. Based on this, no photo substituting is performed.
Without Scene Optimizer turned on, the S21 Ultra can’t identify the object as the Moon and run its AI algorithms to tweak camera settings for a proper exposure. You can think of the AI as a custom moon preset that adjusts the camera’s exposure compensation, shutter speed, ISO — all of those settings, handled through machine learning instead of hardware — to get you a clean Moon photo.
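If it helps, here is a hypothetical sketch of that preset idea. Every label, value, and function name below is invented for illustration (Samsung’s actual pipeline is far more involved); the point is only that a detected scene label can drive exposure choices:

```python
# Hypothetical scene-preset logic (all names and values are made up).
# Once a classifier labels the scene, the camera applies matching settings.
PRESETS = {
    # label: (exposure compensation in EV, shutter speed, ISO)
    "moon":  (-2.0, "1/250", 50),    # the Moon is a bright, sunlit rock
    "night": (+1.0, "1/10", 3200),   # typical dim street scene
}

def settings_for(scene_label: str):
    """Fall back to neutral auto settings for unrecognized scenes."""
    return PRESETS.get(scene_label, (0.0, "auto", "auto"))

print(settings_for("moon"))    # (-2.0, '1/250', 50)
print(settings_for("garlic"))  # (0.0, 'auto', 'auto')
```

That fallback branch is also why the garlic-clove and ping-pong-ball tests came back untouched: an unrecognized object never triggers the Moon treatment.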
I grilled Samsung for more information on the AI Super Resolution algorithm, asking it to walk everyone through the steps of the image processing. When I pressed about Weinbach’s discovery that there’s code in the camera APK referencing ArcSoft and the Moon, Samsung declined to confirm specifics involving the third-party software, instead offering this statement:
Scene Optimizer was developed in-house and is Samsung's own proprietary solution. During the development process, we collaborated with several 3rd party software companies, particularly for AI big data and deep learning modules. Scene Optimizer, when first introduced with Galaxy Note9 could recognize 20 scenes. Since then it has evolved to recognize more diverse scenes, as well as a more sophisticated optimization process. Galaxy S21 series offer more than 30 scenes for Scene Optimizer.
ArcSoft, for what it’s worth, lists Samsung (among others like Qualcomm and MediaTek) as a licensee of its image and video solutions for phones.
I asked Samsung to specifically explain what was happening in Bexi’s video where he was able to see an alleged texture applied onto a moon-like object. Here’s Samsung’s detailed breakdown of what’s likely happening in that video (emphasis ours):
When taking a photo with the Galaxy S21 cameras and Scene Optimizer is activated, once AI recognizes the object/scene it will work through every step of processing. AI will first start by detecting the scene/image at the preview stage by testing it from an AI model trained on hundreds of thousands images. Once the camera detects and identifies the image as a certain scene, for example, the Moon, then offers a detail enhancing function by reducing blurs and noises. Additionally in low light/high zoom situations, our Super Resolution processing is happening (I.e., multi-frames/multi-exposures are captured > A reference frame is selected > Alignment and Registration of multi-frame/multi-exposures > Solution Output). The actual photo will typically be higher quality than the camera preview. This is due to the additional AI-based multi-image processing that occurs as the photo is captured.
For example, when taking photos of an object, 3 key elements are taken into place. Object detection (when scene optimizer is enabled), powerful AI processing and multiple frames enhancement. Each one of these features plays a critical role in order to deliver quality photos. When combined, these features generate the proper balance between a natural look and detail. The process starts by identifying an object based on a realistic human eye view, then multi-frame fusion and upscaling adds on by generating a higher level of detail to the subject, finally leveraged by AI deep learning solution it uses contextual assumption to process and piece together all the information to delivering a high quality result.
No image overlaying or texture effects are applied when taking a photo, because that would cause similar objects to share the same texture patterns if an object detection were to be confused by the Scene Optimizer.
We know some users will want to capture images without AI involvement which is why disabling Scene Optimizer is a simple, convenient option. Simply press the icon to disable.
And just to reiterate there’s no image overlays at any point during image processing (before, during, and after a shot is taken), Samsung told me “the Scene Optimizer process is not overlaying any textures or adding fake images.”
3,500-something words into this investigation, and between my own comparison, the lack of Moon overlay photos or maps within the camera’s software, and Samsung’s own detailed explanations, I feel confident that there is no faking going on with 100x Moon photos. The S21 Ultra is doing a ton of correction on a 100x photo of the Moon, and I have no reason to believe any addition of third-party imagery is happening. The S21 Ultra’s 100x zoom (with intelligent software tuning) is really that impressive and gives it a considerable edge over other phones.
But I also want to include one caveat: the S21 Ultra’s Scene Optimizer will not suddenly make all 100x zoom photos look as crispy as the Moon. Samsung flat-out says the Scene Optimizer can recognize “more than 30 scenes.” That includes the following according to a spokesperson:
Food, Portraits, Flowers, Indoor scenes, Animals, Landscapes, Greenery, Trees, Sky, Mountains, Beaches, Sunrises and sunsets, Watersides, Street scenes, Night scenes, Waterfalls, Snow, Birds, Backlit, Text, Clothing, Vehicle, Shoe, Dog, Face, Drink, Stage, Baby, People, Cat, Moon.
Scenes and objects that aren’t recognized by the Scene Optimizer will likely look like grainy mush at 100x zoom. So take that into consideration when using the S21 Ultra’s max zoom.
Honestly, I can’t believe I spent this many words debunking such a silly conspiracy theory. But consider the case closed (for now). Now, if you’ll excuse me, I have to return to my regularly scheduled programming of dunking on flat earthers and people who believe in UFOs.