
I tested the Note 20 Ultra for weeks. What is going on with this camera?

I thought that Samsung had fixed the camera on the Note 20 Ultra, but the problem is Samsung's entire imaging philosophy.

Back in 2018 I went backpacking near Yosemite and the only camera I took with me was a Galaxy S9+ with a Moment Anamorphic lens.

I wouldn’t know it until I got back and had a chance to review the pictures, but those images ended up being some of the best I’ve ever taken, and I’ve done cross-country road trips with a DSLR and expensive lenses multiple times.

Galaxy S9+ in Pro mode with Moment Anamorphic. No edits.

It makes sense, then, that when the Galaxy S20 Ultra and its 108 megapixel camera were announced, I was practically frothing at the mouth to get my hands on it. Even now, when I look back at those Galaxy S9+ pictures, the only thing I could ask for is a bit more resolution, and back in March it really seemed like Samsung had made the exact camera-focused phone that I was waiting for.

Galaxy S9+ in Pro mode with Moment Anamorphic. No edits.

Oh how wrong I would be. When Ray, our reviews editor, and I were testing the S20 Ultra we discovered the now infamous focus problem. That issue has basically been solved with software updates, but at the time it was bad. By the end of our first day of testing my excitement had completely collapsed. I ended up returning my S20 Ultra.

So when Samsung announced the Note 20 Ultra and its spiffy new laser autofocus, I figured I was all clear to throw down my hard-earned cash and put in a pre-order. What I didn't know was that underneath the autofocus problems lurked a bigger, more existential problem: Samsung's notorious image processing. And it didn't get better with more megapixels, it got much, much worse.

Galaxy S9+ in Pro mode with Moment Anamorphic. No edits.

Like I said before, I couldn't have been more pleased with the photos I got out of my Galaxy S9+. In the pictures you see above (and more that I will include throughout the piece), you can see that these images have a real mood to them. The colors are right on the money, even in RAW mode, and the exposure is basically perfect. It handled those bright summer days fantastically thanks to the Dual Aperture technology that Samsung perplexingly dropped.

Now, I acknowledge that photography is subjective, but there are two things I know to be true: The pictures I'm getting out of the Note 20 Ultra are in many ways worse than the ones I got out of the Galaxy S9+, and no matter how much of a Samsung fan you are, surely you can agree that there is a line where images are so heavily processed that they don't look better, they just look worse.

I think that Samsung has crossed that line.

Not all HDR is created equal

Multi-frame HDR, if you're not familiar, is a technique where the camera takes several images at different exposures and combines them to create an image with more dynamic range than one image by itself would provide. This technique was popularized by the Google Pixel to great effect, but now almost every phone employs some form of multi-frame HDR.
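If you're curious what that merging actually looks like, here's a deliberately simplified Kotlin sketch. It's my own illustration of basic exposure fusion, not Samsung's or Google's actual pipeline (which adds frame alignment, tone mapping, and much more): each output pixel is a weighted blend of the same pixel across bracketed frames, trusting well-exposed values over clipped ones.

```kotlin
import kotlin.math.exp

// Toy exposure fusion over grayscale frames normalized to 0.0..1.0.
// Real pipelines align frames and merge in RAW space; this just shows
// the core idea of weighting well-exposed pixels more heavily.
fun mergeExposures(frames: List<DoubleArray>): DoubleArray {
    val merged = DoubleArray(frames[0].size)
    for (i in merged.indices) {
        var weightedSum = 0.0
        var weightSum = 0.0
        for (frame in frames) {
            val v = frame[i]
            // Gaussian weight centered on mid-gray: crushed shadows and
            // blown highlights contribute almost nothing, which is how
            // the merge recovers detail at both ends of the range.
            val w = exp(-(v - 0.5) * (v - 0.5) / (2 * 0.2 * 0.2))
            weightedSum += w * v
            weightSum += w
        }
        merged[i] = if (weightSum > 0) weightedSum / weightSum else frames[0][i]
    }
    return merged
}
```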

Samsung's multi-frame HDR, as you might expect, is doing a lot. The Auto mode on the Note 20 Ultra takes a variable number of frames depending on the scene, whether you're using the 108 megapixel mode, and how much of the image processing buffer is left. But this algorithm isn't just lifting exposure, it's doing noise reduction and pushing clarity, texture, and sharpness to the max.

In the 12 megapixel mode, the pictures coming out of the Note 20 Ultra look way overprocessed, but it's not exactly clear what makes them look that way. When you enable the 108 megapixel mode, however, we get enough detail to make out what's really going on, and it isn't pretty.

Please click on the links below each image to see them at maximum zoom.

Why does it look like that?! See the full original file here.

Now, whatever this processing is that makes rock and moss look like they're made of worms, it's not artifacting from me enlarging the image beyond 100 percent; it's just what the image looks like at 100 percent. I've asked multiple Samsung employees about what it is that we're seeing here, but have yet to get an answer that feels sufficient.

And sure, you could say that you're not "supposed" to pixel peep these images at 100 percent, but if an image looks bad at 100 percent, that will translate to some deleterious effects when you're zoomed out as well. Garbage in, garbage out, as the saying goes. Also, zooming in on 108 megapixel images is literally what the mode is for. Here's what the marketing copy on Samsung's website says:

"Millions more pixels means exponentially more detailed photos. With 108MP on Galaxy Note20 Ultra 5G and 64MP on Galaxy Note20 5G, you can pinch in on the background and see it just as clearly as the whole picture."

Low-light or bust

The Note 20 Ultra on top, Sony Xperia 1 II on the bottom.

To say that Samsung, especially in its recent phones, is allergic to shadows would be an understatement. In fact, almost everything about the Note 20 Ultra camera is designed and engineered to make sure pictures are bright, even if it means tossing daytime picture quality out the window.

The Note 20 Ultra can't expose the subject and the rocks at the same time.

Two factors are at play. The first is obvious: the multi-frame HDR algorithm is lifting shadows like crazy. But the physical design of the camera and lens plays a major role, too.

Remember when I mentioned the Dual Aperture technology Samsung included on the Galaxy S9 and S10 series of phones? Why do you think Samsung made that feature? Many reviewers believed Samsung when it said that the Dual Aperture was for "low-light" photography, but that's really only half of the story.

See, on the Galaxy S9 and S10, the maximum aperture was f/1.5, or a little bit brighter (when controlling for sensor size) than the Note 20 Ultra's fixed f/1.8 aperture. At f/1.5, the camera on the S9 and S10 could let in a lot of light, but it came at the expense of sharpness. If you read a review of a camera lens with a wide aperture, a common thing you'll see is that the lens is "a little soft wide-open" and that it "sharpens up when you stop it down." The same is true on phones.
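For the curious, that difference is easy to quantify: light gathering scales with the inverse square of the f-number, so f/1.5 versus f/1.8 works out to roughly 1.4x the light, or about half a stop. Here's the back-of-the-envelope math as a runnable Kotlin snippet:

```kotlin
import kotlin.math.ln
import kotlin.math.pow

fun main() {
    // Light gathered scales with the inverse square of the f-number.
    val ratio = (1.8 / 1.5).pow(2)   // ≈ 1.44x more light at f/1.5
    val stops = ln(ratio) / ln(2.0)  // ≈ 0.53 of a stop
    println("f/1.5 gathers %.2fx the light of f/1.8 (%.2f stops)".format(ratio, stops))
}
```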

Apparently Samsung decided that the f/1.8 lenses on the S20 and Note 20 series are sharp enough, and strike a good balance between sharpness and low-light photography. Considering how many of my Note 20 Ultra photos have blown-out highlights, like the picture above, I would tend to disagree.

A close-up photo from the Note 20 Ultra. No edits.

You see how this close-up shot of this little fungus looks blurry and kind of terrible? That probably wouldn't be the case if Samsung had included the Dual Aperture tech on the Note 20 Ultra. How do I know that? Well, when I was testing out the Moment Lens Tele at Engadget, I actually showed that this weird blurry effect is significantly reduced when you manually switch the aperture to f/2.4 on the Galaxy Note 9. I can't embed the comparison tool here, so you should visit the article, scroll halfway down until you see the teddy bear, and see for yourself.

So not only is Samsung's software doing the most to brighten photos, which often results in clipped highlights that in turn exacerbate the multi-frame HDR artifacts shown in the section above, but the hardware itself is hyper-optimized toward low-light photo and video. And yeah, the Note 20 Ultra and its large 1/1.3-inch sensor really are great at low-light photography (see below). I just wish that it was also great at daytime photography.

My lovely partner and a friend just after sunset. Note 20 Ultra, no edits.

And it was all yellow

One of the biggest bummers about shooting with the Note 20 Ultra is that the colors just look bad, especially when I look back at how great the colors were on the Galaxy S9+.

Galaxy S9+ in Pro mode. No edits.
Note 20 Ultra in Pro mode. No edits.

Both of the pictures above are RAW photos taken in Samsung's Pro mode and converted straight to JPEG. In my opinion, the colors in the Galaxy S9+ picture are about as perfect as you could ask for, especially for an image converted from a RAW. By comparison, the pictures coming out of the Note 20 Ultra look positively sickly.

In a nutshell, Note 20 Ultra pictures are very heavy in the yellow channel. This could be an auto white-balance problem, or it could be a side-effect of the nona-binning, or it could just be that Samsung hasn't gotten around to really tweaking the colors in the firmware. I can say that when I was shooting with the Galaxy S9+ (late summer of 2018), it had already received several firmware updates, some of which undoubtedly contained updates to the camera.
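Of those three possibilities, a white-balance miss is the easiest to picture in code. Here's an illustrative gray-world correction in Kotlin, my own sketch rather than anything Samsung's ISP actually does: scale each channel so the scene averages out to neutral gray, which would pull an overly yellow image back toward normal.

```kotlin
// Gray-world auto white balance over RGB channels normalized to 0.0..1.0.
// A yellow cast means R and G run hot relative to B; these gains pull the
// channel averages back to a common neutral gray.
fun grayWorldBalance(r: DoubleArray, g: DoubleArray, b: DoubleArray) {
    val gray = (r.average() + g.average() + b.average()) / 3.0
    val gainR = gray / r.average()
    val gainG = gray / g.average()
    val gainB = gray / b.average()
    for (i in r.indices) {
        r[i] = (r[i] * gainR).coerceIn(0.0, 1.0)
        g[i] = (g[i] * gainG).coerceIn(0.0, 1.0)
        b[i] = (b[i] * gainB).coerceIn(0.0, 1.0)
    }
}
```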

Of course, the camera in the Note 20 Ultra has been out since it made its debut in the S20 Ultra earlier this year, but who knows if Samsung brought the Note 20 Ultra's firmware up to parity with the S20 Ultra's? That would make sense considering that they're basically the same phone, but we're talking about Samsung here, so anything is possible. If you're wondering, I shot all the pictures in this piece on the firmware the Note 20 Ultra shipped with out of the box.

No way out

Raymond Wong / Input.

This is Android we're talking about, so if you don't like what Samsung's camera app is doing, you can just use another one, right?

Haha. No.

Samsung's camera system is more locked down than Apple's, and, recently, Sony's. Third-party apps can't even see the (stellar) telephoto camera on the Note 20 Ultra, or any other Samsung phone for that matter. Why? Well, I've asked Samsung about this in a number of ways and have only gotten, frankly, utter nonsense back by way of justification.

Why does this matter? Well, Adobe recently poached the head of Google's Pixel imaging division, Marc Levoy, to make a universal camera app that does the kind of multi-frame HDR that made the Pixel famous. We don't know what kind of hardware support that app will require, but we do know that many people have modified and ported Google's camera app to other phones, often making huge improvements over stock camera apps, especially for low- and mid-range devices. There's only one catch: The camera needs to support shooting RAW.

Right now only the main camera on the Note 20 Ultra supports shooting RAW. In fact, Samsung actually removed the ability for the front-facing camera to shoot RAW on some devices in a software update, breaking GCam for a bunch of people.
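For the technically inclined, this is roughly what a third-party app sees when it asks Android's standard Camera2 API what's available. The sketch below is mine, but the calls are the real ones GCam ports depend on; on the Note 20 Ultra, the telephoto never shows up in the camera list at all, and only the main camera advertises the RAW capability.

```kotlin
import android.content.Context
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager
import android.hardware.cam2.CameraMetadata

// Enumerate the cameras a third-party app is allowed to see and check
// which of them advertise RAW (DNG) capture via Camera2.
fun listRawCapableCameras(context: Context): List<String> {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    return manager.cameraIdList.filter { id ->
        val caps = manager.getCameraCharacteristics(id)
            .get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES)
        caps?.contains(CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_RAW) == true
    }
}
```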

Again, I pressed the company and was unable to get a sensible response, so the bottom line is that no, you can't sidestep Samsung's photo processing. You can't even shoot 108 megapixel images in the stock app's Pro mode, which doesn't do multi-frame HDR. Why? Who knows.

Going beyond lip service

It's a shame that Samsung only pays lip service to creators in its marketing. Apple, meanwhile, has turned its Shot on iPhone campaign into a highly successful, multi-year event that people love. Apple doesn't bend over backwards to empower the small number of creatives that use the iPhone as a professional camera; it just makes a high-quality product with good APIs, and the rest happens naturally.

But Samsung can fix a lot of these problems. Here's how it could elevate the Note 20 Ultra camera from the fumble that it is now to the photo and video marvel that it always should have been:

  • Provide RAW Camera2 API access to all three rear cameras and the front camera.
  • Provide some level of control over the aggression of the multi-frame HDR algorithm in Auto mode.
  • Whatever is causing the worm effect on natural textures, chill out on that.
  • Enable 108 megapixel shots in Pro mode (even if that means disabling the option to shoot RAWs).
  • Allow third-party apps to shoot 108 megapixel pictures.
  • Fix the colors in the stock camera app. Right now they're just bad.

That's it, that's literally all Samsung would have to do to make the Note 20 Ultra camera fantastic. Give us the choice, Samsung, and make your 2017 slogan of "Do what you can't" a reality.