Camera hardware still matters when it comes to capturing the best smartphone photos

“Computational photography” has been the buzzword for smartphone cameras over the past few years, and increasingly, I hear peers in tech media say camera software is flat-out more important than hardware in mobile photography.

While I’m not here to dismiss the importance of software in mobile photography or even argue that software is less important, I am about to make the case that camera hardware still matters greatly in separating the truly great cameras from the merely very good.

Who started the trend anyway?

The phrase computational photography predates the smartphone era — it refers to capturing images digitally instead of on film — but it only became a buzz phrase in the past five years, around the time the Google Pixel started grabbing headlines for its camera prowess. Using techniques like multi-frame image stacking (which Google dubbed HDR+) and real-time machine learning, early Pixel phones were able to capture photos with jaw-dropping dynamic range, practically generate light in scenes too dark for the human eye to see, and render artificial bokeh more convincingly than their peers.

Google’s photo samples pitting the Pixel 3’s night photography (right) against, supposedly, the iPhone XS (left)

However, we must not mistakenly think that Google invented computational photography on smartphones. Every smartphone camera ever made has used some form of computational photography by definition, because it produces digital images. In fact, back in 2015, XDA’s then-editor-in-chief had already pondered whether camera hardware or software was more important — at the time, the debate was Samsung’s software approach vs LG’s hardware pursuit.

Google, of course, would push that discussion into the mainstream a year later, and spearhead the argument that software was more important. And for a few years, that was probably true, as the first three Pixels were by consensus the best camera phones on the market.

Rise of machine learning, and a difference in philosophy

Although the first few Pixel phones didn’t sell many units relative to the iPhones and Samsung Galaxies of the world, the unanimous praise for the Pixel cameras caught everyone’s attention, and over the next few years, every brand began devoting more launch-event time to touting its computational photography prowess. It’s interesting to go back and rewatch the iPhone 6 and 6S launch events from 2014 and 2015 and see Apple exec Phil Schiller give just a few seconds of stage time to those phones’ software image processing. The time spent on the iPhone’s image processing ramped up significantly by 2016 and 2017, and at 2019’s iPhone 11 launch, Schiller spent over five minutes waxing poetic about Apple’s “Deep Fusion” computational photography technology.

Phil Schiller on stage during the iPhone 11 launch event.

While Samsung and Chinese brands like Huawei, Xiaomi, and OnePlus also worked on, and marketed, their image processing prowess, these brands were generally more eager to chase new hardware than Apple or Google. Whether this was pure coincidence or a statement about the differences between eastern and western tech cultures is a debate for another day. But toward the end of the last decade, a narrative emerged: Asian brands chase flashy hardware like more cameras and more pixels, while western brands (Apple and Google) focus on optimizing the software experience and using machine learning to overcome any shortcomings in optics.

This divide in philosophy was perhaps most noticeable between 2018 and 2019, when Asian phone brands were introducing third and even fourth rear cameras and engaging in a megapixel arms race, while Google famously stuck with a single camera on 2018’s Pixel 3. Both Google and Apple also stuck with 12MP cameras well into this decade, while Asian phone brands were flexing 48MP, 64MP, and even 108MP cameras.

Samsung Galaxy S20 Ultra
The Samsung Galaxy S20 Ultra was one of the first phones to use a 108MP main camera and a Periscope zoom lens. 

The Pixel fell behind — until hardware grabbed the throne back

But here’s the thing: while the Pixel 1 and 2 were the undisputed best camera phones on the market, by the time the Pixel 3 and 4 launched in late 2018 and 2019, there was at least a solid debate over whether Asian rivals like the Huawei Mate 20 Pro and P30 Pro had the better camera. This was when Huawei was pursuing optics with more pixels and larger sensors, along with launching the world’s first Periscope zoom lens (Oppo had teased the technology earlier, but Huawei beat it to mass production).

The Huawei P30 Pro was in many tech reviewers’ opinions the best camera phone at the time of release, based on the strength of its large image sensor and Periscope zoom. 

By the time the Pixel 5 rolled around in 2020 (using the same main camera hardware as the Pixel 3 and 4), it became clear that such outdated camera hardware was too much for even Google’s almighty software to overcome. Compared to the top 2020 flagship phones from Samsung, Huawei, and Xiaomi, the Pixel 5’s photos were noisier in low light, less detailed when pixel peeping at 100% scale, and couldn’t zoom in nearly as far.

This means that, for all the wonders computational photography can do, you ultimately still need relevant camera hardware. Google would indeed bounce back and regain the camera throne in 2021 with the Pixel 6 series, and some of that had to do with the custom Tensor ISP. But perhaps the more important factor was Google significantly upgrading its camera hardware: the Pixel 6 phones adopted the 50MP GN1 sensor that had been used to great effect by Vivo, and the Pro model implemented the Periscope zoom technology pioneered by Oppo and Huawei.

Apple, like Google, also significantly upgraded the top-tier iPhones’ camera hardware in the last two years, including moving to a 48MP main camera.

You can keep pushing for hardware while not neglecting software

Xiaomi 12S Ultra

Asian brands aren’t letting up in pushing the envelope with camera hardware. Last summer, Xiaomi and Sony introduced a so-called “1-inch” camera sensor named the IMX989. While the sensor does not actually measure 1 inch (the naming comes from old camera conventions), it is still 2.7x larger than the sensor in the iPhone 13 Pro, which was the newest iPhone available for comparison at the time of the event.


A larger image sensor can take in more light and produce a more realistic depth of field. The first phone with the IMX989, the Xiaomi 12S Ultra, could produce images that were clearly more detailed, with stronger dynamic range, than any other phone on the market. I had the privilege of testing virtually every flagship phone released in 2022, and I named the 12S Ultra the best camera phone of 2022 on the strength of that 1-inch sensor. I’m not alone in thinking this: several tech reviewers known for their camera expertise, such as Bloomberg’s Vlad Savov, as well as camera-centric outlets like PetaPixel and Digital Camera World, hailed it as the best camera phone they’ve used too.
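The depth-of-field benefit of a bigger sensor follows from simple crop-factor arithmetic: at the same f-number, a lens on a larger sensor behaves like a faster lens would on a smaller one. A minimal sketch of that math (the sensor dimensions and the f/1.9 aperture below are illustrative assumptions, not official device specs):

```python
import math

# Crop-factor sketch: "equivalent" full-frame aperture is the lens f-number
# multiplied by the crop factor, so a larger sensor (smaller crop factor)
# yields shallower depth of field at the same f-number.
# Dimensions are illustrative assumptions, not official specs.
full_frame = (36.0, 24.0)        # mm, 35mm full-frame reference
one_inch_type = (13.2, 8.8)      # mm, typical "1-inch type" sensor

def diagonal(width_mm, height_mm):
    """Sensor diagonal in mm."""
    return math.hypot(width_mm, height_mm)

crop = diagonal(*full_frame) / diagonal(*one_inch_type)
print(f"crop factor ~ {crop:.1f}")                                  # ~2.7
print(f"f/1.9 behaves like f/{1.9 * crop:.1f} in full-frame terms")  # ~f/5.2
```

The smaller the sensor, the larger the crop factor, and the deeper (less "camera-like") the depth of field at a given f-number — which is why phones with small sensors lean on software bokeh instead.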

The image sensor isn’t the only bit of hardware that matters. A superior set of lenses can also improve images — which shouldn’t be a surprise. Vivo’s last few flagship phones, for example, use lenses with Zeiss’ T* coating, which noticeably reduces lens flare and harsh highlights compared to shots captured through other lenses.

The same scene, captured by the iPhone 13 Pro (left) and Vivo X80 Pro (right). 

Samsung and semiconductor company OmniVision, meanwhile, have each introduced a 200MP image sensor capable of an insane 16-in-1 pixel binning. Reliable rumors say the upcoming Galaxy S23 Ultra will use such a pixel-dense sensor.

It would be inaccurate, however, to say Asian brands are merely blindly chasing specs. Sure, they are still pushing hardware boundaries, but the likes of Samsung, Xiaomi, Oppo, and Vivo have also invested millions in computational photography via their own custom-built ISPs. That 200MP sensor, for example, will require a lot of computational photography to render a 12.5MP image in real time. From Oppo’s MariSilicon X to Vivo’s V chips, these brands aren’t just chasing hardware for the sake of eye-popping specs; they’re building excellent software too.
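The 16-in-1 binning arithmetic itself is straightforward: each 4×4 block of photosites is combined into one output pixel, so 200MP ÷ 16 = 12.5MP. A simplified grayscale sketch of that reduction (real sensors bin same-color photosites in hardware, before demosaicing):

```python
import numpy as np

def bin_16_in_1(raw):
    """Combine each 4x4 block of photosites into one pixel by averaging.
    Simplified grayscale model; real sensors bin same-color photosites."""
    h, w = raw.shape
    assert h % 4 == 0 and w % 4 == 0, "dimensions must be multiples of 4"
    # Reshape so each 4x4 block becomes its own pair of axes, then average.
    return raw.reshape(h // 4, 4, w // 4, 4).mean(axis=(1, 3))

# A 200MP-class frame is tens of millions of photosites; use a tiny
# stand-in so the demo runs instantly.
raw = np.arange(16 * 16, dtype=float).reshape(16, 16)
binned = bin_16_in_1(raw)
print(raw.size, "->", binned.size)   # 256 -> 16, a 16-in-1 reduction
```

The payoff is larger effective photosites in low light while retaining the option to shoot at full resolution in bright scenes — which is exactly the kind of per-frame work the custom ISPs mentioned above are built to do in real time.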

Perhaps in another few years, the importance of software will take the lead again, as the physical limitations of the smartphone body will ultimately cap how much bigger camera components can get. As of right now in 2023, though, camera hardware is still every bit as important as software.

Google Pixel 7 Pro


The Pixel 7 Pro is Google’s best phone ever, with a refined, premium design and Google’s second-generation silicon — plus awesome cameras, as usual. 


Apple iPhone 14 Pro Max

The iPhone 14 Pro Max is Apple’s biggest and best smartphone, and in typical Apple fashion, it is both a powerhouse and an endurance beast.
