The combination of updated camera hardware and excellent software should give the Google Pixel 6 series yet another leap in imaging.
Google’s Pixel 6 camera is finally getting competitive, and I couldn’t be more excited about it.
I firmly believe that cameras are the backbone of smartphone upgrade cycles for most users. With even mid-range hardware now fast enough for everyday use, benchmark performance is no longer a critical purchase factor. Images, on the other hand, offer the most visible improvement year after year. Since the introduction of the first Pixel, Google has put a laser focus on photography.
Ironically, while the popularity of its smartphones relies heavily on imaging capabilities, Google’s camera hardware development has been surprisingly slow.
Did you know the Pixel series has used the same primary camera sensor since the Pixel 3 launched in 2018? Even that sensor wasn’t much different from the Pixel 2’s. Or take the Pixel 5, which finally added an ultra-wide camera but skipped table stakes like a telephoto sensor. Instead, Google insisted on its software-based Super Res Zoom technology, which worked reasonably well but couldn’t compare to true optical zoom. The year before, the company went the other way: the Pixel 4 got a telephoto lens but no ultra-wide sensor, a perspective that simply cannot be replicated in software.
Google’s imaging strategy, and its smartphone strategy in general, was diametrically opposed to what almost every other OEM in the Android space is pushing: specifications. The Pixel’s aging camera sensor reflects an engineering-first mindset rather than a consumer-product one, especially when even Apple has chosen to adopt better hardware instead of trying to reinvent the wheel in software.
Why Google has fallen behind the camera curve
Let’s start with the obvious: Google has clearly pushed the IMX363 sensor to its limits. Our own tests have shown how far the Pixel 5 falls short of the competition. From HDR noise to zoom performance to the lackluster ultra-wide camera, there are some things that even great software can’t fix.
Former Pixel camera lead Marc Levoy could be blamed for this reluctance to change. In an interview around the launch of the Pixel 4, Levoy said he wasn’t convinced that pixel binning, and the resulting boost in signal-to-noise ratio from a high-resolution sensor, led to a noticeable improvement in images. That may have been true in 2019, but the multitude of phones that have since used these sensors to great effect has proven him wrong.
While few have matched the capabilities of Google’s software, sensor improvements have allowed its competitors to overcome many hardware limitations. Huawei pioneered the use of RYYB sensors that enable impressive low-light capabilities, while Sony draws on its camera division’s expertise to enhance its color science. Others, like OnePlus, have chosen to partner with traditional camera makers like Hasselblad to up their game.
Elsewhere, the BBK group has invested heavily in imaging, and phones like the Oppo Find X3 carry a variety of camera sensors to cover every possible use case. Xiaomi has also jumped into the ring, and the Mi 11 Ultra is one of the best-equipped camera flagships, thanks not only to its hardware but also to its excellent camera tuning.
Where Google once led by a country mile, it is now at best on par with the competition, and behind in more ways than one.
A new sensor gives Google’s software the hardware it needs to shine
If Google’s thought process behind the Pixel series has shown us one thing, it is that the company isn’t interested in racing the competition spec for spec. It would rather take big leaps forward and perfect its hardware over time. With Levoy no longer at the helm, it seems Google has recognized the flaw in its earlier thinking.
An upgraded camera sensor is exactly what the Pixel 6 series needs to up its game. The software has maxed out the current hardware, and we already know that Google’s imaging algorithms shine when given high-end hardware to work with.
Ports of Google’s camera app already exist for phones with newer-generation sensors, and the results are instructive. With an updated sensor, Google’s already excellent software can finally capitalize on years of hardware advances. Add the AI and machine-learning capabilities of the upcoming Tensor chipset, and those gains should only compound. But this goes well beyond a tangible, if expected, improvement in image quality.