Google proved its mastery of image-processing software with its very first Pixel smartphones. The Pixel 2 and Pixel 2 XL have had some serious hardware issues, but when it comes to the camera, one can't really complain. While everyone else needed a second camera sensor to create DSLR-like blur effects, Google managed to do it with a single sensor, once again proving the might of its image-processing software. The results are often better than those from smartphones with dual sensors. And if you thought Google was done until the next Pixel phones, think again. The Pixel 2 and its bigger sibling, the Pixel 2 XL, both include a custom imaging-focused SoC. Google calls it the Pixel Visual Core.
No, the Visual Core has nothing to do with anything the Pixel 2 or Pixel 2 XL can do right now. This custom-built eight-core processing unit, capable of three trillion operations per second, is currently dormant. Google hasn't talked much about it, and maybe you've never heard of it either. It is only coming into play now that the second Android 8.1 Developer Preview has been released, enabling parts of the Pixel Visual Core. If you've used any of the four Pixel smartphones, you must have noticed a difference in camera quality when using third-party apps. If you wanted that perfect HDR+ shot, you couldn't take it directly from the Instagram app. That's because HDR+ is a Google Camera feature.
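As Google's research blog explains, HDR+ captures a burst of frames and merges them into one photo. The toy NumPy sketch below illustrates only the basic noise-averaging idea behind burst merging, not Google's actual pipeline; the scene data and all names here are made up for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical "scene": ground-truth brightness values in [0, 1].
scene = np.linspace(0.05, 0.95, 1000)

def noisy_frame(scene, noise=0.1):
    """Simulate one capture: the scene plus random sensor noise."""
    return scene + rng.normal(0.0, noise, scene.shape)

# A burst of frames, loosely analogous to what HDR+ captures.
burst = [noisy_frame(scene) for _ in range(8)]

single = burst[0]
merged = np.mean(burst, axis=0)  # naive merge: average the aligned burst

# RMS error against the true scene; averaging N frames cuts the
# noise by roughly a factor of sqrt(N).
rms = lambda img: np.sqrt(np.mean((img - scene) ** 2))
print(f"single-frame RMS noise: {rms(single):.4f}")
print(f"merged-burst RMS noise: {rms(merged):.4f}")
```

The real pipeline also has to align frames, reject motion, and tone-map the result, which is where the Visual Core's horsepower comes in.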
Image source: Google research blog
Android 8.1 Developer Preview 2 for the Pixel 2 and Pixel 2 XL enables HAL HDR+ for third-party apps using the Pixel Visual Core. You won't notice any difference in image quality when using the default camera, which is the Google Camera app. But it is a win-win for third-party app developers and their users. Developers don't have to make any extra effort to enable it: any app that plugs into the standard Android Camera API will have its photos processed by the Pixel Visual Core, giving them the HDR+ treatment much the same way the Google Camera app does. So with the Pixel Visual Core enabled, you can take your perfect HDR+-powered Snapchat pictures and show them off to your friends.
For now, this is the only noticeable difference brought by the Pixel Visual Core, and probably the only purpose of enabling it at this point. You may not notice a huge difference in quality right away, and that's fine.
This is only a beta update and isn't utilizing the Visual Core to its full potential yet. Google is a software company first, so it may not surprise you that the chip also has a machine learning component. This is Google's first custom mobile SoC, and it leans on software where other companies lean on hardware: the software controls many more hardware details than it would on a typical smartphone. That makes the software more complex, hence the need for machine learning.
We aren't experts here, but given Google's track record, there's a good chance the Visual Core gets better with time.
Enable Pixel Visual Core on Pixel 2 and Pixel 2 XL
Provided that you are running the latest Android 8.1 beta on your Pixel 2 or Pixel 2 XL, you'll find the option to enable the Pixel Visual Core in Developer options. Go to Settings and scroll all the way down to About phone. Tap on it, then tap Build number seven times. You'll also need to confirm your screen lock if you have one. A toast message will appear at the bottom when Developer options have been enabled.
Go back to the main Settings page and you’ll find Developer options right above System upgrade. Scroll down and under the Debugging subsection, tap the toggle marked Camera HAL HDR+. Now all that is left to do is reboot your phone.
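If you're comfortable with adb, some of the steps above can be done from a computer instead. The sketch below assumes USB debugging is already on and the phone is connected; the Camera HAL HDR+ toggle itself has no documented settings key that we know of, so that switch still has to be flipped in the Settings UI.

```shell
# Confirm the phone reports the 8.1 beta.
adb shell getprop ro.build.version.release

# Make the Developer options entry visible in Settings
# (same effect as the seven taps on Build number).
adb shell settings put global development_settings_enabled 1

# After flipping Camera HAL HDR+ in Settings, reboot from the shell.
adb reboot
```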