Google’s Pixel 2 and Pixel 2 XL might have had teething issues when it comes to their displays, but one area where they didn’t fail to impress was the camera. Following in the footsteps of the original Pixel duo that launched a year ago, the new 12.2-megapixel sensors in Google’s latest smartphones are already a treat to use, but their full potential hasn’t been exploited yet, as many promised features are still waiting to be enabled via future software updates.
Gadgets 360 had a Hangouts session with Brian Rakowski, VP of Product Management at Google, and Timothy Knight, who leads camera development for the Pixel 2, to talk specifically about the camera and what makes it tick. While it does a great job overall, it’s far from perfect, despite the record-high DxOMark rating. We’re all aware of some of the more publicised issues with the new Pixels, such as audio glitches when recording video and over Bluetooth, and odd screen flashes, but we’ve had some issues with the camera too, which we hoped to get some clarity on from the Google duo, no pun intended.
The Pixel 2 does a great job of stabilising video, but in low light, especially at 4K, the footage tends to get quite noisy. This is mainly because the Pixel 2 tries to brighten the scene as much as possible by boosting the ISO, which does give you a brighter image, but at the cost of noise. This is intentional, Knight explains.
“That is a tradeoff we think a lot about. We tried to strike a balance of the two,” he says. “If you compare the Pixel 2 camera to other mobile cameras, you’ll see that we’re brighter. It’s easy to make the noise go away if you just make the image dark. We decided that we’d rather let the user see the scene more clearly, by making it brighter, even if that means there is some more noise.” Knight adds that 1080p video should be a bit less noisy than 4K, since there’s more headroom for heavyweight processing.
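Knight’s point about brightness versus noise comes down to sensor gain: raising the ISO multiplies the signal and the noise by the same factor. The toy model below is purely illustrative (the `exposed_value` function and its numbers are our own assumptions, not Google’s pipeline), but it shows why a brighter exposure isn’t automatically a cleaner one:

```python
def exposed_value(scene, gain, read_noise=2.0):
    """Toy sensor model: raising gain (ISO) brightens the output,
    but the sensor's noise is amplified by the same factor."""
    signal = scene * gain
    noise = read_noise * gain  # noise rides along with the gain
    return signal, noise

# Same scene, two ISO choices (all values are made up for illustration):
bright, bright_noise = exposed_value(scene=10, gain=8)  # high ISO
dark, dark_noise = exposed_value(scene=10, gain=2)      # low ISO

# High ISO gives a brighter image (80 vs 20) but proportionally more
# noise (16 vs 4); the signal-to-noise ratio is 5.0 either way, so the
# extra brightness is bought with visible grain.
```

In this simplified view, Google’s choice is just which end of that tradeoff to expose to the user: a darker, cleaner-looking frame, or a brighter one where the amplified noise shows.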
Another feature missing from the Pixel 2 is 60fps support at 4K, something that the iPhone 8 Plus and iPhone X boast of. “4K at 60[fps], unfortunately, is not something we’re going to bring to Pixel 2,” says Knight. “For future products, we’ll consider it certainly. But for Pixel 2, 4K 30 and 1080 60 is the video we plan to support.” This limitation seems to have more to do with Qualcomm’s Snapdragon 835 chipset than anything else, though.
If you’ve looked in the settings of the Pixel 2’s camera app, you’ll have noticed that enabling manual control for HDR+ gives you a second option in the viewfinder, called HDR+ enhanced. When we tested the Pixel 2 and the Pixel 2 XL, we didn’t notice any quality difference between the two modes, other than the fact that an HDR+ enhanced photo takes longer to process. Turns out, we were right.
“In the large majority of cases, there’s no difference. From a user perspective, HDR+ and HDR+ enhanced will take the same photograph,” explains Knight. “In a small number of conditions, HDR+ enhanced can take a photograph that has a little more dynamic range.” The reason the enhanced mode takes longer to process is that standard HDR+ mode uses Zero Shutter Lag (ZSL), whereas the enhanced mode turns it off. Shutter lag is the time taken from the moment you press the shutter button to when the picture is actually captured and saved. With ZSL, the camera continuously buffers frames even before you press the button, so it can deliver a shot from the very moment of the press, with almost zero delay.
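The ZSL behaviour described above can be sketched as a small ring buffer of recently captured frames. This is a simplified illustration, not the Pixel’s actual camera stack; the `ZslBuffer` class and its method names are invented for the example:

```python
from collections import deque

class ZslBuffer:
    """Toy model of Zero Shutter Lag: the camera keeps capturing frames
    into a ring buffer, so a shutter press can return a frame taken at
    (or just before) the moment of the press, rather than starting a
    capture after it."""

    def __init__(self, size=8):
        self.frames = deque(maxlen=size)  # oldest frames fall off the back

    def on_new_frame(self, frame, timestamp):
        self.frames.append((timestamp, frame))

    def shutter_pressed(self, press_time):
        # Return the buffered frame closest to the press: perceived lag ~0.
        return min(self.frames, key=lambda f: abs(f[0] - press_time))[1]

# The camera streams frames continuously...
buf = ZslBuffer()
for t in range(8):
    buf.on_new_frame(f"frame{t}", t)

# ...so a press at t=5.2 is served from a frame already captured.
shot = buf.shutter_pressed(5.2)  # "frame5"
```

With ZSL off, as in HDR+ enhanced, nothing is pre-buffered for the final merge, so the capture and the heavier processing only begin once you press the button, which is why the photo takes longer to land in your gallery.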
We initially assumed that the Pixel 2’s Visual Core imaging chip would help speed this process up once it’s activated in the Android 8.1 update, but that doesn’t seem to be the case. The Visual Core’s primary purpose will be to let third-party camera apps use the HDR+ feature. “When third parties use the camera API, they’ll be able to get the high-quality processed pictures as a result,” says Rakowski.
Finally, the lack of manual controls and RAW file support is another bummer in the new camera app. This is an area that other Android manufacturers like Samsung and HTC have really mastered over the years. Not everyone needs manual controls, but it’s nice to have the option, especially when you want to take artistic shots, and it’s very useful in low light. Having this feature would also help control the exposure in video, for those who prefer to capture the scene as it is instead of brightening things up. However, Knight isn’t convinced that simply putting in sliders for ISO, aperture, and so on is the best interface for a phone. He further states that in doing so, users would not be able to take advantage of HDR+, so image quality would suffer.
Google might add some level of manual control in the future, “but at the moment, don’t expect to see a manual slider anytime soon,” says Knight. It seems that Google is relying heavily on its machine learning to enhance photos and make them look as good as they do, which would explain why it isn’t willing to relinquish control to the user. This applies to RAW file support too.
“We’ve received similar feedback from other users too [about RAW support]. We don’t have any updates today but we’re looking into it,” says Knight.