Google Pixel’s HDR+ trick makes almost any Android camera better

Essential-vs-Pixel-App-Before-after-796x419.jpg

The Google Pixel is often considered to have the very best camera around thanks to some truly effective software processing. Now you can get that same processing on almost any recent Android flagship.

While you’ve been able to sideload Google’s Camera app for some time, the actual processing that gives the Pixel its beautiful photos has remained exclusive. Called HDR+, the technology not only increases dynamic range, like traditional HDR modes, but also reduces noise and improves colors.

In very oversimplified terms, the camera takes an extremely fast burst of shots, then combines the best bits of each image into the final exposure. I’ve yet to try a camera that matches the Pixel’s consistency in rendering accurate dynamic range, realistic night-time scenes, and overall vibrant colors.
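To see why merging a burst helps with noise, here’s a toy sketch (my own illustration, not Google’s actual pipeline; real HDR+ also aligns frames and handles motion): averaging N noisy frames of the same scene cuts random noise by roughly the square root of N.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "true" scene: a simple 64x64 gradient image.
scene = np.linspace(0.0, 1.0, 64 * 64).reshape(64, 64)

# Simulate a fast burst of noisy frames of the same scene.
num_frames = 8
burst = [scene + rng.normal(0.0, 0.1, scene.shape) for _ in range(num_frames)]

# Merge by averaging the (already aligned) frames.
merged = np.mean(burst, axis=0)

# Averaging N frames reduces random noise by roughly sqrt(N).
noise_single = np.std(burst[0] - scene)
noise_merged = np.std(merged - scene)
print(f"single-frame noise: {noise_single:.3f}")
print(f"merged noise:       {noise_merged:.3f}")
```

With eight frames, the merged result’s noise comes out close to a third of a single frame’s, which is part of why HDR+ shots stay clean even in dim light.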

EOJ2ol1l.jpg

A scene shot with the stock Essential camera on the left, and again with Google’s app on the same phone.

As first reported by XDA, an enterprising developer named B-S-G has ported the Pixel’s camera software to work on virtually any phone running a Snapdragon 820, 821, or 835. That means the vast majority of flagship devices released in the US since 2016.

Keep in mind you’ll have to enable apps from unknown sources in order to install the APK. While XDA says it has tested the file to make sure it’s safe, you’re still installing it at your own risk.

LG-V30-vs-Google-nighttime.jpg

LG V30 Stock vs Google Camera. Notice the more detailed shadows and more realistic colors.

On the phones I tested, the improvements ranged from minor to shocking, and, more importantly, Google’s camera was rarely worse than the stock app. I spent most of my time testing with Samsung’s S8, a pre-production LG V30, and the Essential Phone, and also briefly tried HTC’s U11, which has the same sensor as the Pixel but adds optical image stabilization and a faster aperture. The U11’s better hardware puts it on par with the Pixel in most cases, but using Google’s own Camera app makes it even better.

5QbunKwl.jpg

HTC U11 vs Google Camera app. The Google Camera image maintains the blue sky, and is punchier overall, even if the shadows are a little more crushed.

One of the most obvious signs a photo was taken with a bad camera is a blown-out sky. Google’s processing avoided this better than any other camera app, even though it isn’t optimized for any of these sensors. While your mileage may vary – perhaps Google’s processing isn’t quite to your taste – skies were consistently blue, clouds had the texture they’re supposed to, night-time scenes had accurate colors, and noise reduction was tastefully balanced with detail.

Here’s a gallery along with some descriptions for context. In every image, the Google Camera app is on the right:

QKhLgK8l.jpg

LG V30 Stock Camera vs Google Camera, both with HDR on.

HTC-U11-vs-Pixel.jpg

HTC U11 default vs with Google Camera app. The Google Camera image maintains the blue sky, and is punchier overall, even if the shadows are a little more crushed.

Essential-vs-Pixel-App.jpg

Essential Phone Stock Camera vs Google Camera, both with HDR on. The latter is more vibrant and realistic, with much better highlight retention.

mWuZOgWl.jpg

LG V30 Stock Camera vs Google Camera, with HDR Auto on both. Again, Google’s software is amazing at recovering highlights while maintaining shadow detail.

LG-V30-vs-Google-nighttime.jpg

LG V30 Stock Camera vs Google Camera. Note the much more vibrant (and accurate) colors using Google's camera.

uR2JjT2l.jpg

Samsung S8 Stock Camera vs Google Camera, both with HDR on.

1Qnwyc5l.jpg

Samsung S8 Stock Camera vs Google Camera, both on Auto settings. Similar results, but the sky is a bit punchier and the building colors are a more accurate brown. However, there is some banding and graininess in the Google camera shot, which Samsung’s software seems to eliminate.

VK4tSQUl.jpg

Samsung S8 Stock Camera vs Google Camera with HDR+, both on Auto settings. Again, there is some banding in the Google camera shot, which Samsung’s software seems to eliminate.

There are a few caveats. It’s tweaked software, so you may encounter some bugs. You’ll have to switch back to your manufacturer’s camera app for any custom features, and HDR+ only works on the main rear camera; you’re out of luck if you want to use the wide-angle camera on the V30 or the telephoto on the OnePlus 5, for instance. And it’s not always better: Google’s Camera isn’t optimized for non-Google devices, so there are occasionally strange artifacts, like the extra graininess and flare on the S8.

Still, it’s a general improvement across the board, and I can only hope Google opens up its processing to other devices. If you’ve been unhappy with your cell phone camera, it’s at least worth a shot.


