Google Pixel 3 Can Automatically Take a Selfie When it Detects You Kissing Someone
Redefining the awesomeness of the selfie.

Even though rivals have significantly raised the bar for smartphone photography, the cameras on the Google Pixel 3 and Pixel 3 XL remain among the better options around. A lot of that is down to the extensive use of artificial intelligence (AI). Now, Google is adding some new selfie-focused features to the camera app on the Pixel 3 and the Pixel 3 XL.

Photobooth, a new shutter-free mode in the Pixel 3 Camera app, should make it easier to click selfies, whether that is a solo shot, a couple's selfie, or a group photo. Open the updated Google Camera app, head to ‘more’ and select the Photobooth mode. Once you press the shutter button, the artificial intelligence working in the background will automatically take a selfie when it detects that the phone is steady and the subjects in the frame have good expressions with their eyes open. Hold still for that duration, particularly if someone in your group is always fidgety before a selfie.
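Conceptually, that shutter-free flow amounts to gating the capture on a few per-frame checks. The sketch below is only an illustration of that kind of logic, not Google's implementation; the Frame fields, the capture callback, and the threshold value are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """Placeholder for one camera frame plus per-face analysis results."""
    is_steady: bool          # hypothetical: motion sensors say the phone is still
    all_eyes_open: bool      # hypothetical: face landmarks report open eyes
    expression_score: float  # hypothetical: 0..1 quality of detected expressions

CAPTURE_THRESHOLD = 0.8      # assumed cut-off, not a documented value

def shutter_free_capture(frames, capture):
    """Take a shot automatically once a frame passes all the gates."""
    for frame in frames:
        if not frame.is_steady:
            continue                       # wait for the phone to settle
        if not frame.all_eyes_open:
            continue                       # skip blinks
        if frame.expression_score < CAPTURE_THRESHOLD:
            continue                       # wait for a good expression
        capture(frame)                     # good moment found: take the selfie
        break
```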

The algorithms are now trained to detect five common expressions: smiles, sticking the tongue out, a kissy or duck face, puffed-out cheeks, and a surprised look.

Photobooth also responds to kisses: kiss someone, and the AI on the Google Pixel 3 and Pixel 3 XL will detect it and immediately take a selfie.

“We worked with photographers to identify five key expressions that should trigger capture: smiles, tongue-out, kissy/duck face, puffy-cheeks, and surprise. We then trained a neural network to classify these expressions,” says Navid Shiee, Senior Software Engineer, Google AI.
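As a rough illustration of what a five-way expression classifier looks like, here is a minimal sketch in PyTorch; the architecture, layer sizes, label names, and 64x64 grayscale input are assumptions made for the example, not details of the network Google actually trained.

```python
import torch
from torch import nn

# Illustrative label set based on the expressions named in the article.
EXPRESSIONS = ["smile", "tongue_out", "kissy_duck_face", "puffy_cheeks", "surprise"]

class ExpressionClassifier(nn.Module):
    """Toy five-class expression classifier over 64x64 grayscale face crops."""
    def __init__(self, num_classes: int = len(EXPRESSIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, num_classes),
        )

    def forward(self, face_crop: torch.Tensor) -> torch.Tensor:
        # Returns unnormalised scores (logits) for each of the five expressions.
        return self.head(self.features(face_crop))

# Usage: class probabilities for a single random face crop.
model = ExpressionClassifier()
probs = torch.softmax(model(torch.randn(1, 1, 64, 64)), dim=1)
```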

A lot of the new features arriving in the Google Camera app with this update have emerged from Google Clips, a camera that relied on artificial intelligence to autonomously detect and capture the best moments around it. While that hardware experiment didn’t really work out for Google, those features are now making their way to the camera app on the newest Pixel phones.

Photobooth relies on a multi-stage detection process when taking selfies. In the first stage, it filters out frames with closed eyes, talking, or motion blur, as well as frames in which it fails to detect any of the trigger expressions or a kiss. In the second stage, the expressions of each subject in the frame are analysed. “Photobooth applies an attention model using the detected expressions to iteratively compute an expression quality representation and weight for each face,” says Google. The final stage takes the selfie once these computations are done, along with a buffer of additional frames that are then compared with the chosen shot to check whether any of them actually scores better across all the parameters.
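Read as pseudocode, that three-stage flow might be sketched as below; the helper callables, the attention-style weighting, and the buffer size are hypothetical stand-ins rather than Google's actual pipeline.

```python
from typing import Callable, List, Sequence, Tuple

def frame_quality(face_scores: Sequence[float]) -> float:
    """Stage 2 (sketch): attention-style weighting of per-face expression
    quality, so faces with stronger expressions count more toward the frame."""
    total = sum(face_scores) or 1.0
    weights = [s / total for s in face_scores]      # normalised per-face weights
    return sum(w * s for w, s in zip(weights, face_scores))

def photobooth_pipeline(
    frames: Sequence,                 # incoming camera frames
    has_defect: Callable,             # closed eyes / talking / motion blur?
    has_trigger: Callable,            # smile, kiss, etc. detected?
    face_scores: Callable,            # per-face expression quality, 0..1
    buffer_size: int = 5,             # assumed buffer size, not documented
):
    # Stage 1: drop frames with defects or no trigger expression at all.
    candidates = [f for f in frames if not has_defect(f) and has_trigger(f)]

    # Stage 2: score each surviving frame from its subjects' expressions.
    scored: List[Tuple[object, float]] = [
        (f, frame_quality(face_scores(f))) for f in candidates
    ]

    # Stage 3: the capture moment plus a small buffer of later frames are
    # compared, and the highest-scoring frame is kept as the final selfie.
    window = scored[: 1 + buffer_size]
    return max(window, key=lambda pair: pair[1])[0] if window else None
```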

The Google Pixel 3 and Pixel 3 XL have dual 8-megapixel front-facing cameras.
