Google announced many things at its Google I/O developer conference in California on Tuesday, from a Google Assistant update that lets it recognize follow-up commands without the "Okay Google" wake phrase to a "politeness feature" that gives users credit for saying "please."
Among these announcements was a demonstration of new features for Google Lens.
Google Lens, first announced in 2017, is a Shazam-like app that can identify almost anything your phone's camera sees, like a visual search engine.
Powered by Google's artificial intelligence technology, Google Lens automatically identifies objects when users open the camera app on their phones. In a demonstration, Google Lens identified an album by the band Justice and the brand of a T-shirt, according to a video published by The Verge.
"You open the camera and you start to see [Google] Lens surface proactively all the information instantly and it even anchors that information to the things that you see," said Google vice president Aparna Chennapragada at Google I/O. "This is an example of how the camera is not just answering questions, but it is putting the answers right where the questions are."
Amid calls to regulate the tech industry to protect users' privacy, Google announced that it will collect more information from users when they use Google Lens. Google said it will store a version of the images users view in their accounts, according to Wired.
The company said the images and audio it stores can be deleted through Google's privacy settings.
In addition to the Google Lens update, Google announced "Duplex," a feature that can make business calls on a user's behalf. The Google Assistant feature uses lifelike audio to, for example, book a haircut, and it even says "umm" and "mm-hmm" to help it pass as a real person.