Google Lens Is A Super Enhanced Google Goggles — Remember That?


Google is bringing back the idea behind Google Goggles, except this time it uses far more advanced image recognition technology, and it's called Google Lens.

CEO Sundar Pichai announced the launch of Google Lens at the I/O developer conference on Wednesday and demoed the feature, which will live in Google Assistant, Google's answer to Siri and Alexa. Google Lens uses the company's computer vision and AI technology to help users learn more about their environment in real time.

Google Lens is reminiscent of Google Goggles, an app that let you run a Google search by taking photos of objects. You could take a photo of a hot dog and the app would tell you whether it was a hot dog or not; it was the original "Not Hotdog" from HBO's "Silicon Valley." The feature is also similar to the company's Word Lens, which translates signs in foreign languages as if by magic.

In the demo of Google Lens, a user points their phone camera at a plant and Google identifies the species. Google Lens will also pull up the name, rating, and listing information for restaurants and shops when users point their camera at a storefront. But the coolest moment in the demo came when the user snapped a photo of the sticker on a router and the phone instantly connected to the Wi-Fi network.

https://twitter.com/Google/status/864891667723300864
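Google hasn't said what powers Lens under the hood, but as a rough illustration of the kind of image labeling shown in the plant demo, here is a short sketch using Google's public Cloud Vision API. This is not Lens itself; the file name and the standard service-account credentials setup are assumptions for the example, not details from the keynote.

```python
# Illustrative sketch only: uses Google's public Cloud Vision API,
# not Google Lens, to label the contents of a photo.
# Assumes `pip install google-cloud-vision` and GOOGLE_APPLICATION_CREDENTIALS
# pointing at a service-account key (hypothetical setup, not from the article).
from google.cloud import vision


def label_photo(path: str) -> None:
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    # Ask the API for labels describing what it sees in the image.
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(f"{label.description}: {label.score:.2f}")


if __name__ == "__main__":
    label_photo("plant.jpg")  # hypothetical file name
```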

Google Lens will first become available as a Google Assistant feature, but since the Assistant app is now available on iOS, anyone can use the smart camera regardless of their phone model.

The feature works much like other augmented reality apps such as Snapchat, Instagram, and Pokémon Go: you just point, tap, and go. Pichai didn't say exactly when the feature will roll out, but it will eventually make its way into Google Photos and other Google products.
