At Google’s annual I/O Developer Conference, the search giant announced an incredible new technology: Google Lens.

With Google Lens, your Google Assistant will be able to have a conversation about what you see… or rather, what your camera sees. And it won't just converse: it will also perform actions that surface useful information, using visual context to keep the interaction flowing and provide even more detail.

During the demo, Google showed off several examples of what happens when you point your camera at different things: instantly identifying a flower, pulling up reviews for a restaurant, or automatically connecting to a Wi-Fi network. Google also demonstrated how Google Lens can translate text in foreign languages.

One final example showed how Google Lens can help with everyday tasks. When the presenter pointed the camera at a concert marquee, Google Assistant not only instantly recognized what it was looking at, but also offered several actions: buying tickets to the show, adding the event to his calendar, playing music from the band, and more, all with just a tap… and without having to open any apps.

And this feature won't be limited to what the camera sees in the moment. Google Lens will also be available as an option in Google Photos, so you can go back and identify the building you took a picture of, look up that piece of art you thought was so beautiful, or even tap a phone number in a screenshot to call it.

While Google didn't announce a release date for Google Lens, it did say, as with most everything else at I/O, that it's coming “soon”. And it will work on both Android and iOS devices.
