Google I/O Roundup: Search-by-picture Google Lens & other Google Assistant features announced
The ability to search or translate by picture gives Google Assistant a powerful advantage over other digital assistants.
Today at its Google I/O 2017 developer conference, Google announced Google Lens, a new way to search for information via pictures that is integrated into Google Assistant. Google Assistant is also gaining support for typed queries and is coming to iOS.
Google Home is getting a notifications feature called Proactive Assistance, hands-free calling in the next few months, and Visual Responses, which let answers be cast from the device to phones or TVs.
Google also formally announced Google for Jobs, a job search engine that, as we reported, was first spotted in April.
Below are stories from us with more information on these announcements:
- Google announces Google Lens, an AI-powered visual search tool
- Google Assistant comes to iPhone, adds alerts, hands-free calling & more
- Google confirms “Google for Jobs” job search rolling out in coming weeks
Google also said it is working on a standalone VR device, an all-in-one device that doesn’t require a phone or a connection to a computer to work. You can read more about that from Google here.
There were also announcements relating to Gmail, Google Photos and Android, which Google covers in its blog post here. You can also watch the Google I/O opening keynote where these announcements were made, or read our live blog of the keynote.