The Google Pixel 2, a remarkable achievement in phones, was announced last month. Google Lens, announced at Google I/O 2017, lets you learn about an object simply by pointing your camera at it. Google officially launched Google Lens on October 4, 2017, with a preview pre-installed on the Google Pixel 2. Google Lens uses advanced deep learning routines to identify the objects around us.
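At their core, "deep learning routines" like these are classifiers: a network turns an image into a feature vector and maps it to probabilities over labels. Here is a purely illustrative Python sketch of that final step, with made-up features, weights, and labels; nothing here reflects Google's actual models:

```python
import math
import random

random.seed(0)

# Hypothetical labels a lens-style classifier might choose between.
LABELS = ["flower", "landmark", "book", "barcode"]

# Stand-in for features extracted from an image by a network backbone.
features = [random.gauss(0, 1) for _ in range(8)]

# A single dense layer with random stand-in weights.
W = [[random.gauss(0, 1) for _ in LABELS] for _ in range(8)]

def softmax(z):
    """Turn raw scores into probabilities that sum to 1."""
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    return [v / total for v in exps]

# Forward pass: logits = features . W, then softmax.
logits = [sum(features[i] * W[i][j] for i in range(8))
          for j in range(len(LABELS))]
probs = softmax(logits)
prediction = LABELS[probs.index(max(probs))]
print(prediction, [round(p, 3) for p in probs])
```

Real systems use deep convolutional networks with millions of learned weights, but the shape of the computation is the same: features in, label probabilities out.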
Google Assistant has been a revolution in artificial intelligence. With the plethora of tasks it can accomplish and the way it carries on a conversation, it has been a great assistant. The combination of Google Assistant and Google Lens can be a quick help with the things we see around us. In real time, you can learn about the things around you by tapping the Google Lens icon and pointing at what you're interested in.
Cool features from the Assistant and Google Lens that can help you:
Saving information from business cards, so you can follow URLs, call phone numbers, and navigate to addresses.
Exploring a new city like a pro, with the Assistant helping you recognize landmarks and learn their history.
Learning more about a movie, from the trailer to reviews, or about a book, seeing its rating and synopsis.
Scanning QR codes and product barcodes is quicker with the Assistant.
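Barcode scanning, for example, is more than reading stripes: after decoding the digits, a scanner verifies them. A small sketch of one such step, the EAN-13 check-digit validation that product barcodes use (the sample number below is a published ISBN-13 example, not anything from Google Lens):

```python
# Sketch of EAN-13 check-digit validation, one step a barcode
# scanner performs after reading the 13 digits.

def ean13_check_digit(digits12: str) -> int:
    """Compute the check digit for the first 12 digits of an EAN-13 code.

    Digits are weighted 1, 3, 1, 3, ... left to right; the check digit
    brings the weighted sum up to a multiple of 10.
    """
    total = sum(int(d) * (1 if i % 2 == 0 else 3)
                for i, d in enumerate(digits12))
    return (10 - total % 10) % 10

def is_valid_ean13(code: str) -> bool:
    """Return True if `code` is 13 digits with a correct check digit."""
    return (len(code) == 13 and code.isdigit()
            and ean13_check_digit(code[:12]) == int(code[12]))

print(is_valid_ean13("9780306406157"))  # valid ISBN-13/EAN-13 -> True
print(is_valid_ean13("9780306406158"))  # corrupted last digit -> False
```

If the check fails, the scanner knows the read was garbled and can simply try again from the next camera frame.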
Google Lens in the Assistant will be rolling out to all Pixel phones set to English in the US, UK, Australia, Canada, India, and Singapore over the coming weeks. Once you get the update, open the Google Assistant on your phone and tap the Google Lens icon in the bottom-right corner. Google Lens is also integrated into Google Photos, so even after you take a picture, you can continue to explore and get information about the objects in it.