Initial release: October 4, 2017
Development status: Active in Google Photos; integrated into Google Assistant on Pixel devices.
Google Lens is an image recognition app announced by Google during Google I/O 2017, designed to bring up relevant information using visual analysis.
When the user directs the phone’s camera at an object, Google Lens attempts to identify it and shows relevant search results and information. For example, when pointed at a Wi-Fi label containing a network name and password, it can automatically connect the device to the scanned network. Lens is also integrated with the Google Photos and Google Assistant apps.

The service is similar to Google Goggles, an earlier app with comparable functionality but fewer capabilities. Lens uses more advanced deep learning routines, similar to apps such as Bixby Vision (integrated into newer Samsung smartphones) and Image Analysis Toolset (available on Google Play); artificial neural networks are used to detect and identify objects and landmarks, and to improve optical character recognition (OCR) accuracy.
Google officially launched Google Lens on October 4, 2017, with app previews pre-installed on the Google Pixel 2. In November 2017, the feature began rolling out to Google Assistant on Pixel and Pixel 2 phones. A preview of Lens was also added to the Google Photos app on Pixel phones.
On March 5, 2018, Google officially released Google Lens within Google Photos on non-Pixel phones. Support for Lens in the iOS version of Google Photos followed on March 15, 2018.