
Google Lookout App For Low Vision

I started using the Google Lookout app for low vision and blind users in 2019 while attending the ATIA assistive technology conference. Since then, Google Lookout has added several features to its visual assistance app that help users with visual impairments get information about their surroundings. Here is an overview of the Google Lookout app for low vision and how I use it in a variety of contexts.

What is Google Lookout?

Google Lookout is a free visual assistance app for Android that uses AI and image recognition technology to identify objects and read text through the phone camera for users with visual impairments, inclusive of blind and low vision users. While it is not a replacement for mobility aids like a guide dog or white cane, it can help users get information about objects in their environment and about indoor spaces. Google Lookout requires an internet connection to work and can be used over wifi or mobile data.

Google Lookout can be installed on any Android phone running Android 6.0 (Marshmallow) or later. Currently, I have Google Lookout installed on my Google Pixel 5.
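
For readers who want to check this requirement programmatically, Android 6.0 corresponds to API level 23. The short Kotlin sketch below uses the standard Android SDK version constants to test whether a device meets that minimum; the API-level threshold here simply mirrors the Android 6.0 requirement stated above.

```kotlin
import android.os.Build

// Android 6.0 (Marshmallow) corresponds to API level 23 (Build.VERSION_CODES.M).
// This check simply mirrors the minimum version requirement described above.
fun deviceMeetsLookoutMinimum(): Boolean =
    Build.VERSION.SDK_INT >= Build.VERSION_CODES.M
```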

Google Lookout vs Google Lens

What’s the difference between Google Lens and Google Lookout? Both applications use the device camera, support uploading images from the gallery, and generate visual descriptions of various items. A Google employee explained the difference between the two apps by comparing Google Lens to a friend who provides descriptions of items only when asked, while Google Lookout is more like a friend who is constantly talking and provides descriptions without being prompted. Both apps are useful and can be used with assistive technology like Select-to-speak, TalkBack, and large print.

How to use Google Lookout

Setting up Google Lookout

Users will need to sign in with a free Google account before using Lookout for the first time. If users have multiple Google accounts on one device, they will be prompted to choose the account they wish to use.

Google Lookout interface

The default view for Lookout is a live stream from the device’s back camera with Text mode enabled for reading text out loud. The camera stream can be turned on and off by selecting the Camera button. To change modes, users can open the Select Mode menu in the top left corner of the screen and select the option of their choice.

The Lookout app is self-voicing, so users do not need to enable a screen reader or text-to-speech to listen to descriptions.

Users can also view a history of descriptions provided by Lookout in the Recents section of the app, which provides text-based transcriptions that can also be read out loud with the self-voicing feature. These descriptions are deleted when the Lookout app is closed.

Input options for Google Lookout

There are a few different options for uploading/inputting content to be described by Lookout, including:

  • Using the device camera in the Lookout app. Only the back camera is supported (users cannot switch to the front-facing camera), and users will need to hold the device in place to continue receiving descriptions of an object or document
  • Sharing an image from the device gallery by opening the Share menu and choosing Lookout, which generates an image description and transcribes any text in the image (see the sketch after this list)
  • Uploading an image from the device gallery using the Image Mode option within the Lookout app.
  • Scanning single-page documents with Document Mode, which extracts the text from the document and displays it in a simplified reading display. The text can be copied into another app if needed, but it isn’t a scanned copy of the original document.
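
For anyone curious about what the Share option does under the hood, Android apps pass images to one another through the system share sheet. The Kotlin sketch below is a generic ACTION_SEND share of a gallery image; it assumes only that Lookout registers itself as a share target for image content, and the `imageUri` and `context` values are placeholders.

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri

// Generic Android share-sheet call. Lookout shows up in the chooser because it
// registers as a share target for images; nothing here uses a Lookout-specific API.
// `imageUri` is a placeholder for a content:// URI pointing at a gallery photo.
fun shareImageForDescription(context: Context, imageUri: Uri) {
    val sendIntent = Intent(Intent.ACTION_SEND).apply {
        type = "image/*"
        putExtra(Intent.EXTRA_STREAM, imageUri)
        addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION)
    }
    // The user can then pick Lookout (or another visual assistance app) from the chooser.
    context.startActivity(Intent.createChooser(sendIntent, "Describe this image"))
}
```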

Google Lookout languages

Lookout supports over 30 languages, and users can change languages at any time by selecting an option from the Change Language menu pinned at the top of the screen for Text, Document, Explore, and Images modes. However, Lookout does not translate text.

Google Lookout modes for visual assistance

Lookout offers the following modes for visual assistance, which can be accessed by opening the Select Mode menu. A short description of each mode is available by selecting the Help button (which looks like a lowercase i) in the top right section of the screen.

Text

Text mode reads out loud any text that is visible within the view of the device camera, which can include environmental text or short items like cards and flyers. Text can also be viewed in the Recents section, and the text display can be customized by adjusting the font size, text background, line spacing, and line height in the Reading settings menu in the bottom right corner.

Document

Document mode reads text from a photo taken by the user within the Lookout app, though users will need to scan each page individually. Once the text has been identified, users can read it in a simplified display, listen to it read out loud, or export it by selecting the Share icon at the bottom of the screen. Just like with Text mode, the text display can be customized by adjusting the font size, text background, line spacing, and line height in the Reading settings menu in the bottom right corner.

Explore

Explore identifies various objects that are within view of the camera, providing information on their location and estimated distance from the camera. The location of items is given as a clock position, and Lookout provides one or two-word descriptions of items in the environment, e.g. water bottle, keyboard, or jacket.

Currency

Currency identifies various denominations of paper money, including US dollars, euros, and Indian rupees.

Food labels

Food labels identifies products via barcode or from an image of food packaging. It can recognize products from around the world, including North America, South America, Europe, and Australia.

Images

Instead of using the back camera to get information, Images mode allows users to upload images from their gallery to get visual descriptions or identify text.

Other settings options

Users can configure additional details for the Lookout app by opening their Google account within Lookout and selecting the Settings menu. Examples of settings that can be changed include:

  • Automatic flashlight for low-light environments
  • Document scan hints/guidance for scanning images in Document mode
  • Haptic feedback
  • Reading settings for the simplified display, including background color, font, letter spacing, line height, and font size
  • Karaoke highlighting that highlights words as they are spoken
  • Manage modes for hiding modes on the main screen
  • Text-to-speech output settings (illustrated in the sketch after this list)
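
As a rough illustration of what the text-to-speech output settings control at the platform level, the Kotlin sketch below uses Android's standard TextToSpeech API to set the speech rate and pitch before speaking a description. This is a generic example of the platform speech engine that self-voicing apps typically build on, not Lookout's own internal code.

```kotlin
import android.content.Context
import android.speech.tts.TextToSpeech

// Generic Android TextToSpeech usage, shown only to illustrate the kind of output
// settings (speech rate, pitch) that a self-voicing app can expose. Not Lookout's code.
class DescriptionSpeaker(context: Context) : TextToSpeech.OnInitListener {
    private val tts = TextToSpeech(context, this)

    override fun onInit(status: Int) {
        if (status == TextToSpeech.SUCCESS) {
            tts.setSpeechRate(1.2f)  // slightly faster than the default rate
            tts.setPitch(1.0f)       // default pitch
        }
    }

    fun speak(description: String) {
        tts.speak(description, TextToSpeech.QUEUE_FLUSH, null, "description-utterance")
    }
}
```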

Ways that I use the Lookout app with low vision

Some of the ways I have used various features of the Google Lookout app with low vision on my phone include:

  • Scanning multiple barcodes to identify different snacks next to my desk
  • Reading notes on a bulletin board, which I could copy and paste for later reference if needed
  • Sorting mail and reading items like letters and cards
  • Enlarging a cake recipe and instructions booklet for using a specialty pan
  • Reading a schedule that was posted outside
  • Identifying the location of various obstacles in my friend’s apartment

Ways that I don’t use the Lookout app with low vision

Lookout isn’t the only visual assistance app that I use, and there are a few tasks where Lookout isn’t particularly helpful and I use a different app instead. These include:

  • Extracting text and translating it from one language to another. I prefer Google Lens or Envision for translating text since they support additional languages.
  • Identifying objects via visual search. If I’m in a museum and want to know more about something on display, or want more descriptive information about a plant or other object, Google Lens is again a better option for visual searches.
  • Getting descriptions of images or scene descriptions. I prefer the Microsoft Seeing AI app for getting information about images as it is more detailed; Google Lookout typically gives shorter descriptions.
  • Helping me locate very small items. Visual assistance apps that use human interpreters like Be My Eyes and Aira are more helpful for locating small items, such as earrings, Lego pieces, pills, or other objects that may not be easy to see.

More resources for using Google Lookout with low vision

An overview of the Google Lookout app for low vision and blind users, a free visual assistance app that can read text, describe objects, and more.


