Veronica With Four Eyes

Google Lookout App For Low Vision

When I attended the ATIA conference in January, I stopped by the Google booth to learn more about a new app called Google Lookout for blind and low vision users. One of the friendly Google employees sat down with me and introduced me to what was then a brand-new app, and I have been using it ever since for a variety of tasks. Here are my tips for using the Google Lookout app with low vision, along with how I have been using it myself.

What is Google Lookout?

Google Lookout is a free Android app that uses AI and image recognition technology to identify objects and read text through the phone camera for users with visual impairments, including blindness and low vision. While it is not a replacement for mobility aids, it can be tremendously helpful for users who have trouble navigating indoor spaces or avoiding obstacles. Google Lookout requires an internet connection in order to work and can be used over Wi-Fi or mobile data.

Google Lookout vs Google Lens

One of the first questions I had when I was introduced to Google Lookout was whether it was different from my beloved Google Lens app, since they are both powered by the same technology. The Google employee who showed me the app said that Google Lens is like a friend who only provides description when asked, while Google Lookout is more like a friend who is constantly talking and providing description without being prompted. I have both apps on my phone, and while I use Google Lens more often, Google Lookout is still incredibly useful.

Download Google Lookout

Below is a download link for the Google Lookout app in the Google Play Store. Google Lookout is only available on select phones and versions of Android, though users looking for similar functionality have other options, including the Google Lens app. iOS users may prefer the Microsoft Seeing AI app, which provides similar functionality.

Interface

Google Lookout is separated into four different modes, with Explore mode as the default. Within each mode, descriptions are read continuously as new objects or information appear in front of the camera. Users can change modes in the “Select” tab on the left, use the app camera in the center “Camera” tab, and review descriptions from the most recent time the app was open in the “Recent” tab on the right.

Explore mode

Explore mode can identify objects that are in front of the camera, and provide information about their distance or location. For example, when I used Google Lookout on my desk, it told me my computer was at 12 o’clock, meaning directly in front of me, and my water bottle was at 3 o’clock, or on my right. It also alerted me when I was approaching the stairs in my house and told me there was a jacket on the stairs, which was actually my sleeping cat.

Shopping mode

Shopping mode can read product barcodes, as well as identify currency when it is held up to the camera. It’s best to slowly rotate a product until the barcode is in view; Google Lookout will then read the product name and size, along with other product information. I used it to identify different food items and it successfully identified each one, though when I wanted additional information, I found I had to get it through the product listing in Google Lens.

Quick Read mode

Quick Read mode allows users to read documents, signs, and other environmental text, though it works best when the camera is in close proximity to the text. It’s great for sorting through short amounts of text that don’t need to be stored, or for reading text for short periods of time. If I had to save a document or text, I would use the Microsoft Office Lens app instead.

Scene Description mode

Unlike the other modes, Scene Description requires the user to press the camera icon to get a description of their surroundings. I don’t use this mode very often, as I prefer to have continuous description if I’m using Google Lookout, though this is helpful for figuring out what is in a room or area before walking there.

Other settings

Users can adjust the level of detail that is shared by swiping to the side and selecting the “change detail level” option, choosing low, medium, or high detail. There is also an option to pause and resume the camera feed when the proximity sensor is covered. Google Lookout uses the same voice that is enabled in TalkBack settings, so users need to change voice settings within the accessibility settings on their phone.

What I use it for

Some examples of how I have used Google Lookout include:

  • Figuring out what obstacles were in my friend’s room
  • Identifying items on a conference table
  • Scanning multiple barcodes to identify over-the-counter medication on my desk
  • Reading notes on my bulletin board
  • Seeing if a classroom had people inside so I would know I was in the right class

Final thoughts

The Google Lookout app is an awesome resource for people who are blind or have low vision, and I hope it is expanded to more phones in the future. If you have a compatible Android phone, I highly recommend trying out the Google Lookout app to help with identifying objects and text.