Veronica With Four Eyes

Using the Google Assistant Camera with Low Vision

As I was walking through a garden while visiting a school, I found myself wondering what type of flowers I was looking at. I was hesitant to take a closer look, since I didn't want to get stung by a bee, so I took out my phone, snapped a quick picture, and immediately had access to the name of the flower, high resolution images (read more about the importance of high resolution images here), and even a thorough description. This is just one of the dozens of times a day that I use the camera feature built into the Google Assistant app, and after using it for a few months, I can't imagine life without it. It's way better than Siri, and it's one of the main reasons I prefer Android over iPhone as a user with low vision.

Google Assistant identifying an alphonia

About Google Assistant

Google Assistant is an app that is built into Android phones and is also available for download on iOS devices, though the iOS version doesn't include the camera feature. Google Assistant allows users to search for information, make calls, open apps, get personalized information such as weather and recommended articles, and bookmark information. There is also a camera function that allows users to search for information based on a picture, which is my favorite part of the app and the subject of this post.

My device

I use the Google Assistant app built into my Google Pixel 2, which runs Android 8.1, also known as Android Oreo. I have the text on my phone enlarged 300% using the app Big Font; learn more about third party apps that make Android accessible here. I also use the Select-to-Speak screen reader, though I tested this app with TalkBack as well.

Interface

The camera feature is accessed by tapping the camera icon in the bottom right corner of the Google Assistant app. The camera layout is very familiar, with users just tapping the screen to take a picture. After processing the picture, information is displayed about its contents. It’s worth noting the pictures aren’t stored on the device unless the user takes a screenshot.

Accessibility

Information about images is displayed in large text, with the option to view more about an image in the web browser. Text can also be read out loud with Select-to-Speak or TalkBack. The order the information was read in was a bit confusing, though: if there was text in the image, it would be read out loud only after all of the other information had been read. It's worth noting that I would consider myself an advanced beginner when it comes to using screen readers, since I only recently discovered the "fast forward" button.

Reading signs

While I don’t use it for large amounts of text, Google Assistant is great for reading signs or other small amounts of information quickly. The app recognizes text quickly and easily, and can also identify products or objects simultaneously. The text does not need to be perfectly aligned for this function to work.

Identifying products

If a product has a barcode or QR code on it, then the Google Assistant camera can identify it. I like to use this function for checking the nutrition facts of a product, because I can just scan the barcode and then find additional information about it. Besides food, I will also scan barcodes on books so I can get the ISBN, and then search for the book on Bookshare so I can get it in an accessible format; read more about Bookshare here.

Google Assistant identifying yogurt

Nutrition facts of yogurt

Learning more about art

When visiting the National Museum of the American Indian with my friend, we had the opportunity to see several different types of art, though I found some of the smaller details difficult to see. Luckily, I was able to use the Google Assistant camera to take a picture of the art and get access to a high resolution image and more facts about the object. My friend also really liked this function because we were able to learn a lot more about the crafting technique. Read more about my visit to the National Museum of the American Indian here.

Screenshot of Google Assistant identifying a gold band

Identifying objects

Is that a trash can or a person? Almost every person with low vision or blindness has asked themselves this question, and Google Assistant's camera can help to answer it. The app can recognize everyday objects and help users identify them quickly. It reminds me a lot of the scene identification feature in the Microsoft Seeing AI app; read more about Seeing AI here.

Animals and plants

One of the functions I have had the most fun with is the plant and animal identification feature. While it isn't always accurate, I enjoy being able to take pictures of animals and get a closer look without the risk of being bitten. There have been a few times when the app hasn't recognized certain animals due to poor lighting or other objects in the way, but I would say it gets the species right 90% of the time.

Google Assistant identifying a peacock

Verdict

Having Google Assistant built into my phone is one of my favorite parts of Android, though I wish that the camera button was larger and/or easier to find. I actually prefer Google Assistant to Siri when it comes to voice assistants because of the camera feature. As I mentioned before, I can't imagine life without this app, and I highly recommend it to everyone.



