Android's screen reader can now answer questions about images

Today is Global Accessibility Awareness Day (GAAD), and, as in years past, many tech companies are marking the occasion by announcing new assistive features for their ecosystems. Apple got things rolling on Tuesday, and now Google is joining the parade. To start, the company has made TalkBack, Android's built-in screen reader, more useful. With the help of one of Google's Gemini models, TalkBack can now answer questions about images displayed on your phone, even if they don't have any alt text describing them.

"That means the next time a friend texts you a photo of their new guitar, you can get a description and ask follow-up questions about the make and color, or even what else is in the image," explains Google. The fact Gemini can see and understand the image is thanks to the multi-modal capabilities Google built into the model. Additionally, the Q&A functionality works across the entire screen. So, for example, say you're doing some online shopping, you can first ask your phone to describe the color of the piece of clothing you're interested in and then ask if it's on sale.

Separately, Google is rolling out a new version of its Expressive Captions. First announced at the end of last year, the feature generates subtitles that attempt to capture the emotion of what's being said. For instance, if you're video chatting with some friends and one of them groans after you make a lame joke, your phone will not only subtitle what they said but also include "[groaning]" in the transcription. With the new version of Expressive Captions, the subtitles will also reflect when someone drags out the sound of their words. That means the next time you're watching a live soccer match and the announcer yells "goallllllll," their excitement will be properly transcribed. Plus, there are now more labels for sounds, like someone clearing their throat.

The new version of Expressive Captions is rolling out to English-speaking users in the US, UK, Canada and Australia running Android 15 and above on their phones.

This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/androids-screen-reader-can-now-answer-questions-about-images-160032185.html?src=rss