
Google’s Liz Reid Says LLMs Unlock Audio And Video Indexing

In a podcast interview, Google VP of Search Liz Reid described two ways LLMs are changing what Google can index and how it ranks results for individual users.

Reid told the Access podcast that multimodal AI models now allow Google to understand audio and video content at a deeper level than was previously possible. She also pointed to a future where search results adapt based on a user's paid subscriptions.

What’s New

Multimodal Understanding Is Expanding What Google Can Index

Reid said that LLMs being multimodal has opened up content formats that Google previously struggled to process.

Reid told the hosts:

“The beauty of LLMs is that they’re multimodal. So we can actually understand audio content and video content actually at a level we couldn’t years ago.”

She went further, describing how Google can now go beyond basic transcription when analyzing video.

“Now you can understand audio much better. Now you can understand video much better. Now you can understand not just the video transcription but like what’s the video more about or what’s the style or other things like that.”

Reid connected this to a long-standing gap in how search works for non-English speakers. For users in India who speak Hindi or other languages, the web often lacks the information they need in their language. Previously, translating all web content into every language wasn’t scalable. LLMs changed that.

“Now with an LLM, you can take information in one language, understand it, and then output in another language. Like that opens up information.”

Google has been moving in this direction for some time. In October 2025, Reid told the Wall Street Journal that Google had adjusted ranking to surface more short-form video, forums, and user-generated content.

The comments also add context to Google’s Audio Overviews experiment launched in Search Labs last June, which generates spoken AI summaries of search results.

That wasn’t possible a few years ago. In 2021, Google and KQED tested whether audio content could be made searchable and found that speech-to-text accuracy wasn’t high enough, particularly for proper nouns and regional references. Reid’s comments suggest that barrier has fallen.

Subscription-Aware Search Could Change How Results Are Personalized

Reid also outlined a direction for personalization that goes beyond Google’s existing Preferred Sources feature.

She told the hosts Google wants to surface content from outlets a user pays for, not paywalled results from sources they can’t access.

“If you love this source and you do have a relationship with it then that content should surface more easily for you on Google.”

Reid gave a practical example. Say 20 interviews on a topic are paywalled but a user subscribes to one outlet. Google should make it easy to find the one they can read.

“We should surface the one that they’re paying for and not the six that they can’t get access to more.”

She suggested the company has “taken small steps so far but want to do more” to strengthen how audiences and trusted sources connect through search. She also mentioned the possibility of micropayments for individual articles, though she acknowledged that model hasn’t taken off historically.

Google expanded Preferred Sources globally for English-language users in December, and announced a feature that highlights links from users’ paid news subscriptions. Google said it would prioritize those links in a dedicated carousel, starting in the Gemini app, with AI Overviews and AI Mode to follow. At the time, Google said users who select a preferred source click through to that site twice as often on average. Reid’s comments suggest the company sees subscription-aware search as a broader evolution of that same direction.

Why This Matters

The multimodal capabilities Reid pointed to expand which content formats get discovered through search. Podcasts, video series, and audio-first content have historically been harder for Google to evaluate beyond metadata and transcripts. Google’s growing ability to assess relevance and depth from audio and video directly changes who can be found through search, and how.

For brands and creators investing in non-text formats, Google’s ability to surface that work is catching up to where the audience already is.

The subscription-aware personalization direction matters for any publisher with a paywall or membership model. Search results that adapt to what individual users pay for would tighten the connection between subscriber retention and search visibility. Paywalled content could perform better for the audience that matters most to the publisher, rather than being deprioritized because most users can’t access it.

Looking Ahead

Reid didn’t attach timelines to either development. The multimodal indexing capabilities she mentioned appear to be current, while subscription-aware personalization is a stated direction with some existing features already in place.

Google I/O is scheduled for May 19-20. Reid said on the podcast that the company is “actively building” but that the pace of AI development means some features may come together as late as April and still make it to the stage.


Featured Image: Mawaddah F/Shutterstock
