Earlier this year, Google rolled out a new feature called AI Mode, designed to help users explore complex topics in greater depth. It allows people to ask longer and more detailed questions, up to three times the length of standard queries. The feature is powered by a customized version of Gemini 2.5 Pro, which breaks large questions into smaller parts and runs multiple searches to deliver comprehensive results.
In a blog post, Google said that, following positive feedback, it is now introducing a new feature called Search Live, which lets users interact with Google Search in real time. With this tool, users can speak directly to Search or point their camera at something and ask questions about what they're seeing. For example, if you're preparing an iced matcha, you can show Search your ingredients and ask for the best order in which to combine them.
Search Live functions much like Gemini Live, offering quick, real-time answers that are especially useful for on-the-go queries. To access it, open the Google app and tap the new ‘Live’ button below the search bar. Alternatively, open Google Lens and select the ‘Live’ option that appears at the bottom of the screen to start interacting visually.
Alongside the introduction of Search Live, Google is expanding AI Mode to support seven additional Indian languages: Bengali, Kannada, Malayalam, Marathi, Tamil, Telugu, and Urdu. This rollout will begin next week, though availability may vary by user and device, so it could take some time before it appears for everyone.
AI Mode also recently received a visual update that makes it easier to ask questions in a conversational style and get visual answers in return. Users in the United States can now upload or capture an image and ask AI Mode questions about what’s in the photo, adding another layer of usefulness to the search experience.