Google has announced that Gemini Live is rolling out new features that let it answer user questions in real time through the smartphone camera and "see" the user's screen. These capabilities grew out of Google's Project Astra work over the past year. Google spokesman Alex Joseph said the updated Gemini Live can provide users with smarter support. Some users on Reddit reported the features appearing on Xiaomi phones, which 9to5Google also covered; a demo video shows Gemini's new screen-reading capability.
The update comprises two key features, rolling out to Gemini premium subscribers later this month as part of the Google One AI Premium plan: one lets Gemini read the phone's screen and answer questions about what is displayed, and the other interprets the live camera feed in real time. In the demo video, a user asks about color choices for freshly glazed pottery, and Gemini offers suggestions on the spot.
The move consolidates Google's lead in the AI assistant market: Amazon is only beginning to roll out its upgraded Alexa Plus, while Apple has delayed its Siri upgrade. And although Samsung continues to promote its Bixby assistant, Gemini remains the default assistant on Samsung phones. As AI technology advances, assistants like these should make user interactions smarter and more convenient, bringing them closer to users' actual needs.