Google has updated its Lookout app to work with smart glasses. The app now uses camera feeds from compatible smart glasses to identify objects in real time. This feature helps people who are blind or have low vision navigate their surroundings more easily.
Google’s Lookout App Identifies Objects via Smart Glasses Camera Feeds
The Lookout app runs on Android devices and uses the phone’s camera to recognize text, products, and surroundings. With this update, users can connect the app to compatible smart glasses that work with Android. The glasses stream live video to the phone, Lookout processes the images in real time, and the app then gives spoken feedback through the phone or connected headphones.
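The flow described above (glasses stream frames to the phone, a local model labels them, and results are spoken back) can be sketched roughly as follows. This is an illustrative outline only: the `Frame`, `recognize`, and `run_pipeline` names are hypothetical stand-ins, not Google's actual API.

```python
# Hypothetical sketch of a glasses-to-phone recognition pipeline.
# None of these names come from the Lookout app itself.

from dataclasses import dataclass
from typing import Callable, Iterable, List


@dataclass
class Frame:
    """One video frame from the glasses (placeholder for real image data)."""
    pixels: bytes


def recognize(frame: Frame) -> List[str]:
    """Stand-in for an on-device vision model; a real app would run
    actual inference here and return the labels it detects."""
    return ["door"] if frame.pixels else []


def run_pipeline(frames: Iterable[Frame], speak: Callable[[str], None]) -> None:
    """Process each incoming frame locally and announce what was found."""
    for frame in frames:
        for label in recognize(frame):
            speak(f"I see a {label}")


# Collect announcements instead of using text-to-speech, for demonstration.
spoken: List[str] = []
run_pipeline([Frame(pixels=b"\x00")], spoken.append)
# spoken == ["I see a door"]
```

In a real Android app, `speak` would wrap the platform's text-to-speech output and `recognize` would invoke an on-device model, but the loop structure would be similar.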
Users do not need to point their phone at an object anymore. The smart glasses act as a hands-free camera, letting people look around naturally while the app works in the background. Google says this makes everyday tasks like finding a door, reading a label, or spotting a bus stop much smoother.
The app uses on-device machine learning to keep things fast and private. No video is sent to the cloud. Everything happens right on the user’s phone. This means responses are quick and personal data stays secure.
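The privacy point above can be made concrete with a small sketch: frames are analyzed locally and then discarded, so only short text labels ever leave the processing step. The `describe_locally` function is a hypothetical illustration, not Google's implementation.

```python
# Illustrative sketch of on-device processing: raw frames stay inside
# the function and nothing is sent over the network.

from typing import List, Optional


def describe_locally(frame_pixels: bytes) -> Optional[str]:
    """Run a stand-in 'model' on the frame and return only a text label.
    The raw pixels never leave this function, mirroring on-device inference."""
    return "bus stop" if frame_pixels else None  # placeholder for real inference


labels: List[str] = []
for raw in [b"\x01", b"", b"\x02"]:
    result = describe_locally(raw)
    if result:
        labels.append(result)
# labels == ["bus stop", "bus stop"]; the raw frames were never stored or uploaded
```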
Lookout supports multiple languages and works offline for core features. Google designed it to be simple and reliable. The update is rolling out now to all Android users who have compatible smart glasses.
People who rely on assistive technology can now get help without holding a phone. The integration with smart glasses offers a more natural way to interact with the world. Google continues to improve accessibility tools so more people can use them every day.