Now, months later, the app has been rolled out to Pixel phone users in the US.
Google acknowledges that the app won't always work with 100% accuracy, and says it will continue to develop the app as it gets more feedback from users. The app was designed for visually impaired people and is based on Google's machine learning algorithms.
Apart from identifying objects through AI, the app can also read text and labels, scan barcodes, and more.
The developers have been working to improve the functionality and overall user interface of the app since it was announced at last year's I/O conference.
Google's Lookout app comes in three modes: Explore, Shopping, and Quick read.
Lookout is built on technology similar to that previously used in Google Lens.
Although it currently works only on Pixel devices in the United States, Google is hoping to bring it to more devices, countries, and platforms soon. Once the app is opened, users just need to keep the phone pointed forward - Lookout will then describe the environment out loud for the user to hear.
The app is well designed from the start: when you launch it, it begins by asking which mode you want to use, so you aren't burdened with navigating menus.