Google is testing augmented reality glasses for translation and navigation


After abandoning its Google Glass project in 2015, then reviving it in 2017 with a version aimed at businesses, Google is resuming its research on connected glasses.

Google wants to continue exploring the potential of augmented reality in everyday life. These tests will also serve to gauge the public's appetite in 2022 for this technology, now refocused on translation and navigation.

Google’s prototypes are rolling out of the labs for real-world testing “by next month,” according to a blog post published this week. A small group of “Googlers” in the United States will be able to test augmented reality devices in certain everyday situations, such as transcribing a restaurant menu or locating a café around the corner.

Detecting sound and images in real-world conditions

Google's glasses integrate a display along with visual and audio sensors. The company wants to test audio detection, for transcribing and translating speech, as well as visual detection, for translating text and for spatial navigation.

Augmented reality “can help us quickly and easily access the information we need, such as understanding another language or knowing how to get from point A to point B,” the company says.

With this project, Google intends to capture more precisely certain factors that are "difficult or even impossible to recreate entirely indoors", such as weather conditions and busy intersections.

Google opts for caution

The tests will start on a small scale, and the prototypes' capabilities will be "limited", Google assures. In particular, the devices will not be able to record photos or videos.

In other words, Google wants to proceed carefully. This caution also extends to the security and privacy of data belonging to users and those around them, the company stresses. Testers will therefore receive training on the devices, protocols, privacy, and security.

Once the experiment ends, the data will be deleted "unless used for analysis and debugging." In that case, "the image data is first cleaned of any sensitive content, including faces and license plates", then "stored on a secure server, with access limited to a small number of Googlers for analysis and debugging", before being deleted after 30 days.

Google's work is not isolated: immersive augmented reality experiences are going mainstream in the era of the metaverse. Apple, Meta with its Ray-Ban glasses, and Snap, the parent company of Snapchat, are also entering the field in search of a place in the market.
