Ray-Ban Meta smart glasses can now identify landmarks and tell you about them


Heading out on the town? Meta’s Ray-Ban smart glasses can now serve as your virtual tour guide, thanks to a new feature.

The feature, currently in beta, lets the wearer simply look at a landmark or point of interest to find out what it is. Meta’s built-in AI not only identifies what the user is looking at, but can also offer historical context about it.

Landmark recognition was announced in a post by Meta CTO Andrew Bosworth, which shows a user looking at the Golden Gate Bridge in San Francisco as an example. The glasses not only identify the bridge but also explain that its iconic orange color, known as International Orange, was chosen to make the bridge easier to see in fog. After looking at the Painted Ladies, a row of colorful Victorian houses, the wearer learns that they were all built between 1892 and 1896.

Three examples in action

Mark Zuckerberg followed the announcement with an Instagram post showing the smart glasses in action at the Roosevelt Arch, at the north entrance of Yellowstone National Park in Montana.

Meta didn’t give many additional details or say where the feature will be available. But given that it was demonstrated in several different locations, it will presumably work with most well-known landmarks and buildings.

“If you do not have access,” Bosworth wrote, “you can join the waiting list. We are working to make this feature available to more people.”

Share your last interaction via text message, WhatsApp or Messenger

Bosworth added that another feature is in the works to make it easy to share your last Meta AI interaction via text message, WhatsApp or Messenger.

Meta’s Ray-Ban smart glasses launched last year. A third version of the smart glasses, with a greater focus on AR, is expected later this year.

Source: ZDNet.com




