Meta is adding AI to its Ray-Ban smart glasses next month – The Verge

The Ray-Ban Meta Smart Glasses can do things like identify objects, monuments, and animals, as well as translate text.

Photo by Amelia Holowaty Krales / The Verge

Meta will bring AI to its Ray-Ban smart glasses starting next month, according to a report from The New York Times. The multimodal AI features, which can perform translation, along with object, animal, and monument identification, have been in early access since last December.

Users can activate the glasses’ smart assistant by saying “Hey Meta,” followed by a prompt or a question. The assistant then responds through the speakers built into the frames. The NYT offers a glimpse at how well Meta’s AI works, having taken the glasses for a spin in a grocery store, while driving, at museums, and even at the zoo.

Although Meta’s AI was able to correctly identify pets and artwork, it didn’t get things right 100 percent of the time. The NYT found that the glasses struggled to identify zoo animals that were far away and behind cages, and they failed to properly identify an exotic fruit, a cherimoya, even after multiple tries. As for translation, the NYT found that the glasses support English, Spanish, Italian, French, and German.

Meta will likely continue refining these features as time goes on. Right now, the AI features in the Ray-Ban Meta Smart Glasses are only available through an early access waitlist for users in the US.
