Meta recently announced that its Ray-Ban smart glasses will introduce three new features: real-time AI, real-time translation, and Shazam. Among these, the real-time AI and real-time translation features are currently available only to members of Meta's early access program, while the Shazam feature is open to all users in the United States and Canada.

The real-time AI and real-time translation features were first previewed earlier this year at the Meta Connect 2024 conference. The real-time AI allows users to engage in natural conversations with Meta's AI assistant while the glasses continuously observe the surrounding environment. For instance, when browsing the produce section of a grocery store, you could theoretically ask Meta's AI to recommend some recipes based on the ingredients you are looking at. Meta states that, when fully charged, users can use the real-time AI feature for about 30 minutes at a time.


Meanwhile, the real-time translation feature allows the glasses to perform real-time voice translation between English and Spanish, French, or Italian. You can choose to hear the translation results through the glasses themselves or view the translated text on your phone. You will need to download the language pairs in advance and specify the languages used by you and your conversation partner.

The Shazam feature is more straightforward. When you hear a song, you simply prompt Meta AI, and it should be able to tell you what song is playing. Meta CEO Mark Zuckerberg demonstrated this feature in an Instagram video.

If these new features have not yet appeared on your glasses, make sure the glasses are running software version v11 and that the Meta View app is on version v196. If you have not yet joined the early access program, you can apply through Meta's website.

This update comes at a time when tech giants are positioning AI assistants as the core selling point of smart glasses. Just last week, Google released Android XR, a new operating system designed specifically for smart glasses, positioning its Gemini AI assistant as the killer app. Meanwhile, Meta's Chief Technology Officer Andrew Bosworth stated in a blog post that "2024 is a year of significant progress for AI glasses." Bosworth further asserted that smart glasses could be the best form factor for devices that are "truly native to AI," the first hardware category fully defined by AI from the start.

In summary, this feature upgrade to Meta's smart glasses further demonstrates the potential of AI in the wearable device sector, while competition among tech giants will only accelerate the development of smart glasses technology.