Meta makes three major updates to its Ray-Ban glasses, including live AI

Meta Platforms has added new features to its Ray-Ban glasses, including real-time live AI and Shazam, on top of the features already on board.

This comes as the social media giant has been steadily upgrading the AI-enabled glasses to handle more complex tasks and respond more naturally, in major updates that are expected to transform the smart glasses.

Meta made significant upgrades to Ray-Ban

Earlier this year, Meta revealed it was integrating its next-generation AI model, Llama 3, into Meta AI, the virtual assistant in its Ray-Ban smart glasses, to enhance performance. The multimodal features entered early access last December and can perform translations in addition to identifying objects, animals, and monuments.

Now, Meta has brought further major upgrades to the smart glasses. According to CNET, always-on continuous AI assistance began working on the Ray-Ban glasses on Monday for owners who have access to Meta's features. This is in addition to onboard translation and Shazam, which is currently available only in the US and Canada.

The latest features added to the AI-enabled glasses include continuous audio and camera recording, as opposed to responses to individual prompts. This, according to CNET, allows the glasses to be used for an extended period of time with the AI features turned on.

Whenever the always-on live AI is activated, the glasses' LED also stays on. Meta keeps a recording of the conversation that can be referred back to throughout the AI session.

Translation is expected to work automatically during a conversation, although the translated audio comes through the glasses with a slight delay.

The live AI assistance is reportedly similar to what Google demonstrated on its own prototype glasses this month via Gemini and Android XR, arriving next year.

According to Meta, as cited by CNET, the always-on AI takes a hit on battery life, and users can expect up to 30 minutes of use before recharging the device.

The company further suggests that this type of always-on, camera-assisted AI is exactly what more tech companies will be exploring in the coming year.

AI glasses hit their stride in 2024

The upgrades come as Big Tech is pushing AI assistants as the raison d'être of smart glasses. Last week, Google revealed Android XR for new smart glasses and specifically positioned its Gemini AI assistant as the killer app.

In the meantime, Meta CTO Andrew Bosworth said in a blog post that "2024 was the year AI glasses hit their stride."

In the same blog post, Bosworth argues that smart glasses may be the best possible form factor for a "truly AI-native device." He added that AI-powered glasses could be the first hardware category to be "completely defined by AI from the beginning."

With Meta's Ray-Ban smart glasses, users can activate the virtual assistant by saying "Hey Meta" and then asking a question or giving a prompt, after which the assistant responds via speakers built into the frames.

According to Meta, users can livestream from the glasses to social media platforms like Facebook and Instagram, using "Hey Meta" to engage with the firm's "advanced conversational assistant," Meta AI.

The Ray-Ban glasses feature enhanced audio and cameras, in addition to more than 150 different custom frame and lens combinations, "and they are lighter and comfortable."
