
Smart glasses, mixed reality and AI assistants take center stage at Meta Connect 2024

The event showcased some of Meta's newest advancements in virtual reality, artificial intelligence and more.
Mark Zuckerberg wears a pair of Orion AR glasses during the Meta Connect conference Wednesday, Sept. 25, 2024. (AP Photo/Godofredo A. Vásquez)

MENLO PARK, Calif. — Meta kicked off its 2024 Meta Connect event Wednesday, showcasing some of its newest technological advancements.

With devices offering real-time language translation, new steps in mixed and virtual reality, and advances in artificial intelligence (AI), here are a few of the biggest takeaways from the event so far.

The Meta Quest 3S

Meta CEO Mark Zuckerberg kicked off the event by revealing the company's brand-new virtual reality headset, the Meta Quest 3S.

The new headset is $200 cheaper ($299.99) while offering the same mixed reality and performance capabilities as the Quest 3 ($499.99). Meta will also lower the price of the 512 GB version of the Quest 3, currently the top headset in its virtual reality lineup.

The new Quest 3S will ship on Oct. 15 of this year, with preorders beginning Wednesday.

"Quest 3 is the best mixed reality device that you can buy today, and I am really proud of it," Zuckerberg said.

While part of the live demo did not go as planned, with the device's streaming capabilities temporarily crashing, Meta was still able to show off the headset's mixed reality capabilities, which let users create virtual monitors at will and view multiple screens at the same time.

Alongside the new headset, Zuckerberg introduced upcoming changes to Horizon Worlds, such as higher-quality avatars and more ways to join virtual activities with friends and family.

Llama 3.2: Advancements in AI

Zuckerberg continued by introducing updates to the company's AI assistant.

Meta has already built AI into its platforms, free to every user, but the company has now upgraded Meta AI with a new language model, "Llama 3.2," which is multimodal, meaning it can understand both images and text.

With the new model, users can, for example, edit images simply by asking for the changes they want.

The company also introduced "natural voice interactions" for Meta AI.

Meta will begin rolling the feature out Wednesday and continue over the next few days, hoping it will allow users to interact with the AI more naturally. The new capabilities will be available across all of Meta's major apps: Instagram, WhatsApp, Messenger and Facebook.

The update also adds prominent celebrity voices that can serve as the AI assistant; Zuckerberg used Awkwafina as an example, holding a brief conversation with her AI voice on stage.

Meta also introduced AI Studio, which lets content creators build their own AI. While a version of this already existed, Meta wanted to expand its abilities: at the showcase, it demonstrated a new version that can answer messages from fans and guide conversations when the creator is unable to get to all of their messages.

Beyond answering messages, Meta also showed the ability to join a video call with an AI version of a creator and hold a conversation with that artificial stand-in.

Next, Meta showcased its AI translations, which automatically dub videos into other languages to reach wider audiences. To demonstrate, the company played a Spanish-language video from a content creator, then a version of the same video run through Meta AI translation. The feature not only translated the spoken words but also animated the speaker's lips so that they appeared to be speaking the other language, which could also help viewers who are hearing impaired.

Ray-Ban smart glasses

Meta's Ray-Ban smart glasses may not be a new device, but the features coming to them soon certainly are. Users will soon be able to control music and audio by voice on platforms such as Spotify, Amazon Music, Audible and iHeartRadio. Like the broader Meta AI enhancements mentioned above, the Ray-Ban smart glasses will also get their own AI updates.

One example showed how users can tell the glasses to call a phone number they are looking at or scan a QR code on a flyer. Meta followed up with a video of someone getting real-time help from the assistant while deciding on an outfit for a party, with the glasses recalling items in the person's closet.

Zuckerberg then showcased the live translation feature by holding an on-stage conversation with a Spanish speaker and hearing the words translated as they spoke.

The company also announced Orion, which it calls "the most advanced pair of AR glasses ever made."

The event continues into Thursday, with more details to come.
