Meta has unveiled its augmented reality (AR) glasses, Orion, previously known by the code name Project Nazare.
With Orion, users can enjoy digital experiences that go beyond the limitations of small smartphone screens. Holographic displays allow users to place 2D and 3D content anywhere within their physical environment.
The AR glasses use advanced artificial intelligence that can sense and interpret the user’s surroundings. The glasses are designed to be worn comfortably in daily life, indoors and outdoors, without compromising on social interactions. Users can see each other’s eyes and expressions while using the glasses.
Users can interact with multiple holographic windows simultaneously, such as working on documents while making a video call or adjusting a digital calendar, all while remaining physically present in their environment.
The glasses can project life-sized holographic images, making interactions like remote video calls feel more lifelike and personal.
Meta AI is built into the Orion glasses. This AI allows users to interact with the physical world in smarter ways. Users can open their refrigerator and ask Meta AI for a recipe based on available ingredients or take a video call without using their hands while completing household chores.
The AR glasses look like an ordinary pair of glasses, resembling sleek, black Clark Kent-style frames. They weigh around 98 grams, which is lighter than mixed-reality headsets like the Meta Quest or Apple's Vision Pro, but heavier than traditional eyeglasses.
The frames are made from magnesium, chosen for its ability to evenly distribute heat while being lighter than aluminum.
Meta opted for silicon carbide lenses instead of glass or plastic. These lenses are durable, lightweight and have an ultra-high index of refraction, allowing more of your field of vision to be filled with digital content.
The core feature of Meta Orion is its advanced display, which uses custom-built Micro LED projectors inside the frames to project images and graphics onto the lenses via waveguides.
Orion has a 70-degree field of view (FoV), one of the largest in any AR glasses to date. This wider FoV ensures that digital objects remain in view even as you move closer to them.
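The benefit of a wider field of view can be made concrete with a little geometry: an object's angular size grows as you approach it, and it stays fully visible only while that angular size fits inside the FoV. The sketch below (not Meta's code; the object width and distances are illustrative) works this out for a virtual window.

```python
import math

def angular_size_deg(width_m: float, distance_m: float) -> float:
    """Angular size of an object of a given width seen from a given distance."""
    return math.degrees(2 * math.atan(width_m / (2 * distance_m)))

def fits_in_fov(width_m: float, distance_m: float, fov_deg: float = 70.0) -> bool:
    """True if the object's angular size is within the (horizontal) field of view."""
    return angular_size_deg(width_m, distance_m) <= fov_deg

# A 1 m wide holographic window viewed from 2 m spans about 28 degrees,
# comfortably inside a 70-degree FoV.
print(round(angular_size_deg(1.0, 2.0), 1))  # -> 28.1
print(fits_in_fov(1.0, 2.0))                 # -> True

# Step in to 0.6 m and the same window spans almost 80 degrees,
# so it would be clipped even by Orion's wide display.
print(fits_in_fov(1.0, 0.6))                 # -> False
```

With a narrower FoV, typical of earlier AR glasses, the window would be clipped at much greater distances, which is why the 70-degree figure matters in practice.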
The glasses are part of a three-part hardware system.

Neural wristband: Resembling a Fitbit, the wristband uses electromyography (EMG) to read neural signals from the hand, allowing users to control the glasses with gestures such as pinching fingers to select or scrolling.

Compute puck: A small, battery-pack-like device that provides computing power to the glasses. Because the puck processes most of the data, Orion must remain within 12 feet of it to function properly.

The glasses themselves: With seven embedded cameras, the glasses support hand and eye tracking, anchor virtual objects in the real world and provide a crisp display for tasks like video calls and reading text.
The wristband interprets neural signals and supports a range of gestures, including pinching fingers to select objects, flicking a thumb to scroll, or using a coin-flip gesture to control content.
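Meta has not published Orion's EMG pipeline, but a common baseline for this kind of gesture recognition is to window the raw signal, extract a simple energy feature (root-mean-square amplitude) per channel, and match it against calibrated per-gesture templates. The sketch below is a hypothetical illustration of that idea; the gesture names, channel counts, and calibration values are all made up.

```python
import math

def rms(window):
    """Root-mean-square amplitude of one EMG channel window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def classify(channels, templates):
    """Nearest-template classification on per-channel RMS features.

    channels:  list of per-channel sample windows from the wristband
    templates: gesture name -> expected per-channel RMS profile (calibration)
    """
    features = [rms(w) for w in channels]

    def distance(profile):
        return sum((f - p) ** 2 for f, p in zip(features, profile))

    return min(templates, key=lambda g: distance(templates[g]))

# Toy two-channel example with invented calibration profiles:
# channel 1 is active, which matches the "pinch" template best.
templates = {"pinch": [0.8, 0.1], "thumb_flick": [0.1, 0.9]}
window = [[0.7, -0.9, 0.8], [0.1, -0.1, 0.05]]
print(classify(window, templates))  # -> pinch
```

Real EMG decoders are far more sophisticated (typically learned models over many channels), but the window-feature-match structure above is the standard starting point.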
The glasses use your gaze as a pointer, while hand gestures serve as the click function. Meta’s AI assistant can also be activated through voice commands, allowing hands-free interaction with the environment.
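The gaze-as-pointer, gesture-as-click model described above can be sketched as a simple hit test: the gaze direction is checked against the angular bounds of each holographic window, and a pinch selects whichever window the gaze currently rests on. This is an illustrative sketch, not Meta's API; the window names and angular coordinates are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Window:
    name: str
    # Angular bounds of the window in the user's view, in degrees.
    az_min: float
    az_max: float
    el_min: float
    el_max: float

    def contains(self, az: float, el: float) -> bool:
        return self.az_min <= az <= self.az_max and self.el_min <= el <= self.el_max

def gaze_target(windows, az, el):
    """Return the first window the gaze direction falls on, if any."""
    for w in windows:
        if w.contains(az, el):
            return w
    return None

def on_pinch(windows, az, el):
    """A pinch 'clicks' whatever the gaze currently points at."""
    target = gaze_target(windows, az, el)
    return f"selected {target.name}" if target else "no target"

windows = [Window("calendar", -30, -10, 0, 15), Window("video_call", 5, 30, -5, 20)]
print(on_pinch(windows, az=12.0, el=4.0))   # gaze rests on the video call window
print(on_pinch(windows, az=-40.0, el=0.0))  # gaze off to the side, nothing selected
```

Separating the pointer (gaze) from the trigger (pinch) is what makes the interaction hands-down and socially unobtrusive, compared with reaching out to touch virtual objects.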
Meta sees Orion as a communication tool of the future, aiming to replace smartphones by allowing users to interact with holograms and AI integrated into the physical world.
The glasses can facilitate hands-free video calls. During a demo, a call was made from Orion to a colleague's iPhone. Though the recipient couldn't see the caller, the plan is to eventually integrate avatars that mimic facial movements during calls.
Zuckerberg envisions a future where people communicate through holographic projections and Orion is a step in realizing that dream.
The AI embedded in Meta Orion is capable of identifying objects in real-time and generating contextual information based on the user’s environment. For example, during the demo, the glasses were able to recognize ingredients on a table and suggest a smoothie recipe with step-by-step instructions.
The AI capabilities in Orion have already been tested in Meta's Ray-Ban smart glasses, but Orion takes them further by adding a visual, AR-enhanced layer.
In a demo, Meta showed several potential use cases for Orion. The glasses were used for tasks such as scanning QR codes to pair the glasses for shared experiences, playing an AR version of Pong and interacting with virtual windows.
The experience was guided and partly pre-set, but testers confirmed that Orion’s AR capabilities were real and not just staged simulations. Despite the demo, Meta executives admitted that the glasses aren’t yet a fully independent product.
Key features like front-facing cameras and GPS remain inactive, and much of the experience relies on external computing support.
Meta Orion is not yet ready for mass production or consumer use, primarily due to cost. Each pair currently costs around $10,000 to manufacture, largely due to the difficulty in producing the silicon carbide lenses.
Ray-Ban Meta is already available; these smart glasses offer AI-powered features but no display. Hypernova, an intermediate product expected to launch next year, will offer a smaller heads-up display for lighter interactions with AI and messaging.