
Meta’s new AR glasses offer neural control – no implant necessary


The headline for Meta’s new fully functioning prototype, Orion (pronounced O-Ryan), basically writes itself. 

They’re “the most advanced glasses the world has ever seen,” Meta CEO Mark Zuckerberg said during today’s Meta Connect event. That’s a bold claim, but not one many will be quick to dismiss. After all, Meta is coming into Connect hot, having seen success with last year’s Quest 3 and Ray-Ban smart glasses.

Also: Everything announced at Meta Connect 2024: Affordable Quest 3, AR glasses, and more

In some ways, Orion is the best of both worlds, supposedly offering mixed-reality-like computing similar to the Quest 3 in a light, normal-looking form factor akin to the Ray-Ban smart glasses. Zuckerberg set out five simple yet highly technical requirements when designing Orion:

  1. They can’t be a headset, meaning there should be no wires or cables dangling off them, and they should weigh less than 100 grams.
  2. The glasses need holographic displays with a wide field-of-view (FOV).
  3. The displays should be sharp enough to pick up details in the real world.
  4. The displays should be bright enough for visual overlays no matter what the environment is like.
  5. AR projections should be able to display a cinema-like screen or multiple monitors.


Following these principles means Orion projects holograms directly onto your view of reality instead of capturing and re-displaying what’s in front of you, a process commonly known as pass-through. The big benefit of this approach is that latency is minimal, if there’s any at all.

Being able to visualize incoming messages, video call feeds, and other important information while still being attentive and present in reality solves one of the biggest social problems with modern-day VR headsets like the Quest 3 and Apple Vision Pro.


Nvidia CEO Jensen Huang demoed Orion and suggested that “this is a big deal.”


Meta says there are three ways to interact with Orion: using voice via Meta AI, hand and eye tracking, and a neural interface. The first two are rather straightforward, but the third option is exactly what’s needed to keep us grounded in prototype land. Orion can work in tandem with a wrist-worn neural interface, registering clicks, pinches, and thumb pushes as inputs.

Also: Meta Quest 3S unveiled at Connect 2024: Everything to know about the VR headset

For example, you can form a fist and brush your thumb across its surface to scroll the user interface, according to CNET’s Scott Stein. Meta says you’ll get a day’s worth of usage before needing to charge the wristband.

That’s promising to hear, considering I’d rather make finger gestures while walking around or sitting down than shout at an invisible voice assistant or wave my arms around in public. According to Meta, Orion runs on custom silicon and a set of sensors, with the battery tucked into the glasses’ arms.

Also: 4 exciting features coming to Meta’s Ray-Ban smart glasses – including the ability to ‘remember’ things

While Orion gives us a glimpse of future AR glasses, there’s still a lot of work to be done before they’re consumer-ready, according to Zuckerberg. Sharpening the display system, shrinking the design so it’s more fashionable, and bringing the cost down are all aspects Meta’s CEO would like to develop further. Until it hits the open market, Orion will be available as a developer kit, used mostly internally to build out the software, as well as by a handful of external partners.

When it’s ready, it’ll be positioned as “Meta’s first consumer, fully-holographic AR glasses,” Zuckerberg said.



Source: Robotics - zdnet.com
