Meta has lagged behind in the smartphone market. Can smart glasses make up for it?

Meta has dominated online social connections for the past 20 years, but it never made the smartphones on which most of those connections take place. Now, in a multibillion-dollar, multi-year effort to position itself at the forefront of connected hardware, Meta is betting on face computers.

At its annual Connect developer event today in Menlo Park, California, Meta showed off its new, more affordable Meta Quest 3S virtual reality headset and its upgraded, AI-powered Ray-Ban Meta smart glasses. But the star of the show was Orion, a prototype for holographic display glasses that CEO Mark Zuckerberg says has been in development for 10 years.

Zuckerberg stressed that the Orion glasses, currently available only to developers, are not a typical smart display; he said they will be so interactive that they can take the place of the smartphone for many purposes.

“Building this display is unlike any other display you’ve ever used before,” Zuckerberg said on stage at Meta Connect. Meta’s chief technology officer, Andrew Bosworth, has described the technology as “the most advanced thing we have ever produced as a species.”

Like many heads-up displays, the Orion glasses seem like the fever dream of tech utopians, ones who have spent the past few years hard at work in a top-secret lab at Meta’s Reality Labs division. A WIRED reporter noted that the thick black glasses looked “robust” on Zuckerberg.

As part of the on-stage demo, Zuckerberg showed how the Orion glasses can project multiple virtual screens in front of the wearer, quickly respond to messages, host video chats, and run games. In the messaging example, Zuckerberg noted that users won’t even have to pull out their phones. They’ll navigate these interfaces by speaking, tapping their fingers, or simply looking at virtual objects.

There will also be a built-in “neural interface”: a wrist-worn device that Meta first unveiled three years ago and that interprets the neural signals controlling the wearer’s hand. Zuckerberg didn’t elaborate on how all this will actually work or when a consumer version might materialize. (He also didn’t go into detail about the various privacy complications involved in connecting this device and its visual AI to one of the world’s largest repositories of personal data.)

He said the images appearing through the Orion glasses rely on neither pass-through technology (in which external cameras show users the real world) nor a screen displaying a virtual world. It’s a “new kind of display architecture,” he said, in which tiny projectors in the temples of the glasses beam light into waveguides in the lenses, which then direct that light into the user’s eyes to create volumetric images in front of them. Meta designed this technology itself, he said.

The idea is that virtual images will not appear as flat 2D graphics in front of your eyes but will have shape and depth. “The big innovation with Orion is the field of view,” says Anshel Sag, principal analyst at Moor Insights & Strategy, who was present at Meta Connect. “The field of view is 72 degrees, which makes it much more attractive and useful for most applications, whether it’s gaming, social media, or just content consumption. Most headsets have a range of 30 to 50 degrees.”
