Ten Years After Facebook Bought Oculus, What’s Next for Meta’s VR Plans? – CNET

2014 doesn’t seem that long ago to me. But in technology, 10 years can be an infinity. I was writing for CNET, like I still am today, but smartwatches weren’t even mainstream yet. And even though Facebook bought Oculus that year, ahead of the expected release of the Oculus Rift, the only VR available in any finished form that year was a phone-connected pair of plastic goggles known as the Samsung Gear VR.

In 2024, VR is still not massively adopted, but it has gone mainstream several times over. I know plenty of families with Quest 2 headsets doing workouts or using them for games. And now even Apple is exploring virtual and mixed reality with its first headset, the Vision Pro. Meta’s Ray-Ban smart glasses, meanwhile, have started to feel less absurd as a wave of AI wearables from startups like Humane and Brilliant Labs emerges into similar territory. Meta is bringing its own AI features to the glasses this month.

For a trip back in time and a look forward, I spoke with Meta’s chief technology officer Andrew Bosworth about where Meta could be headed next: with the Quest 3, with AI, with smart glasses and with AR glasses to come.


Michael Abrash, chief scientist at Meta’s Reality Labs Research, talks to me in front of a wall of prototype VR and AR headsets in 2022. The final form of AR glasses is still in flux now.

Meta

Meta’s AR glasses may work in all new ways

The most futuristic vision of mixed reality devices relies on hand tracking while removing controllers entirely, just as the Vision Pro does today. Meta is bridging hand tracking and physical controller inputs at the same time. But beyond that, new interfaces loom: neural input wristbands that Meta has been working on for years, which could be key controllers for future AR glasses, or maybe even other inputs as well.

Bosworth supports customized controllers for the Quest, but beyond that, he admits that it’s going to be a big, strange leap to get to inputs for glasses. “The fastest way to get information from the world into our brains is through our eyes. Getting things from you to the machine is trickier. The fastest is speech, but it’s not the most discreet, and it’s awkward, and weird. And speech is a terrible way to navigate a screen.”

Bosworth admits the future of interfaces, despite Apple making some headway with the Vision Pro’s hand and eye tracking, is still unclear. “Do you want to have a neural interface? Do you want to do air typing? Do you want to do a swiping touchy thing? Are you doing the eye tracking and tapping thing? We’re doing all those things,” Bosworth says of Meta’s Reality Labs research efforts.

“We’re talking about a completely novel architecture. We haven’t really played with a novel architecture since Xerox PARC,” he says, referring to the R&D division that invented the idea of the computer desktop, among many other foundation stones of modern computing.

I ask about Meta’s next conceptual pair of AR glasses, which Bosworth says will be ready to demo at some point in the “not too distant” future. Those glasses, which Bosworth mysteriously promises will be “real-time machines,” aren’t ready for everyday use yet for a number of reasons.

“Part of the reason they’re prototypes is there’s not a clear path to a consumer-friendly cost structure.” Bosworth points to the displays being big factors in cost and battery life challenges, among other things. “We have a good sense of all the problems. There’s interface and input challenges as well. And we’re making tremendous progress. But I gotta tell you, they are the hardest problems that we’ve tackled in a long time as an industry.”


Meta could go back to spatial video on its glasses in a future model, Bosworth says, but he isn’t so quick to imagine ways for the glasses to share more things with Quest headsets. Instead, he’s more focused on ways for the glasses to be better at sharing with the rest of the world, people who don’t have headsets or glasses at all.

“The [new] Ray-Ban Metas are optimized around one 12-megapixel camera,” he said. “There’s no stereo camera, no stereo overlap. That may change over time. But I’ll tell you, the No. 1 thing that caused digital photos to become a major part of our lives was our ability to share them. I think there is power in nostalgia, in the relived memory. But I think it’s the secondary power relative to the primary thing of, ‘I want you to experience this now.'”
