Between Facebook renaming itself to Meta and Microsoft hailing its Activision Blizzard purchase as an expansion into the metaverse, tech giants have been stirring up the metaverse buzz more than ever. This year, we just might see Apple’s long-rumored augmented or virtual reality headset enter the burgeoning metaverse arena.
Apple could blend AR and VR across two headsets in the near future, leading the way with a high-end AR/VR headset closer to an advanced Quest 2, according to a report last January by Bloomberg's Mark Gurman. More recent reports from Gurman suggest a focus on gaming, media and communication. It could be expensive, perhaps $3,000 or more, with 8K displays, eye tracking and cameras that can scan the world and blend AR and VR together, according to a report from The Information last year.
Analyst Ming-Chi Kuo predicted in November that Apple's VR/AR headset would arrive in the fourth quarter of 2022 with Wi-Fi 6 and 6E support. An even more recent report from Kuo in December points to a relatively lightweight device of about 300 to 400 grams (roughly 10.5 to 14 ounces). That's lighter than Meta's Oculus Quest 2.
This aligns with Kuo’s earlier report saying Apple’s headset is coming in 2022, with smart glasses around 2025, and maybe AR contact lenses after that. And according to Kuo’s latest note to investors, seen by MacRumors, Apple may have larger plans for the headset. The company’s “goal is to replace the iPhone with AR in 10 years,” Kuo explains.
The headset is expected to feature Apple’s M1 processor and work as a stand-alone device. But it could also connect with Apple’s other devices. That’s not a surprising move. In fact, most of the reports on Apple’s headset seem to line right up with how VR is evolving: lighter-weight, with added mixed reality features via more advanced passthrough cameras. In that sense, Apple’s first headset will probably be a stepping stone to future lighter AR glasses, in the same way that Meta’s next headset, called Project Cambria, might be used.
Last year, reports on Apple’s AR/VR roadmap suggested internal disagreements, or a split strategy that could mean a VR headset first and more normal-looking augmented reality smart glasses later. But recent reports have converged on the story of a particular type of advanced VR product leading the way.
These reports have been going around for several years, including a story broken by CNET’s Shara Tibken in 2018. But the question is: When will this happen, exactly? 2022 or even later? Apple’s been building more advanced AR tools into its iPhones and iPads, setting the stage for something more. But we still don’t know what that thing (or things) is. What’s increasingly clear is that the rest of the AR/VR landscape is facing a slower-than-expected road to AR glasses, too.
VR, however, is a more easily reachable goal in the short term.
Apple has waited in the wings all this time without releasing any headset, although the company’s AR ambitions have been clear and well telegraphed on iPhones and iPads for years. Each year, Apple has made significant strides with its AR tools on iOS. What's still debated is how soon this hardware will emerge (this year, next year or even further down the road) and whether Apple will proceed with just glasses, or with a mixed-reality VR/AR headset, too.
I’ve worn more AR and VR headsets than I can even recall, and been tracking the whole landscape for years. In a lot of ways, a future Apple AR headset’s logical flight path should be clear from just studying the pieces already laid out. Apple acquired VR media-streaming company NextVR in 2020, and previously purchased AR headset lens-maker Akonia Holographics in 2018.
I’ve had my own thoughts on what the long-rumored headset might be, and so far, the reports align well with those expectations. Much like the Apple Watch, which emerged among many other smartwatches and offered a lot of features I’d seen in other forms before, Apple’s glasses probably won't be a massive surprise if you’ve been following the beats of the AR/VR landscape lately.
Remember Google Glass? How about Snapchat’s Spectacles? Or the HoloLens or Magic Leap? Meta is working on AR glasses too, and Snap… and also Niantic. The landscape could get crowded fast.
Here’s where Apple is likely to go based on what’s been reported, and how the company could avoid the pitfalls of those earlier platforms.
Apple declined to comment on this story.
An Apple VR headset could be a lot like Meta’s Quest… but higher-end
There’s already one well-polished success story in VR, and the Quest 2 looks to be as good a model as any for where future headsets could aim. Gurman’s report makes a potential Apple VR headset sound a lot like Facebook’s stand-alone device, with controller-free hand tracking and spatial room awareness that could be achieved with Apple’s lidar sensor technology, introduced on the iPad Pro and iPhone 12 Pro.
Apple’s headset could end up serving a more limited pro or creator crowd, or go for a mainstream focus on gaming… or fitness. My experiences with the Oculus Quest’s fitness tools feel like a natural direction for Apple to head in, now that the Apple Watch is extending to subscription fitness training, pairing with TVs and other devices.
The Oculus Quest 2 (now officially the Meta Quest 2) can see through to the real world and extend some level of overlap of virtual objects like room boundaries, but Apple’s headset could explore passthrough augmented reality to a greater degree. I’ve seen impressive examples of this in headsets from companies like Varjo. It could be a stepping stone for Apple to develop 3D augmented reality tech on smaller glasses designs down the road.
Right now, no smart glasses manufacturer can make normal-looking glasses that achieve advanced, spatially aware 3D overlays of holographic objects. Some devices like the nReal Light have tried, with mixed success. Meta’s first smart glasses, Ray-Ban Stories, weren’t AR at all; Meta is working on ways to achieve that tech later on. Apple might take a similar approach with glasses, too.
The VR headset may be a ‘Pro’ device
Most existing reports suggest Apple’s VR headset would likely be so expensive, and powerful, that it would aim for a limited crowd rather than the mainstream. If so, it could target the same business/creative zone that more advanced VR headsets like the Varjo XR-3 have already been aiming for.
I tried Varjo’s hardware, and my experience with it could be a roadmap for what Apple’s headset is aiming for. It has a much higher-resolution display (which Apple is apparently going to try to match), can blend AR and VR into mixed reality using its passthrough cameras, and is designed for pro-level creative tools. Its lidar sensors could also hint at how Apple might integrate similar sensors.
But Varjo’s headset, like most pro VR headsets, is tethered to a PC by a number of cables. Apple’s headset could work as a stand-alone device and also connect to a Mac, much like the Quest 2 already works on its own or pairs with a gaming PC. Apple’s advantage could be making a pro headset that is far more lightweight and seamlessly stand-alone than any current PC-ready gear. What remains unknown is how many apps and tools Apple will be able to introduce to make its headset feel like a true pro tool for creators.
Controls: Hand tracking, or a small worn device?
The Information’s previous reports on Apple’s headset suggest a more pared-down control system than the elaborate and large game controller-like peripherals used by many VR headsets right now. Apple’s headset should work using hand tracking, much like many VR and AR headsets already enable. But Apple would likely need some sort of controller-type accessory for inputs, too. Cracking the control/input challenge seems, in many ways, to be one of the bigger hurdles.
Could that controller be an Apple Watch? Possibly, but the Apple Watch’s motion control capabilities and touchscreen may not be enough for the deeper interactions an Apple headset would need. Maybe iPhones could pair and be used as controllers, too. That’s how Qualcomm is envisioning its next wave of phone-connected glasses.
Future AR smart glasses may also be in the works
Getting people to put on an AR headset is hard. I’ve found it a struggle to remember to pack smart glasses, and find room to carry them. Most of them don’t support my prescription, either. Developer-focused AR glasses made by Snap that I tried recently show where everyday AR glasses could go, but they’re still a work in progress.
Analyst Ming-Chi Kuo’s prediction of AR glasses being a few years after a VR/AR goggle-type headset would line up with what other companies are promising. The challenges with AR glasses are a lot greater than VR. No one’s figured out how wearing them all the time would work, or how you’d interact with virtual objects: hand tracking? A watch, or a ring? Voice? Neural inputs?
Apple always touted the Apple Watch, first and foremost, as a “great watch.” I expect the same from its glasses. If Apple makes prescription glasses and makes them available, Warby Parker-style, in seasonal frames from its Apple Stores, that might be enough for people if the frames look good. Apple’s VR headset, according to Gurman, will also offer prescription lenses. That could be a stepping stone to developing glasses later on.
Google acquired smart glasses maker North in 2020, which made nearly normal-looking prescription eyewear. North’s concept for glasses might be too similar to Google Glass for Apple’s tastes, but the idea of AR glasses doubling as functional glasses sounds extremely Apple-like. More recently, Vuzix’s planned smart glasses for 2021 show how far the tech has shrunk down, but even those planned glasses won’t be able to spatially scan the world and overlay augmented reality: They’ll be more like advanced glasses with heads-up displays and 3D audio.
A report from The Information from last year said new AR lenses were entering a trial production phase for Apple’s AR hardware (9to5Mac also broke the report down). These lenses sound much closer to normal glasses than current AR headsets allow, but when would those be ready?
Could Apple make its first smart glasses something more basic, letting Apple slowly add more AR features over time and let newcomers settle into the experience? Or would Apple try to crack the AR challenge with its first pair of glasses? Augmented reality is a weird concept for eyewear, and potentially off-putting. Maybe Apple will aim for subtlety. The original Apple Watch was designed to be glanced at for just five seconds at a time.
A recent patent filing also showed Apple looking into adaptive lenses that could correct vision conditions. If that pans out, it could be the killer app of Apple’s intelligent eyewear.
Are the AirPods Max a sign of how expensive a headset could be?
Gurman’s report on Apple’s VR headset suggests a high price for the hardware. That makes sense, coming from Apple. In 2020, a report from Apple leaker Jon Prosser said a product named Apple Glass would start at $499, plus add-ons such as prescription lenses. That price sounds hard to believe for a new Apple product. Sure, the original iPad started at $500, and the Apple Watch was around the same. But AR and VR hardware, with the exception of the Oculus Quest, is more complicated. The business-focused HoloLens and Magic Leap cost thousands of dollars. Current VR headsets have trended toward $500 or more.
The latest price reports from The Information have the cost pushing $3,000, which is in the territory of business-focused AR headsets like the HoloLens 2, or business creative VR headsets like those from Varjo. That lines the product up as a Mac Pro type device, well out of most people’s price range.
Apple’s headphones, the AirPods Max, were already a sign that pricing could climb high. At $549, they cost more than a PlayStation 5, and they’re just headphones. A pair of smart glasses, or an advanced VR headset, would be far more complex.
Qualcomm’s AR and VR plans have been telegraphing the next wave of headsets: Many of them will be driven by phones. Phone-powered glasses can be lower-weight and just have key onboard cameras and sensors to measure movement and capture information, while the phone does the heavy lifting and doesn’t drain headset battery life.
Apple’s star device is the iPhone, and it’s already loaded with advanced chipsets that can do tons of AR and computer vision computation. It could already handle powering an AR headset now; imagine what could happen in another year or two.
Apple could also put its own high-end M1 chip in its first wave of VR/AR headsets, as reports suggest, but those headsets will undoubtedly dovetail with more advanced processors in Apple’s phones, tablets and Macs, too.
How Apple could blend the real world with AR and VR
Apple’s iOS 14 introduced QR code and NFC-enabled App Clips that can launch experiences from real-world locations with a tap or scan. These micro apps are made to work with AR, too: With glasses or an AR headset, they could eventually launch interactions at a glance.
Maybe QR codes can help accelerate AR working in the “dumb” world. Apple’s latest iPhones also have a mysterious U1 chip that can be used to improve accuracy in AR object placement, and also to more quickly locate other Apple devices that have the U1 chip, too.
Apple’s AirTags arrived last year with ultrawideband features similar to Samsung’s SmartTags Plus. These tags can be seen in AR via an iPhone app, an idea that could extend into Apple’s future VR/AR headsets. If all of Apple’s objects recognize one another, they could act as beacons in a home, and the U1 chips could also serve as indoor navigation tools for added precision.
Microsoft’s collaborative mixed-reality platform, Mesh, shows how meetings with people in virtual spaces could happen instantly and in work-like environments. Apple already enables multiperson AR in real places, but a necessary next step would be to allow a platform for collaboration in AR and VR like Microsoft is developing.
Apple’s depth-sensing hardware is already here
Apple is already deeply invested in camera arrays that can sense the world from short and long distances. The front-facing TrueDepth camera on every Face ID iPhone since the X is like a shrunken-down Microsoft Kinect and can scan a few feet out, sensing 3D information with high enough accuracy to be used for a secure face scan. Apple’s lidar technology on its recent iPhones and iPads can scan out much further, several meters away. That’s the range that glasses would need.
Apple’s existing lidar technology, combined with cameras, is already good enough to scan environments and 3D objects. iPadOS 15 uses the lidar scanner for even more advanced depth features and room meshing, which looks like the missing link for a new wave of more realistic AR graphics from Apple.
Add to this the wider-scale lidar scanning Apple is doing in Maps, which overlays real-world locations with virtual objects via a technology called Location Anchors, and suddenly the depth scanning Apple is introducing seems to have worldwide ambitions.
Apple’s new Mac chips already point toward VR/AR compatibility
The latest M1-based Macs are now a lot more like Apple’s iPhones and iPads, which means they’re far better suited to the processing power AR and VR demand. A common groundwork across devices could let a headset feasibly run off an iPhone, iPad or Mac, making it a universal Apple device accessory.
That would be essential if Apple intends its VR or AR headsets to play any role in creative workflows, or be used for games and apps. It’s one of the limitations of existing VR headsets, which need to run off particular Windows gaming PCs and still don’t play well with iOS or Android phones.
Look to AirPods for ease of use — and audio augmented reality
I’ve long thought that AirPods, with their instant-on comfort and weird design, were an early experiment in how wearing Apple’s hardware directly on our faces could be accepted and become normal. AirPods are expensive compared with in-box wired buds, but also utilitarian and relaxed. Apple’s possible headsets would need to feel the same way.
The AirPods Pro’s spatial audio, which the AirPods Max and AirPods 3 also have, points to where future ideas could head. Immersive audio is casual; we do it all the time. Immersive video is harder and not always needed. I could see AR working with an audio-first approach, like a ping. Apple glasses could eventually do the world-scanning spatial awareness that spatial audio would need. In the meantime, Apple’s already developing the spatial audio tech its VR headset would require.
Apple Watch and AirPods could be great companions
Apple already has a collection of wearable devices that connect with the iPhone, and both the AirPods and the Apple Watch make sense with glasses. AirPods could pair for audio (although maybe the glasses will have their own Bose Frames-like audio, too), while the watch could be a helpful remote control. The Apple Watch already acts as a remote at times, for the Apple TV or the iPhone camera. Apple’s future headsets could also virtually expand the watch’s display, offering enhanced extras that show up discreetly, like a halo, or use the watch as some sort of controller.
The Apple Watch could also provide something that’s hard to get from hand gestures or touch-sensitive frames on a pair of glasses: haptics. The watch’s rumbling feedback could lend some tactile response to virtual objects.
Could Qualcomm and Apple’s reconciliation also be about XR?
Qualcomm and Apple are working together again on future iPhones, and I don’t think it’s just about modems. 5G is a key feature for phones, no doubt. But it’s also a killer element for next-gen AR and VR. Qualcomm has already been exploring how remote rendering could allow 5G-enabled phones and connected glasses to link up to streaming content and cloud-connected location data. Glasses could eventually stand on their own and use 5G for advanced computing, much like the Apple Watch eventually gained stand-alone cellular.
Qualcomm’s chipsets are in almost every self-contained AR and VR headset I can think of (Oculus Quest, HoloLens 2, a wave of new smart glasses, the latest version of Google Glass, Vive Focus). Will Apple’s tech dovetail at all with Qualcomm’s cross-device platforms?
Launch date: 2022, 2023… or later?
New Apple products tend to be announced months before they arrive, sometimes even longer. The iPhone, Apple Watch, HomePod and iPad all followed this path.
A report from The Information from 2019, based on purported leaked Apple presentational material, suggested 2022 for an Oculus Quest-like AR/VR headset, and 2023 for glasses. Maybe Apple takes a staggered strategy with AR, and releases several devices: one for creators first, with a higher price; and one for everyday wearers later.
Either way, developers would need a long head start to get used to developing for Apple’s glasses, and making apps work and flow with whatever Apple’s design guidance will be. That’s going to require Apple giving a heads-up on its hardware well in advance of its actual arrival. Maybe at next year’s WWDC.
We don’t know anything more yet, but the future seems farther off than we expected.