One striking realization about spatial computing is that we're almost seven years into the sector's current stage. It traces back to Facebook's Oculus acquisition in early 2014, which kicked off the current wave of excitement, including lots of ups and downs in the intervening years.

That excitement culminated in 2016, after the Oculus acquisition had time to set off a chain reaction of startup activity, tech-giant investment, and VC inflows for the "next computing platform." But when technical and practical realities caught up with spatial computing, the sector began to retrench.

Like past tech revolutions, most memorably the dot-com boom and bust, spatial computing has followed a common pattern. Irrational exuberance is followed by retraction, market correction, and scorched earth. But then a reborn industry sprouts from those ashes and grows at a realistic pace.

That's where we now sit in spatial computing's lifecycle. It's not the revolutionary platform shift touted circa 2016, nor the silver bullet for everything we do in life and work that was once hyped. But it will be transformative in narrower ways, within a targeted set of use cases and verticals.

This is the topic of ARtillery's recent report, Spatial Computing: 2020 Lessons, 2021 Outlook. Key questions include: What did we learn in the past year? What are projections for the coming year? And where does spatial computing, along with its many subsegments, sit in its lifecycle?

Lite AR

Picking up where we left off in the last installment of this series, Apple could inflect the AR glasses market when it enters. The question is what those glasses will be and do. We don't know for sure, but clues point to a likelihood that Apple will eschew common AR connotations.

In other words, Apple won’t launch AR glasses — at least in V1 — that employ “heavy AR.” This is world-immersive AR that has spatial and semantic understanding of its surroundings. It’s all about graphics that populate your field of vision in dimensionally accurate ways.

Achieving those functions requires design tradeoffs, such as bulk and heat, that would deviate from Apple's style and design sensibilities. So on the sliding scale between sleek glasses that power "lite AR" and bulky hardware that powers "heavy AR," Apple will choose the former.

The first clue for this theory is the state of the underlying technology: it's not yet at the point where sleekness and graphical intensity are possible in the same device. The second clue comes from Apple's scale and its resulting fiduciary drive to pursue massive markets.

TAM Right

With that backdrop, "lite" AR glasses have a larger total addressable market (TAM) than bulky, sensor-heavy ones. The latter appeal to a subset of technophiles. Apple's mass-market requirements lead it to something more along the lines of corrective eyewear or sunglasses.

Indeed, eyeglasses and sunglasses are much larger markets than AR glasses. So Apple could enter the $200 billion corrective eyewear market. AR features could include line-of-sight notifications that integrate other Apple apps, or biometrics from your Apple Watch.

Moreover, Apple will broaden the concept of "augmentation" beyond current connotations. So instead of cartoon monsters, digital "layers" will be things that generally help people see better, whether in a corrective sense or with digital filters that brighten your day in various ways.

Other clues indicate practical mass-market functions, such as integration with Apple's "Project Gobi." This involves retail point-of-sale codes that unlock product promotions or Apple Pay, which has mass-market applicability and aligns with post-Covid "touchless" retail.

First Steps

Apple Glass could also have spatial audio integration with AirPods Pro. This could involve an audible “notification layer” that joins its visual counterpart. Use cases could include identifying people or real-time foreign language translation. These could be true killer apps.

But some of those use cases could be further in the future. In that way, Apple's AR glasses should be viewed as a long evolutionary road, much like the iPhone's. That would make version 1 of the device a starting point for Apple's looming era of sensory augmentation.

Put another way, "lite AR" is Apple's first step into AR glasses. Like the original iPhone's long evolutionary path to the pocket supercomputer we know today, "Apple Glass" will grow from simple augmentation to, eventually, the AR that's today the stuff of science fiction.

Will ‘Apple Glass’ Redefine AR? was originally published in AR/VR Journey: Augmented & Virtual Reality Magazine on Medium, where people are continuing the conversation by highlighting and responding to this story.