Meta Reality Labs’ Chief Scientist Michael Abrash laid out the company’s analysis of how contemporary compute architectures will need to evolve to make the AR glasses of our sci-fi visions possible.
While there are some AR ‘glasses’ on the market today, none of them are truly the size of a normal pair of glasses (even a bulky pair). The best AR headsets available—the likes of HoloLens 2 and Magic Leap 2—are still closer to goggles than glasses and are too heavy to be worn all day (not to mention the looks you’d get from the crowd).
If we’re going to build AR glasses that are truly glasses-sized, with all-day battery life and the features needed for compelling AR experiences, it’s going to require a “range of radical improvements—and in some cases paradigm shifts—in both hardware […] and software,” says Michael Abrash, Chief Scientist at Reality Labs, Meta’s XR organization.
That is to say: Meta doesn’t believe that its current technology—or anyone’s for that matter—is capable of delivering those sci-fi glasses that every AR concept video envisions.
But, the company thinks it knows where things need to head in order for that to happen.
Abrash, speaking at the IEDM 2021 conference late last year, laid out the case for a new compute architecture that could meet the needs of truly glasses-sized AR devices.
Follow the Power
The core reason to rethink how computing should be handled on these devices comes from a need to drastically reduce power consumption to meet battery life and heat requirements.
“How can we improve the power efficiency [of mobile computing devices] radically by a factor of 100 or even 1,000?” he asks. “That will require a deep system-level rethinking of the full stack, with end-to-end co-design of hardware and software. And the place to start that rethinking is by looking at where power is going today.”
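To get a feel for why the target is so aggressive, here’s a rough back-of-envelope calculation. Every number in it is an illustrative assumption for the sake of the arithmetic, not a figure from Abrash’s talk:

```python
# Back-of-envelope power budget for all-day AR glasses.
# All constants below are illustrative assumptions, not Meta's figures.

BATTERY_WH = 1.5          # assumed battery capacity that fits a glasses form factor
HOURS_PER_DAY = 16        # "all day" of wear time
CURRENT_HEADSET_W = 8.0   # assumed average draw of a goggles-class AR headset

budget_w = BATTERY_WH / HOURS_PER_DAY
required_improvement = CURRENT_HEADSET_W / budget_w

print(f"Average power budget: {budget_w * 1000:.0f} mW")          # ~94 mW
print(f"Required efficiency gain: ~{required_improvement:.0f}x")  # ~85x
```

Even with generous assumptions, the whole device gets a budget on the order of tens of milliwatts—which is why Abrash frames the problem in factors of 100 to 1,000 rather than incremental gains.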
To that end, Abrash laid out a graph comparing the power consumption of low-level computing operations. The takeaway: moving data around costs vastly more energy than computing on it, so cutting power means cutting data movement.
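For a sense of scale, the snippet below uses representative per-operation energy figures (45 nm process) widely cited from Mark Horowitz’s ISSCC 2014 keynote; Abrash’s exact chart may use different numbers, but the shape is the same:

```python
# Representative per-operation energy costs (45 nm), as widely cited from
# Mark Horowitz's ISSCC 2014 keynote. The shape of the data is the point:
# memory access dwarfs arithmetic.

ENERGY_PJ = {
    "32-bit int add":        0.1,
    "32-bit float multiply": 3.7,
    "32-bit SRAM read":      5.0,
    "32-bit DRAM read":      640.0,
}

baseline = ENERGY_PJ["32-bit int add"]
for op, pj in ENERGY_PJ.items():
    print(f"{op:24s} {pj:7.1f} pJ  ({pj / baseline:,.0f}x an int add)")
# A DRAM read costs ~6,400x as much energy as the arithmetic it feeds.
```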
Compute Where You Least Expect It
A starting point for a distributed computing architecture, Abrash says, could be the many cameras that AR glasses need for sensing the world around the user. The idea is to do some preliminary computation on the camera sensor itself, then send only the most vital data across power-hungry data transfer lanes—as in the sketch below.
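Here’s a minimal sketch of that tradeoff, assuming a hypothetical VGA sensor and feature extractor; all constants are made up for illustration, and the per-byte costs are placeholders rather than measurements of any real link:

```python
# Sketch of the on-sensor compute tradeoff: compare the energy of shipping
# a full camera frame off-sensor vs. extracting features on-sensor and
# shipping only those. All constants are illustrative assumptions.

FRAME_BYTES = 640 * 480          # one 8-bit VGA frame
FEATURE_BYTES = 200 * 8          # assumed: 200 keypoint descriptors, 8 bytes each
TRANSFER_PJ_PER_BYTE = 100.0     # assumed cost of the off-sensor link
ON_SENSOR_PJ_PER_BYTE = 2.0      # assumed cost to process a pixel in place

naive_pj = FRAME_BYTES * TRANSFER_PJ_PER_BYTE
distributed_pj = (FRAME_BYTES * ON_SENSOR_PJ_PER_BYTE
                  + FEATURE_BYTES * TRANSFER_PJ_PER_BYTE)

print(f"Ship full frame:    {naive_pj / 1e6:6.1f} µJ")        # ~30.7 µJ
print(f"Process, then ship: {distributed_pj / 1e6:6.1f} µJ")  # ~0.8 µJ
print(f"Savings:            {naive_pj / distributed_pj:.0f}x")
```

The exact numbers don’t matter; what matters is that because the link is so much more expensive per byte than on-sensor computation, spending a little compute at the sensor to shrink the data pays for itself many times over.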