
Zero to One: How Our Custom Silicon and Chips Are Revolutionizing AR

In 2017, Reality Labs Chief Scientist Michael Abrash, backed by Meta Founder & CEO Mark Zuckerberg, established a new, secret team within what was then Oculus Research to build the foundation of the next computing platform. Their herculean task: Create a custom silicon solution that could support the unique demands of future augmented reality glasses—a technical feat that required reimagining every component for an all-day wearable AR glasses form factor that simply didn’t exist yet.

Front angle of Orion AR glasses prototype.

Compact Form Factor, Considerable Problem Space

Growing from just a few researchers to hundreds of people on the product side, the custom silicon team was built on the premise that AR glasses couldn’t rely on the silicon available in today’s smartphones. And it’s a premise borne out by the several custom chips inside Orion, our first true AR glasses product prototype.

“Building the ship as it sails out of the harbor—that was exactly what we were doing,” says Advanced Technologies Strategy Director Jeremy Snodgrass. “We had to scale up what was a modest team while building these chips simultaneously. It was fascinating to see the leadership bring new hires on board while developing a culture that values agility. Nothing was someone else’s problem. You’d pick up an oar and start rowing, even if it wasn’t exactly what you were hired to do. It was a very, very exciting time.”

“Almost everything about Orion was new to the world in a lot of ways,” agrees Display Architecture Technical Director Mike Yee. “Some of the ideas had been out there, but no one had taken on the research project to actually build an all-day wearable pair of AR glasses.”

The team had to deliver a compelling AR experience with as little power consumption as possible. The glasses form factor can only dissipate so much heat, and it can only accommodate so much battery capacity. As a result, the experiences it’s possible to deliver on the glasses are completely dependent upon the silicon. In other words, if you hold constant the thermal and battery capacities, the only way to deliver a given experience is to optimize the silicon.
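
The constraint described above can be made concrete with a back-of-envelope calculation. The sketch below is purely illustrative: the battery capacity, runtime target, and thermal ceiling are assumed numbers, not Orion's actual specifications.

```python
# Back-of-envelope power budget for a glasses form factor.
# All figures are illustrative assumptions, not Orion's actual specs.

BATTERY_WH = 1.5          # assumed battery capacity in watt-hours
TARGET_RUNTIME_H = 3.0    # assumed session length target in hours
THERMAL_LIMIT_W = 1.0     # assumed sustained heat the frame can dissipate

# Average power the battery can sustain over the target runtime:
battery_budget_w = BATTERY_WH / TARGET_RUNTIME_H

# The real ceiling is the tighter of the two constraints:
power_budget_w = min(battery_budget_w, THERMAL_LIMIT_W)

print(f"battery-limited budget: {battery_budget_w:.2f} W")
print(f"usable power budget:    {power_budget_w:.2f} W")
```

With both limits fixed by the form factor, the only free variable left is how much experience the silicon can deliver per watt, which is the point the paragraph above makes.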

“Designing a new architecture for Orion required the team to not only push existing technologies like wireless and displays aggressively but also to take risks on new technologies,” says Director of Product Management Neeraj Choubey. “For instance, the team developed a machine learning (ML) accelerator without a clear use case at the time, driven by the strong conviction that ML would become increasingly important in Meta’s products. On Orion, every ML accelerator is utilized, and in some cases, they’re oversubscribed, serving functions such as eye tracking and hand tracking. Similarly, the team developed custom compression protocols to reduce bandwidth and power consumption as data moved from the compute puck to the display. Developing custom silicon to achieve Orion’s form factor goals required both a tolerance for high ambiguity and meticulous attention to detail to deliver an incredibly complex system architecture.”

“The biggest challenge we had was delivering the 3D world-locked rendered graphics, along with spatialized audio that’s rendered such that it appears to be emanating from a virtual object,” notes Snodgrass. “We had to fit all these electronics into the thermal capacity and physical space, and then run it on the battery so that it doesn’t get too hot. And we had to do all that in an actual glasses form factor—not a big visor like you typically see in the category.”

“We knew how to deliver the necessary compute power to bring our vision for Orion to life, but we faced a daunting task: reducing power consumption by a factor of 100,” says Director of SoC Solutions Robert Shearer. “This required us to push the boundaries of silicon design, embracing methodologies from diverse corners of the industry—from IoT to high-performance computing—and inventing new approaches to bridge the gaps. Our industry partners thought we were crazy, and perhaps they weren’t entirely wrong. But that’s exactly what it took: a willingness to challenge conventional wisdom and rethink everything. Building a computer that seamlessly merges the virtual and physical worlds demands a profound understanding of context, far surpassing what existing compute platforms can offer. In essence, we’ve reinvented the way computers interact with humans, which meant reimagining how we build silicon from the ground up.”

Orion exterior components.

The Magic of MicroLEDs

There were moments, when things fell behind schedule or a seemingly insurmountable technical challenge arose, when it was hard to maintain momentum and morale. But the team was resilient, finding paths around obstacles—or simply knocking them down.

Take, for example, Orion’s display. The silicon team was responsible for the silicon in the display projector that sits in the glasses’ corners.

“For those projectors, there was an open question as to whether we could get the microLEDs in an array at high enough efficiency and brightness to deploy a wide field of view display,” Snodgrass says. “There was tremendous doubt that we could—that it was possible in the time frame we were considering—because this was very nascent technology.”

“We realized very early on that we had to rethink a lot of the paradigms of product development,” adds Yee. “The amount of light that you need to make a usable display is quite a bit brighter for AR glasses because, as a wearable display, you’re competing with the sun. So we need energy levels that rival that—or at least that’s the goal. We’re not quite there yet, but that’s a big piece of it. And that means you need light sources for a display that are capable of that and you need circuitry that can control that. And at the same time, you need to make it tiny.”

Custom silicon driving Orion’s µLEDs.

While microLEDs seemed the most suitable light source for the projectors, silicon helped unlock their potential.

“With displays, we talk about pixel pitches, which are the distances between the center points of adjacent pixels,” Yee explains. “For TVs, those distances are hundreds of microns. In your phone, they’re many tens of microns. And we needed to get that down into the single digits. The only known semiconductor manufacturing that could get there was silicon.”
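
The pitch numbers quoted above translate directly into die size, which is why single-digit pitches matter for something that must fit in a glasses temple. The resolution and pitch values below are hypothetical, chosen only to show the scaling.

```python
# How pixel pitch drives display size: active width ≈ pixels × pitch.
# The resolution and pitch values are illustrative, not Orion's specs.

def die_width_mm(pixels_across: int, pitch_um: float) -> float:
    """Active-area width in millimeters for a given pixel pitch."""
    return pixels_across * pitch_um / 1000.0

# A hypothetical 1280-pixel-wide display at different pitches:
for pitch_um in (100.0, 30.0, 5.0):   # TV-like, phone-like, microLED-like
    print(f"{pitch_um:>5.0f} um pitch -> {die_width_mm(1280, pitch_um):6.1f} mm wide")
```

At a phone-like pitch the panel would be several centimeters across; only at a single-digit-micron pitch does the same pixel count shrink to a projector-sized die, and silicon backplanes are the manufacturing process that reaches that density.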

Complicating the work was the fact that the back surface of the display had to be a piece of silicon, and no one in the world had been designing silicon for microLEDs.

“At the time, the research teams that were out there had all repurposed liquid crystal on silicon displays to put microLEDs on,” says Yee. “Nobody had ever designed a backplane for microLEDs before. And we faced a pretty unique challenge because it’s an optical component. It has to be flat. You can’t scratch it. It has to have all these characteristics because when you look through the waveguides, through the projectors, you’re literally looking at the top surface of the piece of silicon.”

The silicon team developed a complex series of test platforms for these microLED displays that involved coordinating closely with our global suppliers. The microLEDs had a global footprint: they originated in one location, then were transferred to another site to be put onto a wafer. The wafers were then shipped off to be cut into a particular shape, followed by a trip to the US to be bonded together with another wafer, then shipped back across the globe for the actual module to be built and tested. It was an enormously complicated process, and the silicon team developed test vehicles to prove out each step.

The team also needed to find a way to deliver power to microLED displays in the tiny volume of the glasses’ corners. Our analog team developed a custom power management chip that fit inside that volume.

“Power delivery is crucial for such small form factor wearable devices, where battery size is limited and space is at a premium,” notes Director of Analog and Mixed Signal Systems Jihong Ren. “Our customized Power Management IC solution leverages cutting-edge technologies to optimize power efficiency for our specific workload at the system level, all while fitting within the available space constraints. Achieving this optimal solution required close interdisciplinary collaboration with our mechanical, electrical, SoC, μLED, and thermal teams, ensuring a seamless integration of all components and maximizing overall performance.”

“That was an amazing feat of not just engineering but also organizational management: bringing a team together, working across time zones and with all these different vendors,” adds Snodgrass. “In some ways, managing all that organizationally was just as much of a challenge as meeting the technical specs.”

“Not only is the design custom, the entire fabrication process is customized,” adds Yee. “We’re fortunate to have some wonderful partners in the industry that helped make that happen. They see long-term potential in AR displays as a whole and certainly Meta’s vision for it. So they’ve been willing to partner with us to make those customizations and optimizations happen to enable that display.”

Orion AR glasses.

Iteration Meets Acceleration

There was a tight feedback loop between the silicon team and the brilliant minds in Reality Labs Research and XR Tech developing algorithms. The latter teams would provide these algorithms, which the former would translate into hardware, removing the overhead of running software on a general-purpose CPU. That meant the algorithms would run at lower power, but it also meant the silicon team was locked in: once the algorithms were hardened, they could no longer make changes.

“Say XR Tech was developing a certain discipline algorithmically,” explains Silicon Accelerators Architecture and Algorithms Director Ohad Meitav. “They own the algorithmic stack and its performance. My team, in collaboration with them, would then decide how to accelerate the algorithm, how to harden parts of the algorithm, and how to actually put it into the hardware in a way that runs super efficiently. Then XR Tech would adapt their software stack to account for the hardware. It’s a very iterative process.”
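
The co-design loop Meitav describes can be sketched in miniature. In the toy model below, the algorithm team owns a software reference, and "hardening" snapshots its behavior behind a fixed interface that can no longer change, so later iteration happens in the software around it. The names and the averaging filter are invented for illustration; they are not Orion's actual pipeline.

```python
# Toy sketch of the algorithm-hardening loop: a software reference gets
# frozen into a fixed-function block, and the software adapts around it.
# The filter and class names are illustrative, not Orion's real design.

from typing import Callable, List

def reference_filter(samples: List[float]) -> List[float]:
    """Software reference: easy to change while the algorithm evolves."""
    return [(a + b) / 2 for a, b in zip(samples, samples[1:])]

class HardenedBlock:
    """Stand-in for a fixed-function accelerator: once 'taped out',
    its behavior is frozen and the software stack adapts around it."""
    def __init__(self, reference: Callable[[List[float]], List[float]]):
        self._frozen = reference   # snapshot taken at hardening time

    def run(self, samples: List[float]) -> List[float]:
        return self._frozen(samples)

accelerator = HardenedBlock(reference_filter)   # the "tape-out" moment
# From here on, changes go into the software around the block,
# never into the block itself.
print(accelerator.run([0.0, 2.0, 4.0]))
```

The trade captured here is the one in the text: hardware acceleration buys power efficiency at the cost of flexibility, which is why the iteration between the two teams had to converge before hardening.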

Another success story is the silicon team’s collaboration with Reality Labs Research to come up with a novel reprojection algorithm.

“We needed the reprojection algorithm to support a variety of different distortions and corrections,” notes Silicon Architect Steve Clohset. “The algorithm RL-R developed that we ended up using is not used in general computing. And to this day, it’s proving to be a pretty powerful tool.”
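
For readers unfamiliar with reprojection, the generic idea is to warp an already-rendered frame to a slightly newer head pose so virtual objects stay world-locked without a full re-render. The sketch below shows the textbook pure-rotation case; it is not the RL-R algorithm the quote referss to, and the camera intrinsics are made-up values.

```python
# Generic rotational reprojection sketch: p' ~ K * R * K^-1 * p,
# i.e. unproject a pixel to a ray, rotate it by the pose delta,
# and project it back. Intrinsics below are assumed, not Orion's.

FX = FY = 500.0           # assumed focal lengths in pixels
CX, CY = 320.0, 240.0     # assumed principal point

def reproject(px: float, py: float, r) -> tuple:
    """r is a 3x3 rotation (list of rows) taking the rendered pose
    to the latest head pose."""
    # Unproject the pixel to a camera-space ray (K^-1 * p):
    x, y, z = (px - CX) / FX, (py - CY) / FY, 1.0
    # Rotate the ray into the new pose:
    rx = r[0][0]*x + r[0][1]*y + r[0][2]*z
    ry = r[1][0]*x + r[1][1]*y + r[1][2]*z
    rz = r[2][0]*x + r[2][1]*y + r[2][2]*z
    # Project back to pixels (apply K, then perspective divide):
    return (FX * rx / rz + CX, FY * ry / rz + CY)

IDENTITY = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
print(reproject(320.0, 240.0, IDENTITY))   # no pose change: pixel stays put
```

A production algorithm like the one RL-R built also has to fold in the distortion and correction terms Clohset mentions, which is what makes it more than this textbook warp.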

Once the algorithms were hardened and the hardware was optimized, the silicon bring-up team put the custom chips through their paces.

“Orion’s custom silicon chipset is packed with complexity,” says Senior Director of End-to-End System and Infrastructure Liping Guo. “Bringing-up and validating standalone chips and the interoperabilities across them within a short period of time is incredibly challenging. Luckily, we’re operating within a vertically integrated environment where Reality Labs owns the entire stack—from the silicon and low-level firmware to the operating system, software, and the upper-layer experiences. We took full advantage of this, working closely with our cross-functional partners, and shift-left our cross-stack integration at the silicon validation stage. Orion was our pilot run for this methodology—we built our shift-left muscles and created a strong foundation for Reality Labs to reap the full benefits of custom silicon in the future.”

And after bring-up, it was time to optimize for software.

“There’s an iterative process where you start with a wholly unoptimized software stack, just to get everything booted up and running,” says Snodgrass. “And then, one by one, you go through the subsystems and start to optimize the software for that specific hardware—including reducing the amount of memory that the software uses. The hardware could be beautifully designed, but you won’t achieve the theoretical power efficiency unless you invest as much or more time getting the software to take full advantage of the hardware. So that’s the story of Orion: hardware and software optimized to the hilt. Leave no picojoule or milliwatt behind.”
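
The subsystem-by-subsystem loop Snodgrass describes can be caricatured in a few lines. The subsystem names, milliwatt figures, and the flat 30% saving per pass below are all invented for illustration; the point is only the shape of the process: measure, pick the biggest consumer, optimize, repeat.

```python
# Caricature of the post-bring-up optimization loop: start with an
# unoptimized stack, then visit subsystems in order of power draw.
# All names and milliwatt figures are invented for illustration.

subsystems = {"display": 300.0, "tracking": 180.0, "audio": 60.0}  # mW, assumed

def optimize(power_mw: float) -> float:
    """Stand-in for hand-tuning one subsystem against its hardware;
    pretend each pass trims 30% of its draw."""
    return power_mw * 0.7

# Biggest consumers first, since they dominate the budget:
for name in sorted(subsystems, key=subsystems.get, reverse=True):
    before = subsystems[name]
    subsystems[name] = optimize(before)
    print(f"{name}: {before:.0f} mW -> {subsystems[name]:.0f} mW")

print(f"total: {sum(subsystems.values()):.0f} mW")
```

In practice each pass is weeks of profiling and rewriting rather than one multiply, but the greedy ordering, worst offender first, is a common way to spend that effort where it buys the most.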

And while Orion may be a prototype, the work that went into it has considerable potential to influence Meta’s roadmap.

“We look at the silicon IPs that we’re building as platforms in the sense that these are valued IPs that we’ll improve upon from one generation or one product to another,” adds Meitav. “All of the computer vision and graphics algorithms aren’t just built for Orion. They’ll go on to inform future products.”

Our silicon work involves creating novel solutions while also working closely with partners. In fact, the silicon team’s influence extends beyond Orion to both Ray-Ban Meta glasses and Meta Quest headsets today—even though they both use third-party chips. The silicon team regularly shares their work with the mixed reality team, showing what’s possible in terms of power efficiency. The MR team then shares those findings with partners like Qualcomm to help inform future chip designs. And because the silicon team uses the same off-the-shelf digital signal processors (DSPs) as Ray-Ban Meta glasses, they’ve been able to share lessons learned and best practices when implementing and writing code for those DSPs to help improve the audio experiences available on our AI glasses.

And that knowledge sharing goes both ways: The MR team has given the silicon team insights on things like Asynchronous TimeWarp and Application SpaceWarp in production.

“What a person does in production is far more interesting than something that we could do with a prototype,” says Clohset. “We tried to integrate what they were doing with warpings as much as possible.”

Ambiguity < Ambition

Because Orion was truly zero to one, the teams involved necessarily had to deal with an inordinate amount of ambiguity.

“With Orion, I can’t overstate how complicated the ambiguity made things,” says Clohset. “When you make a computer, for example, you generally have a good idea of what the display will be. But we didn’t know what the waveguide would end up being, so we had to go sample different waveguides and come up with a mechanism that could handle the worst-case scenario because we didn’t know where things would land. One optimization here would convolve with all these other choices, and you’d end up with this matrix of all these different things you had to support and try to validate because you didn’t know where the product would land in six months.”

It’s important to note that Orion isn’t just a pair of AR glasses—it’s a three-part constellation of hardware. A lot of the processing happens on the compute puck, which necessitates a strong connection between it and the glasses. Add the surface EMG wristband into the loop, and the system architecture becomes even more complicated.

“That was enormously challenging for the teams to solve, and all of it just works,” says Snodgrass. “It was an amazing collaboration between the silicon team, the wireless team, and software teams all across the organization.”

Orion’s compute puck.

“With Orion, we built out a whole team with a broad diversity of engineers, and they were able to design a brand-new pipeline,” adds Clohset. “It’s a pipeline that handles six degrees of freedom movement of objects in 3D space. It uses its own custom display driver. We had to do really unique image quality corrections. And I think the challenge for us was that, because this was a zero-to-one project, there was no existing spec to tweak and improve. Everything here is completely new, so we had latitude to do everything.”

Much like the puck has some dormant features hidden beneath the surface, the custom silicon punches above its weight, too. While Orion doesn’t allow the user to take photos with its RGB cameras, the silicon is capable of supporting it, as well as codec avatars. And just as the puck helped unlock a true glasses form factor by offloading much of the compute, Orion’s custom silicon proved a necessary piece of the AR puzzle.

“To bring a zero-to-one experience like AR glasses to life, you need custom silicon—full stop,” Snodgrass explains. “Over time, if there’s a market, then silicon vendors will develop products to meet the demand. But for zero-to-one, you can’t just take something that exists off the shelf that’s intended for a different product and fit it into a new mold. You have to invest in something custom. And to bring those zero-to-one experiences to life, you need broad collaboration between software partners, industrial designers, mechanical engineers, and more.”

“By breaking free from traditional mental models, we’ve created something truly remarkable,” adds Shearer. “We believe that this computing platform represents the future of technology—one that will revolutionize the way we live, work, and interact with each other. We’re excited to be at the forefront of this innovation, pushing the boundaries of what’s possible and helping to shape the course of history.”


