Author: Kristián Ivančo
-
Zero to One: How Our Custom Silicon & Chips Are Revolutionizing AR
In 2017, Reality Labs Chief Scientist Michael Abrash, backed by Meta Founder & CEO Mark Zuckerberg, established a new, secret team within what was then Oculus Research to build the foundation of the next computing platform. Their herculean task: Create a custom silicon solution that could support the unique demands of future augmented reality glasses—a technical feat that required reimagining every component for an all-day wearable AR glasses form factor that simply didn’t exist yet.
Front angle of Orion AR glasses prototype.
Compact Form Factor, Considerable Problem Space
Growing from just a few researchers to hundreds of people on the product side, the custom silicon team was built on the premise that AR glasses couldn’t rely on the silicon available in today’s smartphones. And it’s a premise borne out by the several custom chips inside Orion, our first true AR glasses product prototype.
“Building the ship as it sails out of the harbor—that was exactly what we were doing,” says Advanced Technologies Strategy Director Jeremy Snodgrass. “We had to scale up what was a modest team while building these chips simultaneously. It was fascinating to see the leadership bring new hires on board while developing a culture that values agility. Nothing was someone else’s problem. You’d pick up an oar and start rowing, even if it wasn’t exactly what you were hired to do. It was a very, very exciting time.”
“Almost everything about Orion was new to the world in a lot of ways,” agrees Display Architecture Technical Director Mike Yee. “Some of the ideas had been out there, but no one had taken on the research project to actually build an all-day wearable pair of AR glasses.”
The team had to deliver a compelling AR experience with as little power consumption as possible. The glasses form factor can only dissipate so much heat, and it can only accommodate so much battery capacity. As a result, the experiences it’s possible to deliver on the glasses are completely dependent upon the silicon. In other words, if you hold constant the thermal and battery capacities, the only way to deliver a given experience is to optimize the silicon.
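To make that constraint concrete, here is a rough back-of-the-envelope sketch of the power-budget arithmetic involved; the battery capacity, runtime target, and thermal limit below are illustrative assumptions, not Orion's actual specifications.

```python
# Back-of-the-envelope power budget for a glasses form factor.
# All numbers are illustrative assumptions, not Orion specifications.
battery_wh = 1.0          # assumed usable battery energy in the frames (watt-hours)
target_runtime_h = 2.0    # assumed continuous-use target (hours)
thermal_limit_w = 1.5     # assumed sustained heat the frames can dissipate (watts)

battery_budget_w = battery_wh / target_runtime_h         # 0.5 W average draw
power_budget_w = min(battery_budget_w, thermal_limit_w)  # the binding constraint

print(f"Average system power budget: {power_budget_w * 1000:.0f} mW")
# With thermal and battery capacity fixed, a richer experience can only come
# from doing more work per joule -- in other words, better silicon.
```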
“Designing a new architecture for Orion required the team to not only push existing technologies like wireless and displays aggressively but also to take risks on new technologies,” says Director of Product Management Neeraj Choubey. “For instance, the team developed a machine learning (ML) accelerator without a clear use case at the time, driven by the strong conviction that ML would become increasingly important in Meta’s products. On Orion, every ML accelerator is utilized, and in some cases, they’re oversubscribed, serving functions such as eye tracking and hand tracking. Similarly, the team developed custom compression protocols to reduce bandwidth and power consumption as data moved from the compute puck to the display. Developing custom silicon to achieve Orion’s form factor goals required both a tolerance for high ambiguity and meticulous attention to detail to deliver an incredibly complex system architecture.”
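The bandwidth argument behind those custom compression protocols can be sketched with simple arithmetic; the resolution, frame rate, and compression ratio below are assumptions chosen for illustration, not Orion's actual link parameters.

```python
# Rough link-bandwidth estimate for streaming rendered frames from the compute
# puck to the glasses. All numbers are illustrative assumptions, not Orion specs.
width, height = 1280, 1024   # assumed per-eye eye-buffer resolution
bits_per_pixel = 24          # 8-bit RGB
fps = 90                     # assumed refresh rate
eyes = 2

raw_bps = width * height * bits_per_pixel * fps * eyes
compression_ratio = 10       # assumed ratio for a lightweight custom codec

print(f"Uncompressed: {raw_bps / 1e9:.2f} Gbit/s")      # ~5.7 Gbit/s
print(f"Compressed:   {raw_bps / compression_ratio / 1e6:.0f} Mbit/s")
# Cutting link bandwidth by an order of magnitude directly cuts radio power.
```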
“The biggest challenge we had was delivering the 3D world-locked rendered graphics, along with spatialized audio that’s rendered such that it appears to be emanating from a virtual object,” notes Snodgrass. “We had to fit all these electronics into the thermal capacity and physical space, and then run it on the battery so that it doesn’t get too hot. And we had to do all that in an actual glasses form factor—not a big visor like you typically see in the category.”
“We knew how to deliver the necessary compute power to bring our vision for Orion to life, but we faced a daunting task: reducing power consumption by a factor of 100,” says Director of SoC Solutions Robert Shearer. “This required us to push the boundaries of silicon design, embracing methodologies from diverse corners of the industry—from IoT to high-performance computing—and inventing new approaches to bridge the gaps. Our industry partners thought we were crazy, and perhaps they weren’t entirely wrong. But that’s exactly what it took: a willingness to challenge conventional wisdom and rethink everything. Building a computer that seamlessly merges the virtual and physical worlds demands a profound understanding of context, far surpassing what existing compute platforms can offer. In essence, we’ve reinvented the way computers interact with humans, which meant reimagining how we build silicon from the ground up.”
Orion exterior components.
The Magic of MicroLEDs
When things fell behind schedule or a seemingly insurmountable technical challenge arose, it was sometimes hard to maintain momentum and morale. But the team was resilient, finding paths around obstacles—or simply knocking them down.
Take, for example, Orion’s display. The silicon team was responsible for the silicon in the display projector that sits in the glasses’ corners.
“For those projectors, there was an open question as to whether we could get the microLEDs in an array at high enough efficiency and brightness to deploy a wide field of view display,” Snodgrass says. “There was tremendous doubt that we could—that it was possible in the time frame we were considering—because this was very nascent technology.”
“We realized very early on that we had to rethink a lot of the paradigms of product development,” adds Yee. “The amount of light that you need to make a usable display is quite a bit brighter for AR glasses because, as a wearable display, you’re competing with the sun. So we need energy levels that rival that—or at least that’s the goal. We’re not quite there yet, but that’s a big piece of it. And that means you need light sources for a display that are capable of that and you need circuitry that can control that. And at the same time, you need to make it tiny.”
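For a rough sense of what "competing with the sun" means in numbers, here is a simple luminance sketch; the figures below are generic assumptions about see-through displays, not Orion measurements.

```python
# Illustrative brightness arithmetic for a see-through AR display.
# All numbers are generic assumptions, not Orion measurements.
ambient_nits = 10_000        # assumed luminance of a bright outdoor scene
lens_transmission = 0.85     # assumed fraction of ambient light reaching the eye
target_contrast = 1.5        # assumed: virtual content ~1.5x the background

required_at_eye_nits = ambient_nits * lens_transmission * target_contrast
waveguide_efficiency = 0.01  # assumed end-to-end optical efficiency of the combiner

required_source_nits = required_at_eye_nits / waveguide_efficiency
print(f"Needed at the eye:    {required_at_eye_nits:,.0f} nits")
print(f"Needed at the source: {required_source_nits:,.0f} nits")
# Waveguide losses multiply the requirement, which is why emitter efficiency
# and the drive circuitry behind it matter so much.
```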
Custom silicon driving Orion’s µLEDs.
While microLEDs seemed the most suitable light source for the projectors, it was silicon that helped unlock their potential.
“With displays, we talk about pixel pitches, which are the distances between the center points of adjacent pixels,” Yee explains. “For TVs, those distances are hundreds of microns. In your phone, there are many, many tens of microns. And we needed to get that down into the single digits. The only known semiconductor manufacturing that could get there was silicon.”
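For a sense of scale, here is the pitch arithmetic across those display classes; the AR panel width and resolution are illustrative assumptions, not Orion's actual display parameters.

```python
# Pixel-pitch comparison across display classes. The AR projector panel size and
# resolution are illustrative assumptions, not Orion's actual parameters.
def pitch_um(panel_width_mm: float, horizontal_pixels: int) -> float:
    """Center-to-center pixel spacing in microns."""
    return panel_width_mm * 1000 / horizontal_pixels

print(f"55-inch 4K TV:      {pitch_um(1210, 3840):6.1f} um")  # ~315 um
print(f"6-inch 1080p phone: {pitch_um(68, 1080):6.1f} um")    # ~63 um
print(f"AR microLED panel:  {pitch_um(4, 1280):6.1f} um")     # ~3 um: single digits
```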
Complicating the work was the fact that the back surface of the display had to be a piece of silicon, and no one in the world had been designing silicon for microLEDs.
“At the time, the research teams that were out there had all repurposed liquid crystal on silicon displays to put microLEDs on,” says Yee. “Nobody had ever designed a backplane for microLEDs before. And we faced a pretty unique challenge because it’s an optical component. It has to be flat. You can’t scratch it. It has to have all these characteristics because when you look through the waveguides, through the projectors, you’re literally looking at the top surface of the piece of silicon.”
The silicon team developed a complex series of test platforms for these microLED displays, which involved coordinating closely with our global suppliers. The microLEDs had a global footprint: they originated in one location and were then transferred to another site, where they were put onto a wafer. The wafers were then shipped off to be cut into a particular shape, followed by a trip to the US to be bonded with another wafer, and then shipped back across the globe for the actual module to be built and tested. It was an enormously complicated process, and the silicon team developed test vehicles to prove out each step.
The team also needed to find a way to deliver power to microLED displays in the tiny volume of the glasses’ corners. Our analog team developed a custom power management chip that fit inside that volume.
“Power delivery is crucial for such small form factor wearable devices, where battery size is limited and space is at a premium,” notes Director of Analog and Mixed Signal Systems Jihong Ren. “Our customized Power Management IC solution leverages cutting-edge technologies to optimize power efficiency for our specific workload at the system level, all while fitting within the available space constraints. Achieving this optimal solution required close interdisciplinary collaboration with our mechanical, electrical, SoC, μLED, and thermal teams, ensuring a seamless integration of all components and maximizing overall performance.”
“That was an amazing feat of not just engineering but also organizational management: bringing a team together, working across time zones and with all these different vendors,” adds Snodgrass. “In some ways, managing all that organizationally was just as much of a challenge as meeting the technical specs.”
“Not only is the design custom, the entire fabrication process is customized,” adds Yee. “We’re fortunate to have some wonderful partners in the industry that helped make that happen. They see long-term potential in AR displays as a whole and certainly Meta’s vision for it. So they’ve been willing to partner with us to make those customizations and optimizations happen to enable that display.”
Orion AR glasses.
Iteration Meets Acceleration
There was a tight feedback loop between the silicon team and the brilliant minds in Reality Labs Research and XR Tech who were developing the algorithms. Those teams would provide the algorithms, which the silicon team would translate into hardware, removing the overhead of running the software on a general-purpose CPU. That meant the algorithms would run at lower power, but it also meant the silicon team was locked in: once the algorithms were hardened, they could no longer be changed.
“Say XR Tech was developing a certain discipline algorithmically,” explains Silicon Accelerators Architecture and Algorithms Director Ohad Meitav. “They own the algorithmic stack and its performance. My team, in collaboration with them, would then decide how to accelerate the algorithm, how to harden parts of the algorithm, and how to actually put it into the hardware in a way that runs super efficiently. Then XR Tech would adapt their software stack to account for the hardware. It’s a very iterative process.”
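One way to picture that split is sketched below. The names and logic are hypothetical, shown purely to illustrate the co-design workflow rather than any actual Meta interface: the hardened block's behavior is frozen once it ships in silicon, while the software around it keeps iterating.

```python
# Sketch of the hardware/software split described above. Names and logic are
# hypothetical, shown only to illustrate the co-design workflow.
import numpy as np

def hardened_bright_spot_detector(image_u8: np.ndarray) -> np.ndarray:
    """Stand-in for a fixed-function accelerator: its interface and behavior
    are frozen once the algorithm is hardened into silicon."""
    return np.argwhere(image_u8 > 200)

def eye_tracking_frontend(image_u8: np.ndarray, max_candidates: int) -> list:
    """Software wrapper that can keep evolving after tape-out, as long as it
    respects the hardened block's fixed interface."""
    candidates = hardened_bright_spot_detector(image_u8)
    # Post-processing (ranking, filtering, tuning) stays flexible in software.
    return [tuple(c) for c in candidates[:max_candidates]]

frame = np.zeros((400, 640), dtype=np.uint8)
frame[120, 300] = 255                        # a synthetic glint
print(eye_tracking_frontend(frame, max_candidates=5))
```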
Another success story is the silicon team’s collaboration with Reality Labs Research to come up with a novel reprojection algorithm.
“We needed the reprojection algorithm to support a variety of different distortions and corrections,” notes Silicon Architect Steve Clohset. “The algorithm RL-R developed that we ended up using is not used in general computing. And to this day, it’s proving to be a pretty powerful tool.”
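For readers unfamiliar with reprojection, here is a minimal rotation-only ("timewarp"-style) sketch of the general idea. It is a textbook construction with assumed camera intrinsics, not the novel RL-R algorithm described above.

```python
# Minimal rotation-only reprojection ("timewarp"-style) sketch in NumPy. This is a
# textbook construction shown for illustration with assumed intrinsics; it is NOT
# the novel RL-R reprojection algorithm referenced above.
import numpy as np

def reprojection_homography(K: np.ndarray, R_latest_from_render: np.ndarray) -> np.ndarray:
    """Homography mapping pixels rendered at the old head pose to where they
    should appear given the latest head orientation (rotation only)."""
    return K @ R_latest_from_render @ np.linalg.inv(K)

# Assumed pinhole intrinsics for a 1280x1024 eye buffer.
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 512.0],
              [   0.0,    0.0,   1.0]])

# Small yaw (~1 degree) of the head between render time and display time.
theta = np.deg2rad(1.0)
R = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
              [           0.0, 1.0,           0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])

H = reprojection_homography(K, R)
rendered_pixel = np.array([640.0, 512.0, 1.0])   # image center, homogeneous coords
warped = H @ rendered_pixel
print(warped[:2] / warped[2])   # where that content lands on the corrected frame
```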
Once the algorithms were hardened and the hardware was optimized, the silicon bring-up team put the custom chips through their paces.
“Orion’s custom silicon chipset is packed with complexity,” says Senior Director of End-to-End System and Infrastructure Liping Guo. “Bringing up and validating standalone chips and the interoperability across them within a short period of time is incredibly challenging. Luckily, we’re operating within a vertically integrated environment where Reality Labs owns the entire stack—from the silicon and low-level firmware to the operating system, software, and the upper-layer experiences. We took full advantage of this, working closely with our cross-functional partners and shifting our cross-stack integration left to the silicon validation stage. Orion was our pilot run for this methodology—we built our shift-left muscles and created a strong foundation for Reality Labs to reap the full benefits of custom silicon in the future.”
And after bring-up, it was time to optimize for software.
“There’s an iterative process where you start with a wholly unoptimized software stack, just to get everything booted up and running,” says Snodgrass. “And then, one by one, you go through the subsystems and start to optimize the software for that specific hardware—including reducing the amount of memory that the software uses. The hardware could be beautifully designed, but you won’t achieve the theoretical power efficiency unless you invest as much or more time getting the software to take full advantage of the hardware. So that’s the story of Orion: hardware and software optimized to the hilt. Leave no picojoule or milliwatt behind.”
And while Orion may be a prototype, the work that went into it has considerable potential to influence Meta’s roadmap.
“We look at the silicon IPs that we’re building as platforms in the sense that these are valued IPs that we’ll improve upon from one generation or one product to another,” adds Meitav. “All of the computer vision and graphics algorithms aren’t just built for Orion. They’ll go on to inform future products.”
Our silicon work involves creating novel solutions while also working closely with partners. In fact, the silicon team’s influence extends beyond Orion to both Ray-Ban Meta glasses and Meta Quest headsets today—even though they both use third-party chips. The silicon team regularly shares their work with the mixed reality team, showing what’s possible in terms of power efficiency. The MR team then shares those findings with partners like Qualcomm to help inform future chip designs. And because the silicon team uses the same off-the-shelf digital signal processors (DSPs) as Ray-Ban Meta glasses, they’ve been able to share lessons learned and best practices when implementing and writing code for those DSPs to help improve the audio experiences available on our AI glasses.
And that knowledge sharing goes both ways: The MR team has given the silicon team insights on things like Asynchronous TimeWarp and Application SpaceWarp in production.
“What a person does in production is far more interesting than something that we could do with a prototype,” says Clohset. “We tried to integrate what they were doing with warpings as much as possible.”
Ambiguity < Ambition
Because Orion was truly zero to one, the teams involved had to deal with an inordinate amount of ambiguity.
“With Orion, I can’t overstate how complicated the ambiguity made things,” says Clohset. “When you make a computer, for example, you generally have a good idea of what the display will be. But we didn’t know what the waveguide would end up being, so we had to go sample different waveguides and come up with a mechanism that could handle the worst-case scenario because we didn’t know where things would land. One optimization here would convolve with all these other choices, and you would end up with this matrix of all these different things that you had to support and try to validate, because you didn’t know where the product would land in six months.”
It’s important to note that Orion isn’t just a pair of AR glasses—it’s a three-part constellation of hardware. A lot of the processing happens on the compute puck, which necessitates a strong connection between it and the glasses. Add the surface EMG wristband into the loop, and the system architecture becomes even more complicated.
“That was enormously challenging for the teams to solve, and all of it just works,” says Snodgrass. “It was an amazing collaboration between the silicon team, the wireless team, and software teams all across the organization.”
Orion’s compute puck.
“With Orion, we built out a whole team with a broad diversity of engineers, and they were able to design a brand-new pipeline,” adds Clohset. “It’s a pipeline that handles six degrees of freedom movement of objects in 3D space. It uses its own custom display driver. We had to do really unique image quality corrections. And I think the challenge for us was that, because this was a zero-to-one project, there was no existing spec to tweak and improve. Everything here is completely new, so we had latitude to do everything.”
Much like the puck has some dormant features hidden beneath the surface, the custom silicon punches above its weight, too. While Orion doesn’t allow the user to take photos with its RGB cameras, the silicon is capable of supporting it, as well as codec avatars. And just as the puck helped unlock a true glasses form factor by offloading much of the compute, Orion’s custom silicon proved a necessary piece of the AR puzzle.
“To bring a zero-to-one experience like AR glasses to life, you need custom silicon—full stop,” Snodgrass explains. “Over time, if there’s a market, then silicon vendors will develop products to meet the demand. But for zero-to-one, you can’t just take something that exists off the shelf that’s intended for a different product and fit it into a new mold. You have to invest in something custom. And to bring those zero-to-one experiences to life, you need broad collaboration between software partners, industrial designers, mechanical engineers, and more.”
“By breaking free from traditional mental models, we’ve created something truly remarkable,” adds Shearer. “We believe that this computing platform represents the future of technology—one that will revolutionize the way we live, work, and interact with each other. We’re excited to be at the forefront of this innovation, pushing the boundaries of what’s possible and helping to shape the course of history.”
For more information on Orion, check out these blog posts:
-
Accelerating the Future
When we first began giving demos of Orion early this year, I was reminded of a line that you hear a lot at Meta—in fact, it was even in our first letter to prospective shareholders back in 2012: Code wins arguments. We probably learned as much about this product space from a few months of real-life demos as we did from the years of work it took to make them. There is just no substitute for actually building something, putting it in people’s hands, and learning from how they react to it.
Orion wasn’t our only example of that this year. Our mixed reality hardware and AI glasses have both reached a new level of quality and accessibility. The stability of those platforms allows our software developers to move much faster on everything from operating systems to new AI features. This is how I see the metaverse starting to come into greater focus and why I’m so confident that the coming year will be the most important one in the history of Reality Labs.
https://www.youtube.com/watch?v=xwMNZb6UE1o
2024 was the year AI glasses hit their stride. When we first started making smart glasses in 2021 we thought they could be a nice first step toward the AR glasses we eventually wanted to build. While mixed reality headsets are on track to becoming a general purpose computing platform much like today’s PCs, we saw glasses as the natural evolution of today’s mobile computing platforms. So we wanted to begin learning from the real world as soon as possible.
The biggest thing we’ve learned is that glasses are by far the best form factor for a truly AI-native device. In fact they might be the first hardware category to be completely defined by AI from the beginning. For many people, glasses are the place where an AI assistant makes the most sense, especially when it’s a multimodal system that can truly understand the world around you.
We’re right at the beginning of the S-curve for this entire product category, and there are endless opportunities ahead. One of the things I’m most excited about for 2025 is the evolution of AI assistants into tools that don’t just respond to a prompt when you ask for help but can become a proactive helper as you go about your day. At Connect we showed how Live AI on glasses can become more of a real-time participant when you’re getting things done. As this feature begins rolling out in early access this month we’ll see the first step toward a new kind of personalized AI assistant.
It won’t be the last. We’re currently in the middle of an industry-wide push to make AI-native hardware. You’re seeing phone and PC makers scramble to rebuild their products to put AI assistants at their core. But I think a much bigger opportunity is to make devices that are AI-native from the start, and I’m confident that glasses are going to be the first to get there. Meta’s Chief Research Scientist Michael Abrash has been talking about the potential of a personalized, context-aware AI assistant on glasses for many years, and building the technologies that make it possible has been a huge focus of our research teams.
Mixed reality has been another place where having the right product in the hands of a lot of people has been a major accelerant for progress. We’ve seen Meta Quest 3 get better month after month as we continue to iterate on the core system: passthrough, multitasking, spatial user interfaces, and more. All these gains extended to Quest 3S the moment it launched. This meant the $299 Quest 3S was in many respects a better headset on day one than the $499 Quest 3 was when it first launched in 2023. And the entire Quest 3 family keeps getting better with every update.
In 2025 we’ll see the next iteration of this as Quest 3S brings a lot more people into mixed reality for the first time. While Quest 3 has been a hit among people excited to own the very best device on the market, Quest 3S is expected to be heavily gifted in 2024. Sales were strong over the Black Friday weekend, and we’re expecting a surge of people activating their new headsets over the holiday break.
This will continue a trend that took shape over the last year: growth in new users who are seeking out a wider range of things to do with their headsets. We want these new users to see the magic of MR and stick around for the long run, which is why we’re funding developers to build the new types of apps and games they’re looking for.
There’s been a significant influx of younger people in particular, and they have been gravitating toward social and competitive multiplayer games, as well as freemium content. And titles like Skydance’s BEHEMOTH, Batman: Arkham Shadow (which just won Best AR/VR game at The Game Awards!), and Metro Awakening showed once again that some of the best new games our industry produces are now only possible on today’s headsets.
As the number of people in MR grows, the quality of the social experience it can deliver is growing in tandem. This is at the heart of what Meta is trying to achieve with Reality Labs and where the greatest potential of the metaverse will be unlocked: “the chance to create the most social platform ever” is how we described it when we first began working on it. We took two steps forward on this front in 2024: first with a broad set of improvements to Horizon Worlds, including its expansion to mobile, and second with the next-generation Meta Avatars system that lets people represent themselves across our apps and headsets. And as the visual quality and overall experience with these systems improve, more people are getting their first glimpse of a social metaverse. We’re seeing similar trends: new Quest 3S users are spending more time in Horizon Worlds, making it a Top 3 immersive app for Quest 3S, and people continue to create new Meta Avatars across mobile and MR.
Getting mixed reality into the mainstream has helped illuminate what will come next. One of the first trends that we discovered after the launch of Quest 3 last year was people using mixed reality to watch videos while multitasking in their home—doing the dishes or vacuuming their living rooms. This was an early signal that people love having a big virtual screen that you can take anywhere and place in the physical world around you. We’ve seen that trend take off with all sorts of entertainment experiences growing fast across the whole Quest 3 family. New features like YouTube Co-Watch show how much potential there is for a whole new kind of social entertainment experience in the metaverse.
That’s why James Cameron, one of the most technologically innovative storytellers of our lifetimes, is now working to help more filmmakers and creators produce great 3D content for Meta Quest. While 3D films have been produced for decades, there’s never been a way to view them that’s quite as good as an MR headset, and next year more people will own a headset than ever before. “We’re at a true, historic inflection point,” Jim said when we launched our new partnership this month.
This is happening alongside a larger shift Mark Rabkin shared at Connect this year: Our vision for Horizon OS is to build a new kind of general purpose computing platform capable of running every kind of software, supporting every kind of user, and open to every kind of creator and developer. Our recent releases for 2D/3D multi-tasking, panel positioning, better hand tracking, Windows Remote Desktop integration, and Open Store have started to build momentum along this new path. Horizon OS is on track to be the first platform that supports the full spectrum of experiences from immersive VR to 2D screens, mobile apps, and virtual desktops—and the developer community is central to that success.
The next big step toward the metaverse will be combining AI glasses with the kind of true augmented reality experience we revealed this year with Orion. It’s not often that you get a glimpse of the future and see a totally new technology that shows you where things are heading. The people who saw Orion immediately understood what it meant for the future, much like the people who saw the first personal computers taking shape at Xerox PARC in the 1970s (“within ten minutes it was obvious to me that all computers would work like this someday,” Steve Jobs later said of his 1979 demo of the Xerox Alto).
Being able to put people in a time machine and show them how the next computing platform will look was a highlight of 2024—and of my career so far. But the real impact of Orion will be in the products we ship next and the ways it helps us better understand what people love about AR glasses and what needs to get better. We spent years working on user research, product planning exercises, and experimental studies trying to understand how AR glasses should work, and that work is what enabled us to build Orion. But the pace of progress will be much more rapid from here on out now that we have a real product to build our intuition around.
This has been the lesson time and again over the last decade at Reality Labs. The most important thing you can do when you’re trying to invent the future is to ship things and learn from how real people use them. They won’t always be immediate smash hits, but they’ll always teach you something. And when you land on things that really hit the mark, like mixed reality on Quest 3 or AI on glasses, that’s when you put your foot on the gas. This is what will make 2025 such a special year: With the right devices on the market, people experiencing them for the first time, and developers discovering all the opportunities ahead, it’s time to accelerate.
-
Boz to the Future Episode 22: The Future of Orion and Wearables With Guest Alex Himel
Welcome back for another episode of Boz To The Future, a podcast from Reality Labs. In today’s episode, our host, Meta CTO and Head of Reality Labs Andrew “Boz” Bosworth, is joined by Meta’s VP of Wearables Alex Himel.
Himel’s team works on some of the industry’s hardest technical problems, all in the service of helping to build the next computing platform. Like Bosworth, he’s a long-tenured leader at Meta, having filled numerous leadership roles across the business in his 15+ years at the company.
Bosworth and Himel cover a range of topics, including wearables, augmented reality, and how AI is accelerating and improving the kinds of experiences embedded in Ray-Ban Meta glasses and future products.
They go deep on Orion and the positive reception it’s received since Mark Zuckerberg announced it at Connect. Just as Steve Jobs recognized the Alto as a turning point in computing, Bosworth asserted that, while people may disagree about the details, it’s hard to argue against Orion as the future. Himel and Bosworth go on to talk about the tremendous value of prototyping and building, noting how much more they’ve learned about Orion and its input paradigms in just three weeks of using it than in the years spent building it.
They also talked about the crucial role of AI in wearables, and how it’s becoming increasingly useful in everyday life—especially now that you’re able to ask Meta AI about things you’re seeing and use it to set visual reminders.
Himel also talks about his deep experience in a different kind of stack: sandwiches. He spent seven years working at the famous Lange’s Little Store in Chappaqua, NY, where he grew up, and he and Bosworth get into what makes a great sandwich, where tomatoes really go, and which bread is superior.
You can tune in to Boz to the Future on Apple Podcasts, Spotify, and wherever you listen to podcasts.
You can follow Himel on Threads. You can follow Bosworth on Instagram, Twitter/X, and Threads @boztank.