
From Zero to One: How Silicon and Our Custom Chips Are Revolutionizing AR

In 2017, Reality Labs Chief Scientist Michael Abrash, backed by Meta Founder & CEO Mark Zuckerberg, established a new, secret team within what was then Oculus Research to build the foundation of the next computing platform. Their herculean task: Create a custom silicon solution that could support the unique demands of future augmented reality glasses—a technical feat that required reimagining every component for an all-day wearable AR glasses form factor that simply didn’t exist yet.

Front angle of Orion AR glasses prototype.

Compact Form Factor, Considerable Problem Space

Growing from just a few researchers to hundreds of people on the product side, the custom silicon team was built on the premise that AR glasses couldn’t rely on the silicon available in today’s smartphones. And it’s a premise borne out by the several custom chips inside Orion, our first true AR glasses product prototype.

“Building the ship as it sails out of the harbor—that was exactly what we were doing,” says Advanced Technologies Strategy Director Jeremy Snodgrass. “We had to scale up what was a modest team while building these chips simultaneously. It was fascinating to see the leadership bring new hires on board while developing a culture that values agility. Nothing was someone else’s problem. You’d pick up an oar and start rowing, even if it wasn’t exactly what you were hired to do. It was a very, very exciting time.”

“Almost everything about Orion was new to the world in a lot of ways,” agrees Display Architecture Technical Director Mike Yee. “Some of the ideas had been out there, but no one had taken on the research project to actually build an all-day wearable pair of AR glasses.”

The team had to deliver a compelling AR experience with as little power consumption as possible. The glasses form factor can only dissipate so much heat, and it can only accommodate so much battery capacity. As a result, the experiences it’s possible to deliver on the glasses are completely dependent upon the silicon. In other words, if you hold constant the thermal and battery capacities, the only way to deliver a given experience is to optimize the silicon.
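To make the constraint concrete, here is a toy power-budget calculation; the battery capacity, runtime target, and per-subsystem numbers below are invented for illustration, not Orion's actual figures.

```python
# Toy power-budget arithmetic (all numbers hypothetical).
BATTERY_WH = 1.0        # assumed glasses battery capacity, watt-hours
TARGET_RUNTIME_H = 2.0  # assumed session length, hours

budget_mw = BATTERY_WH / TARGET_RUNTIME_H * 1000
print(f"average power budget: {budget_mw:.0f} mW")  # -> 500 mW

# Every subsystem must fit inside that fixed envelope, which is why
# optimizing the silicon is the only lever once thermals are fixed.
subsystems_mw = {"display": 200, "compute": 150, "wireless": 80, "sensors": 50}
used = sum(subsystems_mw.values())
print(f"allocated: {used} mW, headroom: {budget_mw - used:.0f} mW")
```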

“Designing a new architecture for Orion required the team to not only push existing technologies like wireless and displays aggressively but also to take risks on new technologies,” says Director of Product Management Neeraj Choubey. “For instance, the team developed a machine learning (ML) accelerator without a clear use case at the time, driven by the strong conviction that ML would become increasingly important in Meta’s products. On Orion, every ML accelerator is utilized, and in some cases, they’re oversubscribed, serving functions such as eye tracking and hand tracking. Similarly, the team developed custom compression protocols to reduce bandwidth and power consumption as data moved from the compute puck to the display. Developing custom silicon to achieve Orion’s form factor goals required both a tolerance for high ambiguity and meticulous attention to detail to deliver an incredibly complex system architecture.”

“The biggest challenge we had was delivering the 3D world-locked rendered graphics, along with spatialized audio that’s rendered such that it appears to be emanating from a virtual object,” notes Snodgrass. “We had to fit all these electronics into the thermal capacity and physical space, and then run it on the battery so that it doesn’t get too hot. And we had to do all that in an actual glasses form factor—not a big visor like you typically see in the category.”

“We knew how to deliver the necessary compute power to bring our vision for Orion to life, but we faced a daunting task: reducing power consumption by a factor of 100,” says Director of SoC Solutions Robert Shearer. “This required us to push the boundaries of silicon design, embracing methodologies from diverse corners of the industry—from IoT to high-performance computing—and inventing new approaches to bridge the gaps. Our industry partners thought we were crazy, and perhaps they weren’t entirely wrong. But that’s exactly what it took: a willingness to challenge conventional wisdom and rethink everything. Building a computer that seamlessly merges the virtual and physical worlds demands a profound understanding of context, far surpassing what existing compute platforms can offer. In essence, we’ve reinvented the way computers interact with humans, which meant reimagining how we build silicon from the ground up.”

Orion exterior components.

The Magic of MicroLEDs

There were moments when things fell behind schedule or a seemingly insurmountable technical challenge arose when it was hard to maintain momentum and morale. But the team was resilient, finding paths around obstacles—or simply knocking them down.

Take, for example, Orion’s display. The silicon team was responsible for the silicon in the display projector that sits in the glasses’ corners.

“For those projectors, there was an open question as to whether we could get the microLEDs in an array at high enough efficiency and brightness to deploy a wide field of view display,” Snodgrass says. “There was tremendous doubt that we could—that it was possible in the time frame we were considering—because this was very nascent technology.”

“We realized very early on that we had to rethink a lot of the paradigms of product development,” adds Yee. “The amount of light that you need to make a usable display is quite a bit brighter for AR glasses because, as a wearable display, you’re competing with the sun. So we need energy levels that rival that—or at least that’s the goal. We’re not quite there yet, but that’s a big piece of it. And that means you need light sources for a display that are capable of that and you need circuitry that can control that. And at the same time, you need to make it tiny.”

Custom silicon driving Orion’s µLEDs.

While microLEDs seemed the most suitable light source for the projectors, silicon helped unlock their potential.

“With displays, we talk about pixel pitches, which are the distances between the center points of adjacent pixels,” Yee explains. “For TVs, those distances are hundreds of microns. In your phone, there are many, many tens of microns. And we needed to get that down into the single digits. The only known semiconductor manufacturing that could get there was silicon.”
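Those pitch figures follow from simple division of panel width by pixel count. A quick sketch, with illustrative panel dimensions rather than Orion's actual specs:

```python
# Pixel pitch = panel width / pixels across (sizes are illustrative).
def pitch_um(panel_width_mm: float, pixels_across: int) -> float:
    """Center-to-center pixel spacing in microns."""
    return panel_width_mm * 1000 / pixels_across

print(f"65-inch 4K TV:       {pitch_um(1430, 3840):.0f} um")  # hundreds of microns
print(f"6-inch 1080p phone:  {pitch_um(70, 1080):.0f} um")    # tens of microns
print(f"4 mm uLED projector: {pitch_um(4, 1280):.1f} um")     # single digits
```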

Complicating the work was the fact that the back surface of the display had to be a piece of silicon, and no one in the world had been designing silicon for microLEDs.

“At the time, the research teams that were out there had all repurposed liquid crystal on silicon displays to put microLEDs on,” says Yee. “Nobody had ever designed a backplane for microLEDs before. And we faced a pretty unique challenge because it’s an optical component. It has to be flat. You can’t scratch it. It has to have all these characteristics because when you look through the waveguides, through the projectors, you’re literally looking at the top surface of the piece of silicon.”

The silicon team developed a complex series of test platforms for these microLED displays that involved coordinating closely with our global suppliers. The microLEDs have a global footprint, originating in one spot and then transferred to another site where they were put onto a wafer. The wafers were then shipped off to be cut into a particular shape, followed by a trip to the US to be bonded together with another wafer, then shipped back across the globe for the actual module to be built and tested. It was an enormously complicated process, and the silicon team developed test vehicles to prove out each step.

The team also needed to find a way to deliver power to microLED displays in the tiny volume of the glasses’ corners. Our analog team developed a custom power management chip that fit inside that volume.

“Power delivery is crucial for such small form factor wearable devices, where battery size is limited and space is at a premium,” notes Director of Analog and Mixed Signal Systems Jihong Ren. “Our customized Power Management IC solution leverages cutting-edge technologies to optimize power efficiency for our specific workload at the system level, all while fitting within the available space constraints. Achieving this optimal solution required close interdisciplinary collaboration with our mechanical, electrical, SoC, μLED, and thermal teams, ensuring a seamless integration of all components and maximizing overall performance.”

“That was an amazing feat of not just engineering but also organizational management: bringing a team together, working across time zones and with all these different vendors,” adds Snodgrass. “In some ways, managing all that organizationally was just as much of a challenge as meeting the technical specs.”

“Not only is the design custom, the entire fabrication process is customized,” adds Yee. “We’re fortunate to have some wonderful partners in the industry that helped make that happen. They see long-term potential in AR displays as a whole and certainly Meta’s vision for it. So they’ve been willing to partner with us to make those customizations and optimizations happen to enable that display.”

Orion AR glasses.

Iteration Meets Acceleration

There was a tight feedback loop between the silicon team and the brilliant minds in Reality Labs Research and XR Tech developing algorithms. The latter teams would provide these algorithms, which the former would translate into hardware, removing the overhead of software running on a general-purpose CPU. That meant the algorithms would run at lower power, but it also meant the silicon team was locked in: Once the algorithms were hardened, they could no longer make changes.

“Say XR Tech was developing a certain discipline algorithmically,” explains Silicon Accelerators Architecture and Algorithms Director Ohad Meitav. “They own the algorithmic stack and its performance. My team, in collaboration with them, would then decide how to accelerate the algorithm, how to harden parts of the algorithm, and how to actually put it into the hardware in a way that runs super efficiently. Then XR Tech would adapt their software stack to account for the hardware. It’s a very iterative process.”
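A minimal sketch of that division of labor, with a box blur standing in for a real perception algorithm (every name here is hypothetical, not Meta's internal API): software codes against a frozen interface and dispatches to silicon when it exists, and to a bit-exact reference when it doesn't.

```python
from typing import Optional, Protocol
import numpy as np

class Accelerator(Protocol):
    """Frozen interface: once the algorithm is hardened, this can't change."""
    def box_blur(self, image: np.ndarray) -> np.ndarray: ...

def box_blur_reference(image: np.ndarray) -> np.ndarray:
    """Software reference the hardware is validated against, bit for bit."""
    padded = np.pad(image, 1, mode="edge")
    acc = sum(padded[dy:dy + image.shape[0], dx:dx + image.shape[1]].astype(np.float64)
              for dy in range(3) for dx in range(3))
    return (acc / 9).astype(image.dtype)

def run(image: np.ndarray, hw: Optional[Accelerator] = None) -> np.ndarray:
    # App code is written once; only the dispatch target changes as the
    # algorithm moves from software into silicon.
    return hw.box_blur(image) if hw is not None else box_blur_reference(image)

print(run(np.arange(16, dtype=np.uint8).reshape(4, 4)))
```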

Another success story is the silicon team’s collaboration with Reality Labs Research to come up with a novel reprojection algorithm.

“We needed the reprojection algorithm to support a variety of different distortions and corrections,” notes Silicon Architect Steve Clohset. “The algorithm RL-R developed that we ended up using is not used in general computing. And to this day, it’s proving to be a pretty powerful tool.”
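RL-R's algorithm hasn't been published, but the standard building block this kind of reprojection extends is easy to show. A minimal rotation-only sketch: for a pure head rotation R and camera intrinsics K, rendered and displayed pixels are related by the homography H = K·R·K⁻¹, so a frame rendered at an old pose can be warped to the newest one.

```python
import numpy as np

def reproject(frame: np.ndarray, K: np.ndarray, R: np.ndarray) -> np.ndarray:
    """Warp `frame` (rendered at the old pose) to the pose rotated by R."""
    h, w = frame.shape[:2]
    H_inv = np.linalg.inv(K @ R @ np.linalg.inv(K))  # output pixel -> source pixel
    ys, xs = np.mgrid[0:h, 0:w]
    dst = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    src = H_inv @ dst
    sx, sy = (src[:2] / src[2]).round().astype(int)
    out = np.zeros_like(frame)
    ok = (sx >= 0) & (sx < w) & (sy >= 0) & (sy < h)  # nearest-neighbor sample
    out[ys.ravel()[ok], xs.ravel()[ok]] = frame[sy[ok], sx[ok]]
    return out

# Tiny demo: a 2-degree yaw between render time and display time.
K = np.array([[200.0, 0, 64], [0, 200.0, 64], [0, 0, 1]])
a = np.radians(2.0)
R = np.array([[np.cos(a), 0, np.sin(a)], [0, 1, 0], [-np.sin(a), 0, np.cos(a)]])
print(reproject(np.eye(128), K, R).sum())  # most pixels survive the warp
```

Orion's version goes further, folding the variety of distortions and corrections Clohset mentions into the same warp stage.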

Once the algorithms were hardened and the hardware was optimized, the silicon bring-up team put the custom chips through their paces.

“Orion’s custom silicon chipset is packed with complexity,” says Senior Director of End-to-End System and Infrastructure Liping Guo. “Bringing-up and validating standalone chips and the interoperabilities across them within a short period of time is incredibly challenging. Luckily, we’re operating within a vertically integrated environment where Reality Labs owns the entire stack—from the silicon and low-level firmware to the operating system, software, and the upper-layer experiences. We took full advantage of this, working closely with our cross-functional partners, and shift-left our cross-stack integration at the silicon validation stage. Orion was our pilot run for this methodology—we built our shift-left muscles and created a strong foundation for Reality Labs to reap the full benefits of custom silicon in the future.”

And after bring-up, it was time to optimize for software.

“There’s an iterative process where you start with a wholly unoptimized software stack, just to get everything booted up and running,” says Snodgrass. “And then, one by one, you go through the subsystems and start to optimize the software for that specific hardware—including reducing the amount of memory that the software uses. The hardware could be beautifully designed, but you won’t achieve the theoretical power efficiency unless you invest as much or more time getting the software to take full advantage of the hardware. So that’s the story of Orion: hardware and software optimized to the hilt. Leave no picojoule or milliwatt behind.”
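A toy illustration of that loop (subsystem names and milliwatt numbers are invented): profile, optimize the hungriest subsystem, and repeat until the stack fits the budget.

```python
# Greedy optimize-the-worst loop (all numbers hypothetical).
power_mw = {"display": 320, "tracking": 210, "wireless": 140, "audio": 90}
BUDGET_MW = 500

while sum(power_mw.values()) > BUDGET_MW:
    worst = max(power_mw, key=power_mw.get)
    power_mw[worst] *= 0.8  # stands in for weeks of real optimization work
    print(f"optimized {worst:8s} -> total {sum(power_mw.values()):.0f} mW")
```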

And while Orion may be a prototype, the work that went into it has considerable potential to influence Meta’s roadmap.

“We look at the silicon IPs that we’re building as platforms in the sense that these are valued IPs that we’ll improve upon from one generation or one product to another,” adds Meitav. “All of the computer vision and graphics algorithms aren’t just built for Orion. They’ll go on to inform future products.”

Our silicon work involves creating novel solutions while also working closely with partners. In fact, the silicon team’s influence extends beyond Orion to both Ray-Ban Meta glasses and Meta Quest headsets today—even though they both use third-party chips. The silicon team regularly shares their work with the mixed reality team, showing what’s possible in terms of power efficiency. The MR team then shares those findings with partners like Qualcomm to help inform future chip designs. And because the silicon team uses the same off-the-shelf digital signal processors (DSPs) as Ray-Ban Meta glasses, they’ve been able to share lessons learned and best practices when implementing and writing code for those DSPs to help improve the audio experiences available on our AI glasses.

And that knowledge sharing goes both ways: The MR team has given the silicon team insights on things like Asynchronous TimeWarp and Application SpaceWarp in production.

“What a person does in production is far more interesting than something that we could do with a prototype,” says Clohset. “We tried to integrate what they were doing with warpings as much as possible.”

Ambiguity < Ambition

Because Orion was truly zero to one, the teams involved inevitably had to deal with an inordinate amount of ambiguity.

“With Orion, I can’t overstate how complicated the ambiguity made things,” says Clohset. “When you make a computer, for example, you generally have a good idea of what the display will be. But we didn’t know what the waveguide would end up being, so we had to go sample different waveguides and come up with a mechanism that could handle the worst-case scenario because we didn’t know where things would land. One optimization here would convolve with all these other choices, and you’d end up with a matrix of all these different things that you had to support and validate because you didn’t know where the product would land in six months.”

It’s important to note that Orion isn’t just a pair of AR glasses—it’s a three-part constellation of hardware. A lot of the processing happens on the compute puck, which necessitates a strong connection between it and the glasses. Add the surface EMG wristband into the loop, and the system architecture becomes even more complicated.

“That was enormously challenging for the teams to solve, and all of it just works,” says Snodgrass. “It was an amazing collaboration between the silicon team, the wireless team, and software teams all across the organization.”

Orion’s compute puck.

“With Orion, we built out a whole team with a broad diversity of engineers, and they were able to design a brand-new pipeline,” adds Clohset. “It’s a pipeline that handles six degrees of freedom movement of objects in 3D space. It uses its own custom display driver. We had to do really unique image quality corrections. And I think the challenge for us was that, because this was a zero-to-one project, there was no existing spec to tweak and improve. Everything here is completely new, so we had latitude to do everything.”

Much like the puck has some dormant features hidden beneath the surface, the custom silicon punches above its weight, too. While Orion doesn’t allow the user to take photos with its RGB cameras, the silicon is capable of supporting it, as well as codec avatars. And just as the puck helped unlock a true glasses form factor by offloading much of the compute, Orion’s custom silicon proved a necessary piece of the AR puzzle.

“To bring a zero-to-one experience like AR glasses to life, you need custom silicon—full stop,” Snodgrass explains. “Over time, if there’s a market, then silicon vendors will develop products to meet the demand. But for zero-to-one, you can’t just take something that exists off the shelf that’s intended for a different product and fit it into a new mold. You have to invest in something custom. And to bring those zero-to-one experiences to life, you need broad collaboration between software partners, industrial designers, mechanical engineers, and more.”

“By breaking free from traditional mental models, we’ve created something truly remarkable,” adds Shearer. “We believe that this computing platform represents the future of technology—one that will revolutionize the way we live, work, and interact with each other. We’re excited to be at the forefront of this innovation, pushing the boundaries of what’s possible and helping to shape the course of history.”




Accelerating the Future

When we first began giving demos of Orion early this year I was reminded of a line that you hear a lot at Meta—in fact it was even in our first letter to prospective shareholders back in 2012: Code wins arguments. We probably learned as much about this product space from a few months of real-life demos as we did from the years of work it took to make them. There is just no substitute for actually building something, putting it in people’s hands, and learning from how they react to it.

Orion wasn’t our only example of that this year. Our mixed reality hardware and AI glasses have both reached a new level of quality and accessibility. The stability of those platforms allows our software developers to move much faster on everything from operating systems to new AI features. This is how I see the metaverse starting to come into greater focus and why I’m so confident that the coming year will be the most important one in the history of Reality Labs.

https://youtube.com/watch?v=xwMNZb6UE1o

2024 was the year AI glasses hit their stride. When we first started making smart glasses in 2021 we thought they could be a nice first step toward the AR glasses we eventually wanted to build. While mixed reality headsets are on track to becoming a general purpose computing platform much like today’s PCs, we saw glasses as the natural evolution of today’s mobile computing platforms. So we wanted to begin learning from the real world as soon as possible.

The biggest thing we’ve learned is that glasses are by far the best form factor for a truly AI-native device. In fact they might be the first hardware category to be completely defined by AI from the beginning. For many people, glasses are the place where an AI assistant makes the most sense, especially when it’s a multimodal system that can truly understand the world around you.

We’re right at the beginning of the S-curve for this entire product category, and there are endless opportunities ahead. One of the things I’m most excited about for 2025 is the evolution of AI assistants into tools that don’t just respond to a prompt when you ask for help but can become a proactive helper as you go about your day. At Connect we showed how Live AI on glasses can become more of a real-time participant when you’re getting things done. As this feature begins rolling out in early access this month we’ll see the first step toward a new kind of personalized AI assistant.

It won’t be the last. We’re currently in the middle of an industry-wide push to make AI-native hardware. You’re seeing phone and PC makers scramble to rebuild their products to put AI assistants at their core. But I think a much bigger opportunity is to make devices that are AI-native from the start, and I’m confident that glasses are going to be the first to get there. Meta’s Chief Research Scientist Michael Abrash has been talking about the potential of a personalized, context-aware AI assistant on glasses for many years, and building the technologies that make it possible has been a huge focus of our research teams.

Mixed reality has been another place where having the right product in the hands of a lot of people has been a major accelerant for progress. We’ve seen Meta Quest 3 get better month after month as we continue to iterate on the core system: passthrough, multitasking, spatial user interfaces, and more. All these gains extended to Quest 3S the moment it launched. This meant the $299 Quest 3S was in many respects a better headset on day 1 than the $499 Quest 3 was when it first launched in 2023. And the entire Quest 3 family keeps getting better with every update.

In 2025 we’ll see the next iteration of this as Quest 3S brings a lot more people into mixed reality for the first time. While Quest 3 has been a hit among people excited to own the very best device on the market, Quest 3S is expected to be heavily gifted in 2024. Sales were strong over the Black Friday weekend, and we’re expecting a surge of people activating their new headsets over the holiday break.

This will continue a trend that took shape over the last year: growth in new users who are seeking out a wider range of things to do with their headsets. We want these new users to see the magic of MR and stick around for the long run, which is why we’re funding developers to build the new types of apps and games they’re looking for.

There’s been a significant influx of younger people in particular, and they have been gravitating toward social and competitive multiplayer games, as well as freemium content. And titles like Skydance’s BEHEMOTH, Batman: Arkham Shadow (which just won Best AR/VR game at The Game Awards!), and Metro Awakening showed once again that some of the best new games our industry produces are now only possible on today’s headsets.

As the number of people in MR grows, the quality of the social experience it can deliver is growing in tandem. This is at the heart of what Meta is trying to achieve with Reality Labs and where the greatest potential of the metaverse will be unlocked: “the chance to create the most social platform ever” is how we described it when we first began working on it. We took two steps forward on this front in 2024: first with a broad set of improvements to Horizon Worlds including its expansion to mobile, and second with the next-generation Meta Avatars system that lets people represent themselves across our apps and headsets. And as the visual quality and overall experience with these systems improve, more people are getting their first glimpse of a social metaverse. We’re seeing similar trends with new Quest 3S users spending more time in Horizon Worlds, making it a Top 3 immersive app for Quest 3S, and people continue creating new Meta Avatars across mobile and MR.

Getting mixed reality into the mainstream has helped illuminate what will come next. One of the first trends that we discovered after the launch of Quest 3 last year was people using mixed reality to watch videos while multitasking in their home—doing the dishes or vacuuming their living rooms. This was an early signal that people love having a big virtual screen that you can take anywhere and place in the physical world around you. We’ve seen that trend take off with all sorts of entertainment experiences growing fast across the whole Quest 3 family. New features like YouTube Co-Watch show how much potential there is for a whole new kind of social entertainment experience in the metaverse.

That’s why James Cameron, one of the most technologically innovative storytellers of our lifetimes, is now working to help more filmmakers and creators produce great 3D content for Meta Quest. While 3D films have been produced for decades, there’s never been a way to view them that’s quite as good as an MR headset, and next year more people will own a headset than ever before. “We’re at a true, historic inflection point,” Jim said when we launched our new partnership this month.

This is happening alongside a larger shift Mark Rabkin shared at Connect this year: Our vision for Horizon OS is to build a new kind of general purpose computing platform capable of running every kind of software, supporting every kind of user, and open to every kind of creator and developer. Our recent releases for 2D/3D multi-tasking, panel positioning, better hand tracking, Windows Remote Desktop integration, and Open Store have started to build momentum along this new path. Horizon OS is on track to be the first platform that supports the full spectrum of experiences from immersive VR to 2D screens, mobile apps, and virtual desktops—and the developer community is central to that success.

The next big step toward the metaverse will be combining AI glasses with the kind of true augmented reality experience we revealed this year with Orion. It’s not often that you get a glimpse of the future and see a totally new technology that shows you where things are heading. The people who saw Orion immediately understood what it meant for the future much like the people who saw the first personal computers taking shape at Xerox PARC in the 1970s (“within ten minutes it was obvious to me that all computers would work like this someday,” Steve Jobs later said of his 1979 demo of the Xerox Alto).

Being able to put people in a time machine and show them how the next computing platform will look was a highlight of 2024—and of my career so far. But the real impact of Orion will be in the products we ship next and the ways it helps us better understand what people love about AR glasses and what needs to get better. We spent years working on user research, product planning exercises, and experimental studies trying to understand how AR glasses should work, and that work is what enabled us to build Orion. But the pace of progress will be much more rapid from here on out now that we have a real product to build our intuition around.

This has been the lesson time and again over the last decade at Reality Labs. The most important thing you can do when you’re trying to invent the future is to ship things and learn from how real people use them. They won’t always be immediate smash hits, but they’ll always teach you something. And when you land on things that really hit the mark, like mixed reality on Quest 3 or AI on glasses, that’s when you put your foot on the gas. This is what will make 2025 such a special year: With the right devices on the market, people experiencing them for the first time, and developers discovering all the opportunities ahead, it’s time to accelerate.


Boz to the Future Episode 22: The Future of Orion and Wearables With Guest Alex Himel

Welcome back for another episode of Boz To The Future, a podcast from Reality Labs. In today’s episode, our host, Meta CTO and Head of Reality Labs Andrew “Boz” Bosworth, is joined by Meta’s VP of Wearables Alex Himel.

Himel’s team works on some of the industry’s hardest technical problems, all in the service of helping to build the next computing platform. Like Bosworth, he’s a tenured leader at Meta, having filled numerous leadership roles across the business in his 15+ years at the company.

Bosworth and Himel cover a range of topics, including wearables, augmented reality, and how AI is accelerating and improving the kinds of experiences embedded in Ray-Ban Meta glasses and future products.

They go deep on Orion and the positive reception it’s received since Mark Zuckerberg announced it at Connect. Just as Steve Jobs recognized the Alto as a turning point in computing, Bosworth asserted that, while we may disagree about the details, it’s hard to argue against this as the future. Himel and Bosworth go on to talk about the tremendous value of prototyping and building, noting how much more they’ve learned about Orion and its input paradigms in the three weeks of using it compared to the years spent building it.

They also talk about the crucial role of AI in wearables, and how it’s becoming increasingly useful in everyday life—especially now that you’re able to ask Meta AI about things you’re seeing and use it to set visual reminders.

Himel also talks about his deep experience in a different kind of stack: sandwiches. With seven years under his belt working in the famous Lange’s Little Store in Chappaqua, NY, where he grew up, Himel and Bosworth get into what makes a great sandwich, where tomatoes really go, and which bread is superior.

You can tune in to Boz to the Future on Apple Podcasts, Spotify, and wherever you listen to podcasts.

You can follow Himel on Threads. You can follow Bosworth on Instagram, Twitter/X, and Threads @boztank.


Crystal Clear: Our Silicon Carbide Waveguides & the Path to Orion’s Large FoV


Back in 2019, the Orion team prepared an important demo for Meta Founder & CEO Mark Zuckerberg, showcasing potential waveguides for augmented reality glasses—a pivotal moment when theoretical calculations on paper were brought to life. And it was a demo that changed everything.

“Wearing the glasses with glass-based waveguides and multiple plates, it felt like you were in a disco,” recalls Optical Scientist Pasqual Rivera. “There were rainbows everywhere, and it was so distracting—you weren’t even looking at the AR content. Then, you put on the glasses with silicon carbide waveguides, and it was like you were at the symphony listening to a quiet, classical piece. You could actually pay attention to the full experience of what we were building. It was a total game changer.”

Yet as clear (pun intended) as the choice of silicon carbide as a substrate seems today, when we first started down the road to AR glasses a decade ago, it was anything but.

“Silicon carbide is normally heavily nitrogen-doped,” says Rivera. “It’s green, and if it gets thick enough, it looks black. There’s no way you could make an optical lens out of it. It’s an electronic material. There’s a reason it’s that color, and it’s because of electronic properties.”

“Silicon carbide has been around as a material for a long time,” agrees AR Waveguides Tech Lead Giuseppe Calafiore. “Its main application is high-power electronics. Take electric vehicles: All EVs require a chip—but that chip must also be capable of very high power, moving the wheels, and driving this thing. It turns out you can’t do it with the regular silicon substrate, which is what makes the chips that we use in our computers and electronics. You need a platform that allows you to go through high currents, high power, and that material is silicon carbide.”

Until fairly recently, when discussions around renewable energy sources started to heat up, the market for these high-power chipsets was nowhere near the size of the market for chips for consumer electronics. Silicon carbide has always been expensive, and there wasn’t much incentive to bring costs down because, for the size of chip you make for a car, the price of the substrate was tolerable.

“But it turns out that silicon carbide also has some of the properties that we need for waveguides and optics,” Calafiore says. “The refractive index is the key property that we care about. And silicon carbide has a high refractive index, which means that it’s capable of channeling in and outputting a large quantity of optical data. You can think of it as optical bandwidth—just like you have bandwidth for the internet, and you want it to be large enough so that you can send huge amounts of data through that channel. The same goes for optical devices.”

The higher a material’s refractive index, the higher its étendue, so you can send more optical data through that channel.

“The channel in our case is our waveguide, and a larger étendue translates to a larger field of view,” explains Calafiore. “The larger the refractive index of a material, the larger field of view the display can support.”
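A textbook k-space argument (not Meta's design equation) turns that relationship into numbers: the guided image must fit between the air light line and the substrate light line, which caps the in-air field of view at roughly 2 * arcsin((n - 1) / 2).

```python
import math

# Simplified upper bound on waveguide FOV vs. refractive index.
for name, n in [("glass", 1.8), ("lithium niobate", 2.3), ("silicon carbide", 2.7)]:
    fov = 2 * math.degrees(math.asin((n - 1) / 2))
    print(f"{name:16s} n={n}: FOV upper bound ~{fov:.0f} deg")
```

Real designs land well below these bounds once total-internal-reflection margins and grating efficiency are accounted for, but the ordering is the point: glass runs out of headroom first, while silicon carbide leaves room for a field of view like Orion's roughly 70 degrees.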

The Road to the Right Refractive Index

When Calafiore first joined what was then Oculus Research in 2016, the highest refractive index glass the team had at its disposal was 1.8—which required stacking multiple plates to achieve the desired field of view. Undesirable optical artifacts aside, the assembly line became increasingly complicated as the first two waveguides had to be perfectly aligned, and then that stack had to be perfectly aligned with a third waveguide.

“Not only was it expensive, it was immediately obvious that you couldn’t have three pieces of glass per lens in a pair of glasses,” Calafiore recalls. “They were too heavy, and the thickness was prohibitively large and ugly—nobody would buy it. So we went back to square one: trying to boost the refractive index of the substrate to reduce the number of plates needed.”

The first material the team looked into was lithium niobate, which has a refractive index of roughly 2.3—quite a bit higher than the glass at 1.8.

“We realized that we only had to stack two plates and could maybe even manage with one plate to still cover the field of view,” says Calafiore. “Almost in parallel, we started looking at other materials—which is how we figured out alongside our suppliers in 2019 that silicon carbide, in its purest form, is actually very transparent. It also happens to have the highest refractive index known for an optical application, which is 2.7.”

That’s a 17.4% increase over lithium niobate and a 50% bump over glass, for those keeping score at home.

“With a few modifications to the very same equipment that was used in the industry already, it was possible to get transparent silicon carbide,” Calafiore says. “You could just change the process, be a lot more careful, and instead of optimizing for electronic properties, you optimize for optical properties: transparency, uniformity of the refractive index, etc.”

The Potential Cost of Compromise

At that time, the Reality Labs team was the first to even attempt moving from opaque silicon carbide wafers to transparent ones. And because silicon carbide is one of the hardest known materials, it essentially requires diamond tooling to cut or polish it. As a result, the non-recurring engineering costs were very high, so the resulting substrate was quite expensive.

While more cost-effective alternatives exist, as with any technology, they each have tradeoffs. And as the field of view increases towards Orion’s industry-leading field of view of approximately 70 degrees, new issues arise like ghost images and rainbows.

“Finding the optimum solution for a wide field of view AR display is fraught with tradeoffs between performance and costs,” explains Director of Research Science Barry Silverstein. “Costs can often be driven down, but if performance doesn’t cut it, costs won’t ultimately matter.”

Ghost images are like visual echoes of the primary image being projected onto the display. Rainbows are colorful streaks of light that are created when ambient light reflects off of the waveguide. “Say you’re driving at night with moving car lights around you,” says Silverstein. “You’re going to have rainbows that move, too. Or if you’re out on the beach playing volleyball and the sun’s shining, you’re going to get a rainbow streak that’s moving with you and you’ll miss your shot. And one of the miraculous properties of silicon carbide is that it gets rid of those rainbows.”

“The other advantage of silicon carbide that none of the other materials have is thermal conductivity,” Calafiore adds. “Plastic is a terrible insulator. Glass, lithium niobate, same thing. You go to silicon carbide and it’s transparent, looks like glass, and guess what: It conducts heat.”

So in July 2020, the team determined that silicon carbide was the optimum choice for three primary reasons: It led to an improved form factor because it required only a single plate and smaller mounting structures, it had better optical properties, and it was lighter than dual-plate glass.

The Secret of Slant Etch

With the material in mind, the next nut to crack was the fabrication of the waveguides—and specifically, an unconventional grating technique called slant etch.

“The grating is the nanostructure that in-couples and out-couples the light from the lens,” explains Calafiore. “And for silicon carbide to work, the grating needs to be slant etch. Instead of being vertical, you want the lines of the grating to be sloped diagonally.”

“We were the first ones to do slant etch directly on the devices,” says Research Manager Nihar Mohanty. “The whole industry used to rely on nano imprint, which doesn’t work for substrates with such a high refractive index. That’s why no one else in the world had thought about doing silicon carbide.”

But because slant etch is an immature technology, most semiconductor chip suppliers and factories lack the necessary tools.

“Back in 2019, my manager at the time, Matt Colburn, and I established our own facility, as there was nothing in the world that could produce etched silicon carbide waveguides and where we could prove out the technology beyond the lab scale,” explains Mohanty. “It was a huge investment, and we established the whole pipeline there. The tooling was custom-made for us by our partners, and the process was developed in-house at Meta, though our systems are research-grade because there were no manufacturing-grade systems out there. We worked with a manufacturing partner to develop manufacturing-grade slant etch tooling and processes. And now that we’ve shown what’s possible with silicon carbide, we want others in the industry to start making tools of their own.”

The more companies that invest in optical-grade silicon carbide and develop equipment, the stronger the category of consumer AR glasses will become.

No Longer Chasing Rainbows

While technological inevitability is a myth, the stars certainly seem to be aligning in silicon carbide’s favor. And although the team continues to investigate alternatives, there’s a strong sense that the right people have come together at the right time under the right market conditions to build AR glasses using this material.

“Orion proved that silicon carbide is a viable option for AR glasses,” says Silverstein, “and we’re now seeing interest across the supply chain on three different continents where they’re heavily pursuing this as an opportunity. Silicon carbide will come out on top. It’s just a matter of time in my book.”

And a lot can happen in that time—much like the way the tables have turned since we grew our first clear silicon carbide crystals.

“All of these silicon carbide manufacturers had cranked up the supply massively in response to the expected EV boom,” notes Calafiore. “Right now, there’s an overcapacity that didn’t exist when we were building Orion. So now, because supply is high and demand is low, the cost of the substrate has started to come down.”

“Suppliers are very excited by the new opportunity of manufacturing optical-grade silicon carbide—after all, each waveguide lens represents a large amount of material relative to an electronic chip, and all of their existing capabilities apply to this new space,” adds Silverstein. “Filling your factory is essential, and scaling your factory is the dream. The size of the wafer matters, too: The bigger the wafer, the lower the cost—but the complexity of the process also goes up. That said, we’ve seen suppliers move from four-inch to eight-inch wafers, and some are working on precursors to 12-inch wafers, which would yield exponentially more pairs of AR glasses.”
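The wafer-size arithmetic is simple area scaling; the 50 mm lens diameter below is a placeholder, and real counts depend on lens shape, edge exclusion, and yield.

```python
import math

LENS_MM = 50  # hypothetical waveguide lens diameter
for wafer_in in (4, 8, 12):
    d_mm = wafer_in * 25.4
    # Crude packing estimate: wafer area / lens bounding-square area.
    count = int(math.pi * (d_mm / 2) ** 2 / LENS_MM**2)
    print(f"{wafer_in:2d}-inch wafer: ~{count} lenses")  # ~3, ~12, ~29
```

Capacity grows with the square of wafer diameter, which is why each step up in wafer size is such a large jump in output per run.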

These advancements should help continue to drive costs down. It’s still early days, but the future is coming into focus.

“At the beginning of any new technological revolution, you try a bunch of things,” says Calafiore. “Look at television: We started with cathode-ray tubes, and then we went to the LED plasma TVs and now microLEDs. We went through several different technologies and architectures. As you pathfind, lots of paths end up nowhere, but there are some that you just keep coming back to as the most promising. We’re not at the end of the road, and we can’t do it alone, but silicon carbide is a wonder material that is well worth the investment.”

“The world is awake now,” adds Silverstein. “We’ve successfully shown that silicon carbide can flex across electronics and photonics. It’s a material that could have future applications in quantum computing. And we’re seeing signs that it’s possible to significantly reduce the cost. There’s a lot of work left to be done, but the potential upside here is huge.”


Learn more about silicon carbide in Photonics Spectra.



Orion: True AR Glasses Have Arrived

TL;DR:

  • Today at Connect, Mark Zuckerberg unveiled Orion—our first pair of true AR glasses, previously codenamed Project Nazare.
  • With an industry-leading field of view, silicon carbide lenses, intricate waveguides, uLED projectors, and more, Orion is our most advanced and most polished product prototype to date.
  • Now, we’re continuing product optimizations and lowering costs as we drive toward a scalable consumer device that will revolutionize the way people interact with the world.

“We are building AR glasses.”

Five simple words, spoken five years ago. With them, we planted a flag on our vision of a future where we no longer have to make the false choice between a world of information at our fingertips and the physical world all around us.

And now, five years later, another five words that stand to once again change the game:

We have built AR glasses.

https://youtube.com/watch?v=2CJsnyS8u3c

At Meta, our mission is simple: give people the power to build community and bring the world closer together. And at Reality Labs, we build tools that help people feel connected anytime, anywhere. That’s why we’re working to build the next computing platform that puts people at the center so they can be more present, connected, and empowered in the world.

Ray-Ban Meta glasses have demonstrated the power of giving people hands-free access to key parts of their digital lives from their physical ones. We can talk to a smart AI assistant, connect with friends, and capture the moments that matter—all without ever having to pull out a phone. These stylish glasses fit seamlessly into our everyday lives, and people absolutely love them.

Yet while Ray-Ban Meta opened up an entirely new category of display-less glasses super-charged by AI, the XR industry has long dreamt of true AR glasses—a product that combines the benefits of a large holographic display and personalized AI assistance in a comfortable, stylish, all-day wearable form factor.

And today, we’ve pushed that dream closer to reality with the unveiling of Orion, which we believe is the most advanced pair of AR glasses ever made. In fact, it might be the most challenging consumer electronics device produced since the smartphone. Orion is the result of breakthrough inventions in virtually every field of modern computing—building on the work we’ve been doing at Reality Labs for the last decade. It’s packed with entirely new technologies, including the most advanced AR display ever assembled and custom silicon that enables powerful AR experiences to run on a pair of glasses using a fraction of the power and weight of an MR headset.

Orion’s input system seamlessly combines voice, eye gaze, and hand tracking with an EMG wristband that lets you swipe, click, and scroll while keeping your arm resting comfortably by your side, letting you stay present in the world and with the people around you as you interact with rich digital content.

Beginning today at Connect and continuing throughout the year, we’re opening up access to our Orion product prototype for Meta employees and select external audiences so our development team can learn, iterate, and build towards our consumer AR glasses product line, which we plan to begin shipping in the near future.

Why AR Glasses?

There are three primary reasons why AR glasses are key to unlocking the next great leap in human-oriented computing.

  1. They enable digital experiences that are unconstrained by the limits of a smartphone screen. With large holographic displays, you can use the physical world as your canvas, placing 2D and 3D content and experiences anywhere you want.
  2. They seamlessly integrate contextual AI that can sense and understand the world around you in order to anticipate and proactively address your needs.
  3. They’re lightweight and great for both indoor and outdoor use—and they let people see each other’s real face, real eyes, and real expressions.

That’s the north star our industry has been building towards: a product combining the convenience and immediacy of wearables with a large display, high-bandwidth input, and contextualized AI in a form factor that people feel comfortable wearing in their daily lives.

Compact Form Factor, Compounded Challenges

For years, we’ve faced a false choice—either virtual and mixed reality headsets that enable deep, immersive experiences in a bulky form factor or glasses that are ideal for all-day use but don’t offer rich visual apps and experiences given the lack of a large display and corresponding compute power.

But we want it all, without compromises. For years, we’ve been hard at work taking the incredible spatial experiences afforded by VR and MR headsets and miniaturizing the technology necessary to deliver those experiences into a pair of lightweight, stylish glasses. Nailing the form factor, delivering holographic displays, developing compelling AR experiences, and creating new human-computer interaction (HCI) paradigms—and doing it all in one cohesive product—is one of the most difficult challenges our industry has ever faced. It was so challenging that we thought we had less than a 10 percent chance of pulling it off successfully.

Until now.

A Groundbreaking Display in an Unparalleled Form Factor

At approximately 70 degrees, Orion has the largest field of view in the smallest AR glasses form factor to date. That field of view unlocks truly immersive use cases for Orion, from multitasking windows and big-screen entertainment to lifesize holograms of people—all digital content that can seamlessly blend with your view of the physical world.

For Orion, field of view was our holy grail. We were bumping up against the laws of physics and had to bend beams of light in ways they don’t naturally bend—and we had to do it in a power envelope measured in milliwatts.

Instead of glass, we made the lenses from silicon carbide—a novel application for AR glasses. Silicon carbide is incredibly lightweight, it doesn’t result in optical artifacts or stray light, and it has a high refractive index—all optical properties that are key to a large field of view. The waveguides themselves have really intricate and complex nano-scale 3D structures to diffract or spread light in the ways necessary to achieve this field of view. And the projectors are uLEDs—a new type of display technology that’s super small and extremely power efficient.

Orion is unmistakably a pair of glasses in both look and feel—complete with transparent lenses. Unlike MR headsets or other AR glasses today, you can still see each other’s real eyes and expressions, so you can be present and share the experience with the people around you. Dozens of innovations were required to get the industrial design down to a contemporary glasses form factor that you’d be comfortable wearing every day. Orion is a feat of miniaturization—the components are packed down to a fraction of a millimeter. And we managed to embed seven tiny cameras and sensors in the frame rims.

We had to maintain optical precision at one-tenth the thickness of a strand of human hair. And the system can detect tiny amounts of movement—like the frames expanding or contracting in various room temperatures—and then digitally correct for the necessary optical alignment, all within milliseconds. We made the frames out of magnesium—the same material used in F1 race cars and spacecraft—because it’s lightweight yet rigid, so it keeps the optical elements in alignment and efficiently conducts away heat.
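The scale of that drift is easy to estimate from magnesium's textbook thermal expansion coefficient; the frame length and temperature swing below are hypothetical.

```python
# Thermal drift estimate (material constant is textbook; geometry is assumed).
CTE_MAGNESIUM = 26e-6  # linear expansion per kelvin, approximate
FRAME_MM = 140         # assumed span between optical elements
DELTA_T_K = 5          # e.g., walking from indoors to outdoors

drift_um = CTE_MAGNESIUM * FRAME_MM * 1000 * DELTA_T_K
print(f"expansion across the frame: ~{drift_um:.0f} um")  # ~18 um
# That's roughly double the ~10 um ("one-tenth of a hair") tolerance the
# optics must hold, hence the millisecond-scale digital correction.
```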

Heating and Cooling

Once we cracked the display (no pun intended) and overcame the physics problems, we had to navigate the challenge of really powerful compute alongside really low power consumption and the need for heat dissipation. Unlike today’s MR headsets, you can’t stuff a fan into a pair of glasses. So we had to get creative. A lot of the materials used to cool Orion are similar to those used by NASA to cool satellites in outer space.

We built highly specialized custom silicon that’s extremely power efficient and optimized for our AI, machine perception, and graphics algorithms. We built multiple custom chips and dozens of highly custom silicon IP blocks inside those chips. That lets us take the algorithms necessary for hand and eye tracking as well as simultaneous localization and mapping (SLAM) technology that normally takes hundreds of milliwatts of power—and thus generates a corresponding amount of heat—and shrink it down to just a few dozen milliwatts.

And yes, custom silicon continues to play a critical role in product development at Reality Labs, despite what you may have read elsewhere. 😎

Effortless EMG

Every new computing platform brings with it a paradigm shift in the way we interact with our devices. The invention of the mouse helped pave the way for the graphical user interfaces (GUIs) that dominate our world today, and smartphones didn’t start to really gain traction until the advent of the touch screen. And the same rule holds true for wearables.

We’ve spoken about our work with electromyography, or EMG, for years, driven by our belief that input for AR glasses needs to be fast, convenient, reliable, subtle, and socially acceptable. And now, that work is ready for prime time.

Orion’s input and interaction system seamlessly combines voice, eye gaze, and hand tracking with an EMG wristband that lets you swipe, click, and scroll with ease.

It works—and feels—like magic. Imagine taking a photo on your morning jog with a simple tap of your fingertips or navigating menus with barely perceptible movements of your hands. Our wristband combines a high-performance textile with embedded EMG sensors to detect the electrical signals generated by even the tiniest muscle movements. An on-device ML processor then interprets those EMG signals to produce input events that are transmitted wirelessly to the glasses. The system adapts to you, so it gets better and more capable at recognizing the most subtle gestures over time. And today, we’re sharing more about our support for external research focused on expanding the equity and accessibility potential of EMG wristbands.
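A heavily simplified sketch of that signal chain, on synthetic data (sample rate, window, and threshold are invented, and the real system uses a learned model rather than a fixed threshold): rectify and smooth the raw signal into an activity envelope, then emit an input event on a rising threshold crossing.

```python
import numpy as np

FS = 2000  # assumed EMG sample rate, Hz
t = np.arange(0, 1.0, 1 / FS)
rng = np.random.default_rng(0)
emg = 0.02 * rng.standard_normal(t.size)        # resting baseline noise
emg[800:900] += 0.5 * rng.standard_normal(100)  # brief muscle burst near 0.4 s

# Rectify, then smooth with a 25 ms moving average to get an envelope.
envelope = np.convolve(np.abs(emg), np.ones(50) / 50, mode="same")

# Rising-edge threshold crossing -> one input event per gesture.
events = np.flatnonzero((envelope[1:] > 0.1) & (envelope[:-1] <= 0.1)) + 1
for i in events:
    print(f"input event (e.g., pinch) at t={t[i]:.3f} s")
```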

Meet the Wireless Compute Puck

True AR glasses must be wireless, and they must also be small. So we built a wireless compute puck for Orion. It takes some of the load off of the glasses, enabling longer battery life and a better form factor, all with low latency.

The glasses run all of the hand tracking, eye tracking, SLAM, and specialized AR world locking graphics algorithms while the app logic runs on the puck to keep the glasses as lightweight and compact as possible.
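A minimal sketch of that split (message shapes and names are hypothetical, not Orion's actual protocol): the glasses stream poses and do the final pose-correct rendering locally, while the puck runs app logic and sends back world-locked content.

```python
from dataclasses import dataclass

@dataclass
class PoseUpdate:      # glasses -> puck, every frame
    timestamp_us: int
    head_pose: tuple   # position + orientation from on-glasses SLAM

@dataclass
class SceneCommand:    # puck -> glasses, as app logic produces it
    timestamp_us: int
    draw_calls: list   # world-locked content to render

def glasses_frame(pose: PoseUpdate, latest: SceneCommand) -> str:
    # The glasses reproject the latest scene to the newest pose locally, so
    # a late packet from the puck degrades gracefully instead of juddering.
    age_us = pose.timestamp_us - latest.timestamp_us
    return f"render {len(latest.draw_calls)} draw calls, scene age {age_us} us"

print(glasses_frame(PoseUpdate(16_000, (0, 0, 0, 1)), SceneCommand(0, ["cube"])))
```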

The puck has dual processors, including one custom designed right here at Meta, and it provides the compute power necessary for low-latency graphics rendering, AI, and some additional machine perception.

And since it’s small and sleek, you can comfortably toss the puck in a bag or your pocket and go about your business—no strings attached.

AR Experiences

Of course, as with any piece of hardware, it’s only as good as the things you can do with it. And while it’s still early days, the experiences afforded by Orion are an exciting glimpse of what’s to come.

https://youtube.com/watch?v=HkdSv3QPhNw

We’ve got our smart assistant, Meta AI, running on Orion. It understands what you’re looking at in the physical world and can help you with useful visualizations. Orion uses the same Llama model that powers AI experiences running on Ray-Ban Meta smart glasses today, plus custom research models to demonstrate potential use cases for future wearables development.

You can take a hands-free video call on the go to catch up with friends and family in real time, and you can stay connected on WhatsApp and Messenger to view and send messages. No need to pull out your phone, unlock it, find the right app, and let your friend know you’re running late for dinner—you can do it all through your glasses.

https://youtube.com/watch?v=el7lUVvu8Bo

You can play shared AR games with family on the other side of the country or with your friend on the other side of the couch. And Orion’s large display lets you multitask with multiple windows to get stuff done without having to lug around your laptop.

The experiences available on Orion today will help chart the roadmap for our consumer AR glasses line in the future. Our teams will continue to iterate and build new immersive social experiences, alongside our developer partners, and we can’t wait to share what comes next.

A Purposeful Product Prototype

While Orion won’t make its way into the hands of consumers, make no mistake: This is not a research prototype. It’s the most polished product prototype we’ve ever developed—and it’s truly representative of something that could ship to consumers. Rather than rushing to put it on shelves, we decided to focus on internal development first, which means we can keep building quickly and continue to push the boundaries of the technology and experiences.

And that means we’ll arrive at an even better consumer product faster.

What Comes Next

Two major obstacles have long stood in the way of mainstream consumer-grade AR glasses: the technological breakthroughs necessary to deliver a large display in a compact glasses form factor and the necessity of useful and compelling AR experiences that can run on those glasses. Orion is a huge milestone, delivering true AR experiences running on reasonably stylish hardware for the very first time.

Now that we’ve shared Orion with the world, we’re focused on a few things:

  • Tuning the AR display quality to make the visuals even sharper
  • Optimizing wherever we can to make the form factor even smaller
  • Building at scale to make them more affordable

In the next few years, you can expect to see new devices from us that build on our R&D efforts. And a number of Orion’s innovations have been extended to our current consumer products today as well as our future product roadmap. We’ve optimized some of our spatial perception algorithms, which are running on both Meta Quest 3S and Orion. While the eye gaze and subtle gestural input system was originally designed for Orion, we plan to use it in future products. And we’re exploring the use of EMG wristbands across future consumer products as well.

Orion isn’t just a window into the future—it’s a look at the very real possibilities within reach today. We built it in our pursuit of what we do best: helping people connect. From Ray-Ban Meta glasses to Orion, we’ve seen the good that can come from letting people stay more present and empowered in the physical world, even while tapping into all the added richness the digital world has to offer.

We believe you shouldn’t have to choose between the two. And with the next computing platform, you won’t have to.



Publicat pe - Lasă un comentariu

Orion’s Compute Puck: The Story Behind the Device that Helped Make Our AR Glasses Possible

Last year at Connect, we unveiled Orion—our first true pair of AR glasses. The culmination of our work at Reality Labs over the last decade, Orion combines the benefits of a large holographic display and personalized AI assistance in a comfortable, all-day wearable form factor. It got some attention for its industry-leading field of view, silicon carbide waveguides, uLED projectors, and more. But today, we’re turning our attention to an unsung hero of Orion: the compute puck.

Designed to easily slip into your pocket or bag so you can bring it just about anywhere as you go about your day, the puck offloads Orion’s processing power to run application logic and enable a more compelling, smaller form factor for the glasses. It connects wirelessly to the glasses and EMG wristband for a seamless experience.

Set it and forget it, right?

But the puck’s backstory—even as a prototype—is involved, with a dramatic arc you’d never guess at from its appearance.

“When you’re building something like this, you start getting into the limits of physics,” explains Director of Product Management Rahul Prasad. “For the last 50 years, Moore’s Law has made everything smaller, faster, and lower power. The problem is that now you’re starting to hit limits on how much heat you can dissipate, how much battery you can compress, and how much antenna performance you can fit into an object of a given size.”

While hindsight may be 20/20, the puck’s potential wasn’t immediately obvious. When you’re the first to build something, you need to explore every possibility—leaving no stone unturned. How do you build something that some might think of as an undesirable accessory rather than a critical part of the package?

“We knew the puck was an extra device we were asking people to carry,” notes Product Design Engineering Manager Jared Van Cleave, “so we explored how to turn the bug into a feature.”

Ultimately, that ethos paid off in spades as the puck squeezes a lot of compute (and even more Meta-designed custom silicon for AI and machine perception) into a small package. This was instrumental in helping Orion go from the realm of science fiction into reality.

“If you didn’t have the puck, you wouldn’t be able to have the experiences that Orion offers in its form factor—period,” says Prasad. “A good AR experience demands really high performance: high frame rates, extremely low latency, fine-grained wireless and power management, etc. The puck and the glasses need to be co-designed to work really closely together, not just at the app layer, but also at the operating system, firmware, and hardware layers. And even if one were to co-design a smartphone to work with AR glasses, the demanding performance requirements would drain the phone battery and suck away compute capacity from phone use cases. On the other hand, the puck has its own high-capacity battery, high-performance SoC, and a custom Meta-designed AI co-processor optimized for Orion.”

Of course, the puck wasn’t designed overnight. It required years of iterative work.

“We didn’t know how people would want to interact with Orion from an input perspective—there was nothing we could draft off of in-market,” says Product Manager Matt Resman. “If you look at our early glasses prototypes, they were these massive headsets that weighed three or four pounds. And when you’re trying to build a product, it’s really difficult to understand the user experience if you’re not in the right form factor. With the puck, we were able to prototype very quickly to start understanding how people would use it.”

Codenamed Omega in the early days, the puck was initially envisioned by what was then Oculus Research as an Ω-shaped band that would go around the user’s neck and be hard-wired to the glasses…

… until some new innovations by the Reality Labs wireless team, among other things, allowed them to cut the cord. That enabled a handheld form factor that could slip into a pocket or bag, which opened up a lot of possibilities.

“At that point, augmented reality calling was still a primary use case,” Van Cleave explains. “The puck was where the holographic videos would be anchored. You’d put it down on the table with the sensor bench facing you, imaging you, and then projecting who you were speaking to from the puck’s surface for the call.”

“Orion is about connecting people and bringing us together,” says Resman. “One of the initial concepts for the puck was to help create this sense of presence with other people and enable this richer form of communication.”

“It’s unlike anything you’ve ever seen—the device has the potential to create really fun and unique interactions,” adds Industrial Designer Emron Henry. “The user experience feels a bit like unleashing a genie from a bottle, where holograms seamlessly emerge from and dissolve back into the device.”

As you can probably tell by now, there’s more potential inside the puck than what ultimately got turned on as a feature. In addition to the sensors and cameras that would’ve been used for AR calling, the puck has haptics and 6DOF sensors that could enable it to be used as a tracked controller to select and manipulate virtual objects and play AR games. The team also explored capacitive and force touch input so the puck could serve as a gamepad whether held in portrait or landscape orientation.
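
To make the tracked-controller idea concrete, here is a minimal sketch of ray-cast selection, the standard way a 6DOF device picks out virtual objects. The pose format, scene layout, and angular threshold below are hypothetical stand-ins of ours, not Orion code:

import numpy as np

# Illustrative only: select the object whose center lies closest to the
# controller's pointing ray, within a small angular tolerance.
def select_object(position, forward, objects, max_angle_deg=5.0):
    best, best_angle = None, max_angle_deg
    for name, center in objects.items():
        to_obj = center - position
        to_obj = to_obj / np.linalg.norm(to_obj)
        angle = np.degrees(np.arccos(np.clip(np.dot(forward, to_obj), -1.0, 1.0)))
        if angle < best_angle:
            best, best_angle = name, angle
    return best

# Hypothetical scene: two holograms roughly 1.5 m in front of the user.
objects = {"pong_board": np.array([0.0, 0.0, -1.5]),
           "menu_panel": np.array([0.4, 0.1, -1.2])}
print(select_object(np.zeros(3), np.array([0.0, 0.0, -1.0]), objects))  # pong_board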

“We would talk about the need to carry this extra thing,” says Van Cleave. “How do we make it more useful? There was a whole workstream around it. And at one point, the hypothesis was that AR gaming was going to be this killer use case.”

Early on, we knew we needed to explore the possibilities of the puck doing things that phones cannot. Eventually, we landed on using eye gaze, EMG, and hand tracking for AR games, like Stargazer and Pong, but we prototyped various demos and games that used the puck as a controller in the early days of Orion.


Rendered explorations of the puck as a 6DOF controller for AR games like Pong.

“Because it’s not a phone, that gave us a lot of design freedom,” adds Prasad. “It can be thicker, I can make it more rounded so it fits comfortably in your hand as a controller. It’s pretty incredible that the puck is actually smaller than the average phone, but it’s more powerful than a phone because it has a Meta-designed co-processor for AI and machine perception.”

The team dug into the question of how a quality AR game might differ from an MR or console game. That meant exploring different affordances for an AR game controller, including joysticks, physical button layouts, trigger buttons, and more.

“We didn’t end up actually building any of that stuff,” Van Cleave notes. “We prototyped it, but we didn’t ever go for a full build. We wanted to keep it simple. We wanted to do these soft interfaces but not make physical, mechanical buttons.”

And while the sensors and haptics weren’t turned on in the finished product prototype, they served an integral function during development, letting teams file bugs by simply tapping on the top of the device a few times to trigger a bug report.
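
As a rough sketch of how that kind of tap gesture can be detected: watch the accelerometer for sharp spikes and fire once several land inside a short window. The thresholds, sample format, and report hook below are hypothetical, chosen only to illustrate the technique:

import time
from collections import deque

TAP_THRESHOLD_G = 2.5   # acceleration spike treated as a "tap" (assumed value)
TAP_WINDOW_S = 1.5      # all taps must land within this window (assumed)
TAPS_TO_TRIGGER = 3     # e.g., a few quick taps on the housing

_tap_times = deque(maxlen=TAPS_TO_TRIGGER)

def on_accel_sample(magnitude_g):
    """Feed accelerometer magnitudes; trigger a report on a burst of taps."""
    now = time.monotonic()
    # A real implementation would debounce so one tap isn't counted twice.
    if magnitude_g >= TAP_THRESHOLD_G:
        _tap_times.append(now)
        if (len(_tap_times) == TAPS_TO_TRIGGER
                and _tap_times[-1] - _tap_times[0] <= TAP_WINDOW_S):
            _tap_times.clear()
            file_bug_report()

def file_bug_report():
    print("Bug report triggered")  # stand-in for snapshotting logs and state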

As AI began to take center stage as a key use case, compute came into increasingly sharp focus. In the end, the puck is home to Orion’s wireless connectivity, computing power, and battery capacity—a huge technical feat in itself—all of which helps reduce the weight and form factor of the glasses while dissipating far more heat thanks to the puck’s larger surface area.

Throughout its evolution, one thing has remained the same: There’s more to the puck than meets the untrained eye. Embracing unconventional ideas allowed our teams to explore, push boundaries, and build the future.

“We’re defining a category that doesn’t quite exist yet,” notes Henry. “As you’d expect with R&D, there were starts and stops along the way. How will users expect to interact with holograms? Would they prefer to use an AR remote or is hand tracking, eye gaze, and EMG sufficient for input? What feels intuitive, low-friction, familiar, and useful?”

“Rather than looking at the puck as just a rock, we asked ourselves what else we could provide to further differentiate it from phones and justify why you’d want to carry it around,” acknowledges Resman. “Does its raw compute power and practical design—which helped unlock a glasses form factor that you can realistically wear around all day—ultimately offer enough value? It’s our job to help answer these questions.”

“Early on, we shifted gears to focus on what we can do that a phone can’t do,” Van Cleave adds. “Phones need to have the screen, and they need the physical button layout that users expect. And we don’t have those constraints. Our compute puck can be whatever we want it to be.”



Orion: True AR Glasses Have Arrived

TL;DR:

  • Today at Connect, Mark Zuckerberg unveiled Orion—our first pair of true AR glasses, previously codenamed Project Nazare.
  • With an industry-leading field of view, silicon carbide lenses, intricate waveguides, uLED projectors, and more, Orion is our most advanced and most polished product prototype to date.
  • Now, we’re continuing product optimizations and lowering costs as we drive toward a scalable consumer device that will revolutionize the way people interact with the world.

“We are building AR glasses.”

Five simple words, spoken five years ago. With them, we planted a flag on our vision of a future where we no longer have to make the false choice between a world of information at our fingertips and the physical world all around us.

And now, five years later, another five words that stand to once again change the game:

We have built AR glasses.

https://youtube.com/watch?v=2CJsnyS8u3c

At Meta, our mission is simple: give people the power to build community and bring the world closer together. And at Reality Labs, we build tools that help people feel connected anytime, anywhere. That’s why we’re working to build the next computing platform that puts people at the center so they can be more present, connected, and empowered in the world.

Ray-Ban Meta glasses have demonstrated the power of giving people hands-free access to key parts of their digital lives from their physical ones. We can talk to a smart AI assistant, connect with friends, and capture the moments that matter—all without ever having to pull out a phone. These stylish glasses fit seamlessly into our everyday lives, and people absolutely love them.

Yet while Ray-Ban Meta opened up an entirely new category of display-less glasses super-charged by AI, the XR industry has long dreamt of true AR glasses—a product that combines the benefits of a large holographic display and personalized AI assistance in a comfortable, stylish, all-day wearable form factor.

And today, we’ve pushed that dream closer to reality with the unveiling of Orion, which we believe is the most advanced pair of AR glasses ever made. In fact, it might be the most challenging consumer electronics device produced since the smartphone. Orion is the result of breakthrough inventions in virtually every field of modern computing—building on the work we’ve been doing at Reality Labs for the last decade. It’s packed with entirely new technologies, including the most advanced AR display ever assembled and custom silicon that enables powerful AR experiences to run on a pair of glasses using a fraction of the power and weight of an MR headset.

Orion’s input system seamlessly combines voice, eye gaze, and hand tracking with an EMG wristband that lets you swipe, click, and scroll while keeping your arm resting comfortably by your side, letting you stay present in the world and with the people around you as you interact with rich digital content.

Beginning today at Connect and continuing throughout the year, we’re opening up access to our Orion product prototype for Meta employees and select, external audiences so our development team can learn, iterate, and build towards our consumer AR glasses product line, which we plan to begin shipping in the near future.

Why AR Glasses?

There are three primary reasons why AR glasses are key to unlocking the next great leap in human-oriented computing.

  1. They enable digital experiences that are unconstrained by the limits of a smartphone screen. With large holographic displays, you can use the physical world as your canvas, placing 2D and 3D content and experiences anywhere you want.
  2. They seamlessly integrate contextual AI that can sense and understand the world around you in order to anticipate and proactively address your needs.
  3. They’re lightweight and great for both indoor and outdoor use—and they let people see each other’s real face, real eyes, and real expressions.

That’s the north star our industry has been building towards: a product combining the convenience and immediacy of wearables with a large display, high-bandwidth input, and contextualized AI in a form factor that people feel comfortable wearing in their daily lives.

Compact Form Factor, Compounded Challenges

For years, we’ve faced a false choice—either virtual and mixed reality headsets that enable deep, immersive experiences in a bulky form factor or glasses that are ideal for all-day use but don’t offer rich visual apps and experiences given the lack of a large display and corresponding compute power.

But we want it all, without compromises. For years, we’ve been hard at work taking the incredible spatial experiences afforded by VR and MR headsets and miniaturizing the technology necessary to deliver those experiences in a pair of lightweight, stylish glasses. Nailing the form factor, delivering holographic displays, developing compelling AR experiences, and creating new human-computer interaction (HCI) paradigms—and doing it all in one cohesive product—is one of the most difficult challenges our industry has ever faced. It was so challenging that we thought we had less than a 10 percent chance of pulling it off successfully.

Until now.

A Groundbreaking Display in an Unparalleled Form Factor

At approximately 70 degrees, Orion has the largest field of view in the smallest AR glasses form factor to date. That field of view unlocks truly immersive use cases for Orion, from multitasking windows and big-screen entertainment to lifesize holograms of people—all digital content that can seamlessly blend with your view of the physical world.
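
To put that 70-degree figure in perspective, a little back-of-envelope geometry helps (our own illustration, treating 70 degrees as the full horizontal angle): content placed at distance d can span a width of

\[ w = 2d\tan(\theta/2) \approx 2 \times 2\,\text{m} \times \tan(35^\circ) \approx 2.8\,\text{m}, \]

so a hologram anchored two meters in front of you can fill nearly a three-meter-wide slice of your view, with room for a life-size person or several large windows.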

For Orion, field of view was our holy grail. We were bumping up against the laws of physics and had to bend beams of light in ways they don’t naturally bend—and we had to do it in a power envelope measured in milliwatts.

Instead of glass, we made the lenses from silicon carbide—a novel application for AR glasses. Silicon carbide is incredibly lightweight, doesn’t produce optical artifacts or stray light, and has a high refractive index—all optical properties that are key to a large field of view. The waveguides themselves have intricate nano-scale 3D structures that diffract, or spread, light in the ways necessary to achieve this field of view. And the projectors are uLEDs—a new type of display technology that’s super small and extremely power efficient.
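
That refractive-index point follows from standard waveguide optics rather than anything Meta has published about Orion’s specific design: light stays trapped inside a waveguide by total internal reflection only when it strikes the surface beyond the critical angle

\[ \theta_c = \arcsin\!\left(\frac{1}{n}\right), \]

so ordinary glass (n ≈ 1.5) gives θc ≈ 41.8°, while silicon carbide (n ≈ 2.6) gives θc ≈ 22.6°. The higher index leaves a far wider band of guided angles, which is what lets the waveguide carry a wider field of view.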

Orion is unmistakably a pair of glasses in both look and feel—complete with transparent lenses. Unlike MR headsets or other AR glasses today, you can still see each other’s real eyes and expressions, so you can be present and share the experience with the people around you. Dozens of innovations were required to get the industrial design down to a contemporary glasses form factor that you’d be comfortable wearing every day. Orion is a feat of miniaturization—the components are packed down to a fraction of a millimeter. And we managed to embed seven tiny cameras and sensors in the frame rims.

We had to maintain optical precision at one-tenth the thickness of a strand of human hair. And the system can detect tiny amounts of movement—like the frames expanding or contracting as room temperature changes—and digitally correct the optical alignment, all within milliseconds. We made the frames out of magnesium—the same material used in F1 race cars and spacecraft—because it’s lightweight yet rigid, so it keeps the optical elements in alignment and efficiently conducts heat away.
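
To see why active correction is unavoidable, consider illustrative numbers (ours, not Orion’s specs): magnesium expands by roughly 26 × 10⁻⁶ per kelvin, so over a 140 mm frame, a 10-degree swing in room temperature shifts the geometry by

\[ \Delta L = \alpha L\,\Delta T \approx (26 \times 10^{-6}\,\mathrm{K}^{-1})(0.14\,\mathrm{m})(10\,\mathrm{K}) \approx 36\,\mu\mathrm{m}, \]

several times the roughly 10-micron tolerance implied by one-tenth the thickness of a strand of human hair, hence the need to measure and digitally compensate in real time.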

Heating and Cooling

Once we cracked the display (no pun intended) and overcame the physics problems, we had to navigate the challenge of pairing powerful compute with very low power consumption and limited room for heat dissipation. Unlike today’s MR headsets, you can’t stuff a fan into a pair of glasses. So we had to get creative. A lot of the materials used to cool Orion are similar to those used by NASA to cool satellites in outer space.

We built highly specialized custom silicon that’s extremely power efficient and optimized for our AI, machine perception, and graphics algorithms. We built multiple custom chips and dozens of highly custom silicon IP blocks inside those chips. That lets us take the algorithms necessary for hand and eye tracking, as well as the simultaneous localization and mapping (SLAM) technology—work that normally takes hundreds of milliwatts of power, and thus generates a corresponding amount of heat—and shrink it down to just a few dozen milliwatts.
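
A little arithmetic shows why that reduction matters so much for an all-day wearable. The battery capacity and draw figures below are hypothetical stand-ins, not Orion’s actual numbers:

BATTERY_WH = 1.0  # assumed glasses-class battery capacity, in watt-hours

def runtime_hours(draw_mw, battery_wh=BATTERY_WH):
    """Hours of runtime for a constant power draw given in milliwatts."""
    return battery_wh * 1000.0 / draw_mw  # 1 Wh = 1000 mWh

# Perception stack alone, ignoring every other subsystem:
print(runtime_hours(400.0))  # "hundreds of milliwatts" -> 2.5 hours
print(runtime_hours(40.0))   # "a few dozen milliwatts" -> 25.0 hours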

And yes, custom silicon continues to play a critical role in product development at Reality Labs, despite what you may have read elsewhere. 😎

Effortless EMG

Every new computing platform brings with it a paradigm shift in the way we interact with our devices. The invention of the mouse helped pave the way for the graphical user interfaces (GUIs) that dominate our world today, and smartphones didn’t start to really gain traction until the advent of the touch screen. And the same rule holds true for wearables.

We’ve spoken about our work with electromyography, or EMG, for years, driven by our belief that input for AR glasses needs to be fast, convenient, reliable, subtle, and socially acceptable. And now, that work is ready for prime time.

Orion’s input and interaction system seamlessly combines voice, eye gaze, and hand tracking with an EMG wristband that lets you swipe, click, and scroll with ease.

It works—and feels—like magic. Imagine taking a photo on your morning jog with a simple tap of your fingertips or navigating menus with barely perceptible movements of your hands. Our wristband combines a high-performance textile with embedded EMG sensors to detect the electrical signals generated by even the tiniest muscle movements. An on-device ML processor then interprets those EMG signals to produce input events that are transmitted wirelessly to the glasses. The system adapts to you, so it gets better and more capable at recognizing the most subtle gestures over time. And today, we’re sharing more about our support for external research focused on expanding the equity and accessibility potential of EMG wristbands.
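
A minimal sketch of that signal path, under our own assumptions (electrode count, window size, and a simple linear model standing in for the on-device ML processor; none of this is Meta’s actual implementation):

import numpy as np

N_CHANNELS = 16   # assumed number of EMG electrodes
WINDOW = 200      # samples per window, e.g. 100 ms at 2 kHz (assumed)
GESTURES = ["rest", "pinch", "swipe", "double_tap"]

def featurize(window):
    """Classic surface-EMG features: per-channel mean absolute value and RMS."""
    mav = np.mean(np.abs(window), axis=0)
    rms = np.sqrt(np.mean(window ** 2, axis=0))
    return np.concatenate([mav, rms])

def on_window(window, weights, bias, send_event):
    """Classify one window of samples and emit a wireless input event."""
    scores = featurize(window) @ weights + bias   # linear stand-in model
    gesture = GESTURES[int(np.argmax(scores))]
    if gesture != "rest":
        send_event(gesture)   # e.g., a radio packet to the glasses

# Example with random stand-in weights; in practice these come from training.
rng = np.random.default_rng(0)
on_window(rng.standard_normal((WINDOW, N_CHANNELS)),
          rng.standard_normal((2 * N_CHANNELS, len(GESTURES))),
          rng.standard_normal(len(GESTURES)),
          print)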

Meet the Wireless Compute Puck

True AR glasses must be wireless, and they must also be small. So we built a wireless compute puck for Orion. It takes some of the load off the glasses, enabling longer battery life and a better form factor while keeping latency low.

The glasses run all of the hand tracking, eye tracking, SLAM, and specialized AR world-locking graphics algorithms, while the app logic runs on the puck to keep the glasses as lightweight and compact as possible.

The puck has dual processors, including one custom designed right here at Meta, and it provides the compute power necessary for low-latency graphics rendering, AI, and some additional machine perception.
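
Sketched as runnable pseudocode, the division of labor looks roughly like this; the message types and function names are our invention, and the real protocol is not public:

from dataclasses import dataclass

@dataclass
class PerceptionUpdate:      # glasses -> puck, every frame
    head_pose: tuple         # 6DOF pose from on-glasses SLAM
    eye_gaze: tuple          # gaze vector from eye tracking
    hand_joints: list        # hand-tracking keypoints

@dataclass
class RenderedLayer:         # puck -> glasses
    frame: bytes             # compressed frame (custom codec)
    anchor_pose: tuple       # pose the content was rendered against

def reproject(frame, anchor_pose, latest_pose):
    """Stub for late-stage correction that keeps content world-locked."""
    return frame             # a real system would warp against latest_pose

def glasses_frame_loop(sensors, radio, display):
    """Per-frame loop on the glasses: sense, transmit, correct, display."""
    update = PerceptionUpdate(sensors.slam(), sensors.gaze(), sensors.hands())
    radio.send(update)                     # app logic runs on the puck
    layer = radio.receive()
    display.show(reproject(layer.frame, layer.anchor_pose, sensors.slam()))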

And since it’s small and sleek, you can comfortably toss the puck in a bag or your pocket and go about your business—no strings attached.

AR Experiences

Of course, as with any piece of hardware, it’s only as good as the things you can do with it. And while it’s still early days, the experiences afforded by Orion are an exciting glimpse of what’s to come.

https://youtube.com/watch?v=HkdSv3QPhNw

We’ve got our smart assistant, Meta AI, running on Orion. It understands what you’re looking at in the physical world and can help you with useful visualizations. Orion uses the same Llama model that powers AI experiences running on Ray-Ban Meta smart glasses today, plus custom research models to demonstrate potential use cases for future wearables development.

You can take a hands-free video call on the go to catch up with friends and family in real time, and you can stay connected on WhatsApp and Messenger to view and send messages. No need to pull out your phone, unlock it, find the right app, and let your friend know you’re running late for dinner—you can do it all through your glasses.

https://youtube.com/watch?v=el7lUVvu8Bo

You can play shared AR games with family on the other side of the country or with your friend on the other side of the couch. And Orion’s large display lets you multitask with multiple windows to get stuff done without having to lug around your laptop.

The experiences available on Orion today will help chart the roadmap for our consumer AR glasses line in the future. Our teams will continue to iterate and build new immersive social experiences, alongside our developer partners, and we can’t wait to share what comes next.

A Purposeful Product Prototype

While Orion won’t make its way into the hands of consumers, make no mistake: This is not a research prototype. It’s the most polished product prototype we’ve ever developed—and it’s truly representative of something that could ship to consumers. Rather than rushing to put it on shelves, we decided to focus on internal development first, which means we can keep building quickly and continue to push the boundaries of the technology and experiences.

And that means we’ll arrive at an even better consumer product faster.

What Comes Next

Two major obstacles have long stood in the way of mainstream consumer-grade AR glasses: the technological breakthroughs necessary to deliver a large display in a compact glasses form factor, and the need for useful, compelling AR experiences that can run on those glasses. Orion is a huge milestone, delivering true AR experiences running on reasonably stylish hardware for the very first time.

Now that we’ve shared Orion with the world, we’re focused on a few things:

  • Tuning the AR display quality to make the visuals even sharper
  • Optimizing wherever we can to make the form factor even smaller
  • Building at scale to make them more affordable

In the next few years, you can expect to see new devices from us that build on our R&D efforts. And a number of Orion’s innovations have been extended to our current consumer products today as well as our future product roadmap. We’ve optimized some of our spatial perception algorithms, which are running on both Meta Quest 3S and Orion. While the eye gaze and subtle gestural input system was originally designed for Orion, we plan to use it in future products. And we’re exploring the use of EMG wristbands across future consumer products as well.

Orion isn’t just a window into the future—it’s a look at the very real possibilities within reach today. We built it in our pursuit of what we do best: helping people connect. From Ray-Ban Meta glasses to Orion, we’ve seen the good that can come from letting people stay more present and empowered in the physical world, even while tapping into all the added richness the digital world has to offer.

We believe you shouldn’t have to choose between the two. And with the next computing platform, you won’t have to.

