
Crystal Clarity: Our Silicon Carbide Waveguides and the Path to Orion's Large FoV


Back in 2019, the Orion team prepared an important demo for Meta Founder & CEO Mark Zuckerberg, showcasing potential waveguides for augmented reality glasses—a pivotal moment when theoretical calculations on paper were brought to life. And it was a demo that changed everything.

“Wearing the glasses with glass-based waveguides and multiple plates, it felt like you were in a disco,” recalls Optical Scientist Pasqual Rivera. “There were rainbows everywhere, and it was so distracting—you weren’t even looking at the AR content. Then, you put on the glasses with silicon carbide waveguides, and it was like you were at the symphony listening to a quiet, classical piece. You could actually pay attention to the full experience of what we were building. It was a total game changer.”

Yet as clear (pun intended) as the choice of silicon carbide as a substrate seems today, when we first started down the road to AR glasses a decade ago, it was anything but.

“Silicon carbide is normally heavily nitrogen-doped,” says Rivera. “It’s green, and if it gets thick enough, it looks black. There’s no way you could make an optical lens out of it. It’s an electronic material. There’s a reason it’s that color, and it’s because of electronic properties.”

“Silicon carbide has been around as a material for a long time,” agrees AR Waveguides Tech Lead Giuseppe Calafiore. “Its main application is high-power electronics. Take electric vehicles: All EVs require a chip—but that chip must also be capable of very high power, moving the wheels, and driving this thing. It turns out you can’t do it with the regular silicon substrate, which is what makes the chips that we use in our computers and electronics. You need a platform that allows you to go through high currents, high power, and that material is silicon carbide.”

Until discussions around renewable energy sources started to heat up fairly recently, the market for these high-power chipsets was nowhere near the size of the market for consumer electronics chips. Silicon carbide has always been expensive, and there wasn’t much incentive to bring costs down because, at the size of chip you make for a car, the price of the substrate was tolerable.

“But it turns out that silicon carbide also has some of the properties that we need for waveguides and optics,” Calafiore says. “The refractive index is the key property that we care about. And silicon carbide has a high refractive index, which means that it’s capable of channeling in and outputting a large quantity of optical data. You can think of it as optical bandwidth—just like you have bandwidth for the internet, and you want it to be large enough so that you can send huge amounts of data through that channel. The same goes for optical devices.”

The higher a material’s refractive index, the higher its étendue, so you can send more optical data through that channel.

“The channel in our case is our waveguide, and a larger étendue translates to a larger field of view,” explains Calafiore. “The larger the refractive index of a material, the larger field of view the display can support.”
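Calafiore's point can be made concrete with a back-of-the-envelope sketch. Everything below is an illustrative assumption (first-order in-coupling, a field of view symmetric about the grating vector, and a practical 75-degree ceiling on guided angles), not an Orion design parameter, but it shows how the supportable field of view grows with refractive index:

```python
import math

def max_monocular_fov_deg(n, theta_max_deg=75.0):
    """Rough one-dimensional field-of-view limit (in degrees) for a
    single-plate diffractive waveguide of refractive index n.

    Assumes first-order in-coupling, where the grating equation
    n*sin(theta_glass) = sin(theta_air) + wavelength/pitch maps the
    air-side field onto guided angles, and requires guided rays to stay
    inside the TIR band [arcsin(1/n), theta_max_deg].
    """
    # Usable width of n*sin(theta) between the TIR cutoff (1.0) and the
    # practical upper bound on guided angles.
    sin_band = n * math.sin(math.radians(theta_max_deg)) - 1.0
    half_sin = min(1.0, sin_band / 2.0)  # half-field, centered on the grating vector
    return 2.0 * math.degrees(math.asin(half_sin))

for n in (1.8, 2.3, 2.7):  # glass, lithium niobate, silicon carbide
    print(f"n = {n}: ~{max_monocular_fov_deg(n):.0f} degrees")
```

Under these toy assumptions the one-dimensional limit climbs from roughly 43 degrees at n = 1.8 to well past 100 degrees at n = 2.7, which is the sense in which a higher-index substrate buys field of view.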

The Road to the Right Refractive Index

When Calafiore first joined what was then Oculus Research in 2016, the highest refractive index glass the team had at its disposal was 1.8—which required stacking multiple plates to achieve the desired field of view. Undesirable optical artifacts aside, the assembly line became increasingly complicated as the first two waveguides had to be perfectly aligned, and then that stack had to be perfectly aligned with a third waveguide.

“Not only was it expensive, it was immediately obvious that you couldn’t have three pieces of glass per lens in a pair of glasses,” Calafiore recalls. “They were too heavy, and the thickness was prohibitively large and ugly—nobody would buy it. So we went back to square one: trying to boost the refractive index of the substrate to reduce the number of plates needed.”

The first material the team looked into was lithium niobate, which has a refractive index of roughly 2.3—quite a bit higher than the glass at 1.8.

“We realized that we only had to stack two plates and could maybe even manage with one plate to still cover the field of view,” says Calafiore. “Almost in parallel, we started looking at other materials—which is how we figured out alongside our suppliers in 2019 that silicon carbide, in its purest form, is actually very transparent. It also happens to have the highest refractive index known for an optical application, which is 2.7.”

That’s a 17.4% increase over lithium niobate and a 50% bump over glass, for those keeping score at home.
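Those figures follow directly from the three indices quoted above:

```python
# Refractive indices quoted in this article.
n_glass, n_linbo3, n_sic = 1.8, 2.3, 2.7

# Relative increase of silicon carbide over each alternative.
print(f"vs. lithium niobate: +{(n_sic / n_linbo3 - 1) * 100:.1f}%")  # +17.4%
print(f"vs. glass:           +{(n_sic / n_glass - 1) * 100:.0f}%")   # +50%
```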

“With a few modifications to the very same equipment that was used in the industry already, it was possible to get transparent silicon carbide,” Calafiore says. “You could just change the process, be a lot more careful, and instead of optimizing for electronic properties, you optimize for optical properties: transparency, uniformity of the refractive index, etc.”

The Potential Cost of Compromise

At that time, the Reality Labs team was the first to even attempt moving from opaque silicon carbide wafers to transparent ones. And because silicon carbide is one of the hardest known materials, it essentially requires diamond tooling to cut or polish it. As a result, the non-recurring engineering costs were very high, so the resulting substrate was quite expensive.

While more cost-effective alternatives exist, as with any technology, they each have tradeoffs. And as the field of view increases towards Orion’s industry-leading field of view of approximately 70 degrees, new issues arise like ghost images and rainbows.

“Finding the optimum solution for a wide field of view AR display is fraught with tradeoffs between performance and costs,” explains Director of Research Science Barry Silverstein. “Costs can often be driven down, but if performance doesn’t cut it, costs won’t ultimately matter.”

Ghost images are like visual echoes of the primary image being projected onto the display. Rainbows are colorful streaks of light that are created when ambient light reflects off of the waveguide. “Say you’re driving at night with moving car lights around you,” says Silverstein. “You’re going to have rainbows that move, too. Or if you’re out on the beach playing volleyball and the sun’s shining, you’re going to get a rainbow streak that’s moving with you and you’ll miss your shot. And one of the miraculous properties of silicon carbide is that it gets rid of those rainbows.”

“The other advantage of silicon carbide that none of the other materials have is thermal conductivity,” Calafiore adds. “Plastic is a terrible thermal conductor. Glass, lithium niobate, same thing. You go to silicon carbide and it’s transparent, looks like glass, and guess what: It conducts heat.”

So in July 2020, the team determined that silicon carbide was the optimum choice for three primary reasons: It led to an improved form factor because it required only a single plate and smaller mounting structures, it had better optical properties, and it was lighter than dual-plate glass.

The Secret of Slant Etch

With the material in mind, the next nut to crack was the fabrication of the waveguides—and specifically, an unconventional grating technique called slant etch.

“The grating is the nanostructure that in-couples and out-couples the light from the lens,” explains Calafiore. “And for silicon carbide to work, the grating needs to be slant etch. Instead of being vertical, you want the lines of the grating to be sloped diagonally.”

“We were the first ones to do slant etch directly on the devices,” says Research Manager Nihar Mohanty. “The whole industry used to rely on nano imprint, which doesn’t work for substrates with such a high refractive index. That’s why no one else in the world had thought about doing silicon carbide.”

But because slant etch is an immature technology, most semiconductor chip suppliers and factories lack the necessary tools.

“Back in 2019, my manager at the time, Matt Colburn, and I established our own facility because nothing in the world could produce etched silicon carbide waveguides, and we needed somewhere to prove out the technology beyond the lab scale,” explains Mohanty. “It was a huge investment, and we established the whole pipeline there. The tooling was custom-made for us by our partners, and the process was developed in-house at Meta, though our systems are research-grade because there were no manufacturing-grade systems out there. We worked with a manufacturing partner to develop manufacturing-grade slant etch tooling and processes. And now that we’ve shown what’s possible with silicon carbide, we want others in the industry to start making tools of their own.”

The more companies that invest in optical-grade silicon carbide and develop equipment, the stronger the category of consumer AR glasses will become.

No Longer Chasing Rainbows

While technological inevitability is a myth, the stars certainly seem to be aligning in silicon carbide’s favor. And although the team continues to investigate alternatives, there’s a strong sense that the right people have come together at the right time under the right market conditions to build AR glasses using this material.

“Orion proved that silicon carbide is a viable option for AR glasses,” says Silverstein, “and we’re now seeing interest across the supply chain on three different continents where they’re heavily pursuing this as an opportunity. Silicon carbide will come out on top. It’s just a matter of time in my book.”

And a lot can happen in that time—much like the way the tables have turned since we grew our first clear silicon carbide crystals.

“All of these silicon carbide manufacturers had cranked up the supply massively in response to the expected EV boom,” notes Calafiore. “Right now, there’s an overcapacity that didn’t exist when we were building Orion. So now, because supply is high and demand is low, the cost of the substrate has started to come down.”

“Suppliers are very excited by the new opportunity of manufacturing optical-grade silicon carbide—after all, each waveguide lens represents a large amount of material relative to an electronic chip, and all of their existing capabilities apply to this new space,” adds Silverstein. “Filling your factory is essential, and scaling your factory is the dream. The size of the wafer matters, too: The bigger the wafer, the lower the cost—but the complexity of the process also goes up. That said, we’ve seen suppliers move from four-inch to eight-inch wafers, and some are working on precursors to 12-inch wafers, which would yield exponentially more pairs of AR glasses.”
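Silverstein's wafer arithmetic is easy to sketch: usable area grows with the square of the diameter, so each jump in wafer size multiplies the number of lens blanks per substrate. The lens diameter and utilization factor below are hypothetical illustration values, not supplier figures:

```python
import math

def lenses_per_wafer(wafer_diam_in, lens_diam_mm=45.0, utilization=0.7):
    """Crude count of round waveguide lens blanks per wafer.

    lens_diam_mm and the 70% area-utilization factor are made-up
    illustration values; real yields depend on layout and defects.
    """
    wafer_area_mm2 = math.pi * (wafer_diam_in * 25.4 / 2) ** 2
    lens_area_mm2 = math.pi * (lens_diam_mm / 2) ** 2
    return int(wafer_area_mm2 * utilization / lens_area_mm2)

for d in (4, 8, 12):
    print(f'{d}-inch wafer: ~{lenses_per_wafer(d)} lenses')
```

Doubling the diameter quadruples the area, so moving from four-inch to eight-inch wafers roughly quadruples output per wafer, and 12-inch wafers give roughly nine times the four-inch count.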

These advancements should help continue to drive costs down. It’s still early days, but the future is coming into focus.

“At the beginning of any new technological revolution, you try a bunch of things,” says Calafiore. “Look at television: We started with cathode-ray tubes, and then we went to the LED plasma TVs and now microLEDs. We went through several different technologies and architectures. As you pathfind, lots of paths end up nowhere, but there are some that you just keep coming back to as the most promising. We’re not at the end of the road, and we can’t do it alone, but silicon carbide is a wonder material that is well worth the investment.”

“The world is awake now,” adds Silverstein. “We’ve successfully shown that silicon carbide can flex across electronics and photonics. It’s a material that could have future applications in quantum computing. And we’re seeing signs that it’s possible to significantly reduce the cost. There’s a lot of work left to be done, but the potential upside here is huge.”


Learn more about silicon carbide in Photonics Spectra.

For more information about Orion, check out these blog posts:


Orion: AR Glasses Have Arrived

TL;DR:

  • Today at Connect, Mark Zuckerberg unveiled Orion—our first pair of true AR glasses, previously codenamed Project Nazare.
  • With an industry-leading field of view, silicon carbide lenses, intricate waveguides, uLED projectors, and more, Orion is our most advanced and most polished product prototype to date.
  • Now, we’re continuing product optimizations and lowering costs as we drive toward a scalable consumer device that will revolutionize the way people interact with the world.

“We are building AR glasses.”

Five simple words, spoken five years ago. With them, we planted a flag on our vision of a future where we no longer have to make the false choice between a world of information at our fingertips and the physical world all around us.

And now, five years later, another five words that stand to once again change the game:

We have built AR glasses.

https://youtube.com/watch?v=2CJsnyS8u3c%3Fsi%3DTr77gOKEq3PnA7-k%26controls%3D0%3Fautoplay%3D0%26color%3Dwhite%26cc_lang_pref%3Den%26cc_load_policy%3D0%26enablejsapi%3D1%26frameborder%3D0%26hl%3Den%26rel%3D0%26origin%3Dhttps%253A%252F%252Fwww.meta.com

At Meta, our mission is simple: give people the power to build community and bring the world closer together. And at Reality Labs, we build tools that help people feel connected anytime, anywhere. That’s why we’re working to build the next computing platform that puts people at the center so they can be more present, connected, and empowered in the world.

Ray-Ban Meta glasses have demonstrated the power of giving people hands-free access to key parts of their digital lives from their physical ones. We can talk to a smart AI assistant, connect with friends, and capture the moments that matter—all without ever having to pull out a phone. These stylish glasses fit seamlessly into our everyday lives, and people absolutely love them.

Yet while Ray-Ban Meta opened up an entirely new category of display-less glasses super-charged by AI, the XR industry has long dreamt of true AR glasses—a product that combines the benefits of a large holographic display and personalized AI assistance in a comfortable, stylish, all-day wearable form factor.

And today, we’ve pushed that dream closer to reality with the unveiling of Orion, which we believe is the most advanced pair of AR glasses ever made. In fact, it might be the most challenging consumer electronics device produced since the smartphone. Orion is the result of breakthrough inventions in virtually every field of modern computing—building on the work we’ve been doing at Reality Labs for the last decade. It’s packed with entirely new technologies, including the most advanced AR display ever assembled and custom silicon that enables powerful AR experiences to run on a pair of glasses using a fraction of the power and weight of an MR headset.

Orion’s input system seamlessly combines voice, eye gaze, and hand tracking with an EMG wristband that lets you swipe, click, and scroll while keeping your arm resting comfortably by your side, letting you stay present in the world and with the people around you as you interact with rich digital content.

Beginning today at Connect and continuing throughout the year, we’re opening up access to our Orion product prototype for Meta employees and select, external audiences so our development team can learn, iterate, and build towards our consumer AR glasses product line, which we plan to begin shipping in the near future.

Why AR Glasses?

There are three primary reasons why AR glasses are key to unlocking the next great leap in human-oriented computing.

  1. They enable digital experiences that are unconstrained by the limits of a smartphone screen. With large holographic displays, you can use the physical world as your canvas, placing 2D and 3D content and experiences anywhere you want.
  2. They seamlessly integrate contextual AI that can sense and understand the world around you in order to anticipate and proactively address your needs.
  3. They’re lightweight and great for both indoor and outdoor use—and they let people see each other’s real face, real eyes, and real expressions.

That’s the north star our industry has been building towards: a product combining the convenience and immediacy of wearables with a large display, high-bandwidth input, and contextualized AI in a form factor that people feel comfortable wearing in their daily lives.

Compact Form Factor, Compounded Challenges

For years, we’ve faced a false choice—either virtual and mixed reality headsets that enable deep, immersive experiences in a bulky form factor or glasses that are ideal for all-day use but don’t offer rich visual apps and experiences given the lack of a large display and corresponding compute power.

But we want it all, without compromises. For years, we’ve been hard at work taking the incredible spatial experiences afforded by VR and MR headsets and miniaturizing the technology necessary to deliver those experiences into a pair of lightweight, stylish glasses. Nailing the form factor, delivering holographic displays, developing compelling AR experiences, and creating new human-computer interaction (HCI) paradigms—and doing it all in one cohesive product—is one of the most difficult challenges our industry has ever faced. It was so challenging that we thought we had less than a 10 percent chance of pulling it off successfully.

Until now.

A Groundbreaking Display in an Unparalleled Form Factor

At approximately 70 degrees, Orion has the largest field of view in the smallest AR glasses form factor to date. That field of view unlocks truly immersive use cases for Orion, from multitasking windows and big-screen entertainment to lifesize holograms of people—all digital content that can seamlessly blend with your view of the physical world.

For Orion, field of view was our holy grail. We were bumping up against the laws of physics and had to bend beams of light in ways they don’t naturally bend—and we had to do it in a power envelope measured in milliwatts.

Instead of glass, we made the lenses from silicon carbide—a novel application for AR glasses. Silicon carbide is incredibly lightweight, it doesn’t result in optical artifacts or stray light, and it has a high refractive index—all optical properties that are key to a large field of view. The waveguides themselves have really intricate and complex nano-scale 3D structures to diffract or spread light in the ways necessary to achieve this field of view. And the projectors are uLEDs—a new type of display technology that’s super small and extremely power efficient.
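To get a feel for the scale of those structures, the first-order grating equation fixes the pitch an in-coupler needs to steer light to a given guided angle. The 530 nm wavelength and 55-degree guided angle below are assumed example values, not Orion's actual design:

```python
import math

def incoupler_pitch_nm(wavelength_nm, n, theta_glass_deg):
    """First-order grating pitch that diffracts normally incident light
    to angle theta_glass inside a medium of refractive index n, from
    the grating equation n * sin(theta_glass) = wavelength / pitch."""
    return wavelength_nm / (n * math.sin(math.radians(theta_glass_deg)))

# Green light into silicon carbide at a 55-degree guided angle, which is
# comfortably inside the TIR band (arcsin(1/2.7) is about 21.7 degrees).
print(f"{incoupler_pitch_nm(530, 2.7, 55):.0f} nm")  # a few hundred nanometers
```

The result lands in the low hundreds of nanometers, roughly half the wavelength of the light being steered, which is why these gratings have to be fabricated as nano-scale structures.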

Orion is unmistakably a pair of glasses in both look and feel—complete with transparent lenses. Unlike MR headsets or other AR glasses today, you can still see each other’s real eyes and expressions, so you can be present and share the experience with the people around you. Dozens of innovations were required to get the industrial design down to a contemporary glasses form factor that you’d be comfortable wearing every day. Orion is a feat of miniaturization—the components are packed down to a fraction of a millimeter. And we managed to embed seven tiny cameras and sensors in the frame rims.

We had to maintain optical precision at one-tenth the thickness of a strand of human hair. And the system can detect tiny amounts of movement—like the frames expanding or contracting in various room temperatures—and then digitally correct for the necessary optical alignment, all within milliseconds. We made the frames out of magnesium—the same material used in F1 race cars and spacecraft—because it’s lightweight yet rigid, so it keeps the optical elements in alignment and efficiently conducts away heat.

Heating and Cooling

Once we cracked the display (no pun intended) and overcame the physics problems, we had to navigate the challenge of really powerful compute alongside really low power consumption and the need for heat dissipation. Unlike today’s MR headsets, you can’t stuff a fan into a pair of glasses. So we had to get creative. A lot of the materials used to cool Orion are similar to those used by NASA to cool satellites in outer space.

We built highly specialized custom silicon that’s extremely power efficient and optimized for our AI, machine perception, and graphics algorithms. We built multiple custom chips and dozens of highly custom silicon IP blocks inside those chips. That lets us take the algorithms necessary for hand and eye tracking as well as simultaneous localization and mapping (SLAM) technology that normally takes hundreds of milliwatts of power—and thus generates a corresponding amount of heat—and shrink it down to just a few dozen milliwatts.

And yes, custom silicon continues to play a critical role in product development at Reality Labs, despite what you may have read elsewhere. 😎

Effortless EMG

Every new computing platform brings with it a paradigm shift in the way we interact with our devices. The invention of the mouse helped pave the way for the graphical user interfaces (GUIs) that dominate our world today, and smartphones didn’t start to really gain traction until the advent of the touch screen. And the same rule holds true for wearables.

We’ve spoken about our work with electromyography, or EMG, for years, driven by our belief that input for AR glasses needs to be fast, convenient, reliable, subtle, and socially acceptable. And now, that work is ready for prime time.

Orion’s input and interaction system seamlessly combines voice, eye gaze, and hand tracking with an EMG wristband that lets you swipe, click, and scroll with ease.

It works—and feels—like magic. Imagine taking a photo on your morning jog with a simple tap of your fingertips or navigating menus with barely perceptible movements of your hands. Our wristband combines a high-performance textile with embedded EMG sensors to detect the electrical signals generated by even the tiniest muscle movements. An on-device ML processor then interprets those EMG signals to produce input events that are transmitted wirelessly to the glasses. The system adapts to you, so it gets better and more capable at recognizing the most subtle gestures over time. And today, we’re sharing more about our support for external research focused on expanding the equity and accessibility potential of EMG wristbands.
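As a purely illustrative sketch of that signal chain (the real wristband uses an on-device ML model, not the fixed threshold assumed here), a minimal rectify-and-smooth gesture detector over a raw EMG trace might look like:

```python
import random

def emg_envelope(samples, alpha=0.05):
    """Full-wave rectify each EMG sample, then smooth with a one-pole
    low-pass filter to get a muscle-activity envelope."""
    env, out = 0.0, []
    for s in samples:
        env += alpha * (abs(s) - env)
        out.append(env)
    return out

def detect_events(samples, threshold=0.3):
    """Emit the sample indices where the envelope exceeds the threshold,
    standing in for the ML classifier that produces input events."""
    return [i for i, e in enumerate(emg_envelope(samples)) if e > threshold]

# Synthetic trace: 200 samples of rest noise, then a burst of muscle
# activity standing in for a fingertip tap.
random.seed(0)
rest = [random.gauss(0, 0.05) for _ in range(200)]
tap = [random.gauss(0, 1.0) for _ in range(200)]
events = detect_events(rest + tap)
print("first input event at sample", events[0])  # shortly after the burst begins
```

The envelope stays well below threshold during rest and rises within a few dozen samples of the burst, which is the basic property any low-latency EMG input system needs before a learned classifier refines it into distinct gestures.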

Meet the Wireless Compute Puck

True AR glasses must be wireless, and they must also be small. So we built a wireless compute puck for Orion. It takes some of the load off of the glasses, enabling longer battery life and a better form factor complete with low latency.

The glasses run all of the hand tracking, eye tracking, SLAM, and specialized AR world locking graphics algorithms while the app logic runs on the puck to keep the glasses as lightweight and compact as possible.

The puck has dual processors, including one custom designed right here at Meta, and it provides the compute power necessary for low-latency graphics rendering, AI, and some additional machine perception.

And since it’s small and sleek, you can comfortably toss the puck in a bag or your pocket and go about your business—no strings attached.

AR Experiences

Of course, as with any piece of hardware, it’s only as good as the things you can do with it. And while it’s still early days, the experiences afforded by Orion are an exciting glimpse of what’s to come.

https://youtube.com/watch?v=HkdSv3QPhNw%3F%26controls%3D0%3Fautoplay%3D0%26color%3Dwhite%26cc_lang_pref%3Den%26cc_load_policy%3D0%26enablejsapi%3D1%26frameborder%3D0%26hl%3Den%26rel%3D0%26origin%3Dhttps%253A%252F%252Fwww.meta.com

We’ve got our smart assistant, Meta AI, running on Orion. It understands what you’re looking at in the physical world and can help you with useful visualizations. Orion uses the same Llama model that powers AI experiences running on Ray-Ban Meta smart glasses today, plus custom research models to demonstrate potential use cases for future wearables development.

You can take a hands-free video call on the go to catch up with friends and family in real time, and you can stay connected on WhatsApp and Messenger to view and send messages. No need to pull out your phone, unlock it, find the right app, and let your friend know you’re running late for dinner—you can do it all through your glasses.

https://youtube.com/watch?v=el7lUVvu8Bo%3F%26controls%3D0%3Fautoplay%3D0%26color%3Dwhite%26cc_lang_pref%3Den%26cc_load_policy%3D0%26enablejsapi%3D1%26frameborder%3D0%26hl%3Den%26rel%3D0%26origin%3Dhttps%253A%252F%252Fwww.meta.com

You can play shared AR games with family on the other side of the country or with your friend on the other side of the couch. And Orion’s large display lets you multitask with multiple windows to get stuff done without having to lug around your laptop.

The experiences available on Orion today will help chart the roadmap for our consumer AR glasses line in the future. Our teams will continue to iterate and build new immersive social experiences, alongside our developer partners, and we can’t wait to share what comes next.

A Purposeful Product Prototype

While Orion won’t make its way into the hands of consumers, make no mistake: This is not a research prototype. It’s the most polished product prototype we’ve ever developed—and it’s truly representative of something that could ship to consumers. Rather than rushing to put it on shelves, we decided to focus on internal development first, which means we can keep building quickly and continue to push the boundaries of the technology and experiences.

And that means we’ll arrive at an even better consumer product faster.

What Comes Next

Two major obstacles have long stood in the way of mainstream consumer-grade AR glasses: the technological breakthroughs necessary to deliver a large display in a compact glasses form factor and the necessity of useful and compelling AR experiences that can run on those glasses. Orion is a huge milestone, delivering true AR experiences running on reasonably stylish hardware for the very first time.

Now that we’ve shared Orion with the world, we’re focused on a few things:

  • Tuning the AR display quality to make the visuals even sharper
  • Optimizing wherever we can to make the form factor even smaller
  • Building at scale to make them more affordable

In the next few years, you can expect to see new devices from us that build on our R&D efforts. And a number of Orion’s innovations have been extended to our current consumer products today as well as our future product roadmap. We’ve optimized some of our spatial perception algorithms, which are running on both Meta Quest 3S and Orion. While the eye gaze and subtle gestural input system was originally designed for Orion, we plan to use it in future products. And we’re exploring the use of EMG wristbands across future consumer products as well.

Orion isn’t just a window into the future—it’s a look at the very real possibilities within reach today. We built it in our pursuit of what we do best: helping people connect. From Ray-Ban Meta glasses to Orion, we’ve seen the good that can come from letting people stay more present and empowered in the physical world, even while tapping into all the added richness the digital world has to offer.

We believe you shouldn’t have to choose between the two. And with the next computing platform, you won’t have to.




Orion's Compute Puck: The Story Behind the Device That Helped Make Our AR Glasses Possible

Πέρυσι στο Connect, παρουσιάσαμε Orion-το πρώτο μας πραγματικό ζευγάρι γυαλιών AR. Το αποκορύφωμα των το έργο μας στα Reality Labs την τελευταία δεκαετία, το Orion συνδυάζει τα οφέλη μιας μεγάλης ολογραφικής οθόνης και της εξατομικευμένης βοήθειας τεχνητής νοημοσύνης σε έναν άνετο, φορητό παράγοντα μορφής για όλη την ημέρα. Έλαβε κάποια προσοχή για το κορυφαίο στον κλάδο οπτικό πεδίο, τους κυματοδηγούς καρβιδίου πυριτίου, τους προβολείς uLED και πολλά άλλα. Αλλά σήμερα, στρέφουμε την προσοχή μας σε έναν αφανή ήρωα του Orion: το compute puck.

Σχεδιασμένο για να χωράει εύκολα στην τσέπη ή την τσάντα σας, ώστε να μπορείτε να το έχετε μαζί σας σχεδόν παντού καθώς περνάτε την ημέρα σας, το puck αποφορτίζει την επεξεργαστική ισχύ του Orion για να τρέξει τη λογική των εφαρμογών και να επιτρέψει έναν πιο ελκυστικό, μικρότερο παράγοντα μορφής για τα γυαλιά. Συνδέεται ασύρματα με τα γυαλιά και το βραχιολάκι EMG για μια απρόσκοπτη εμπειρία.

Ρυθμίστε το και ξεχάστε το, σωστά;

Αλλά η ιστορία του πακ -ακόμη και ως πρωτότυπο- εμπλέκεται, με ένα δραματικό τόξο που δεν θα μπορούσατε ποτέ να μαντέψετε από την εμφάνισή του.

"Όταν κατασκευάζεις κάτι τέτοιο, αρχίζεις να μπαίνεις στα όρια της φυσικής", εξηγεί ο διευθυντής διαχείρισης προϊόντων Rahul Prasad. "Τα τελευταία 50 χρόνια, ο νόμος του Moore έχει κάνει τα πάντα μικρότερα, ταχύτερα και με χαμηλότερη ισχύ. Το πρόβλημα είναι ότι τώρα αρχίζεις να συναντάς όρια στο πόση θερμότητα μπορείς να αποβάλλεις, πόση μπαταρία μπορείς να συμπιέσεις και πόση απόδοση κεραίας μπορείς να χωρέσεις σε ένα αντικείμενο συγκεκριμένου μεγέθους".

Αν και η εκ των υστέρων γνώση μπορεί να είναι 20/20, οι δυνατότητες του πακ δεν ήταν αμέσως εμφανείς. Όταν είσαι ο πρώτος που κατασκευάζει κάτι, πρέπει να εξερευνήσεις κάθε πιθανότητα - χωρίς να αφήσεις καμία πέτρα αναποδογυρισμένη. Πώς φτιάχνεις κάτι που κάποιοι μπορεί να το θεωρήσουν ανεπιθύμητο αξεσουάρ και όχι κρίσιμο μέρος του πακέτου;

"Ξέραμε ότι το puck ήταν μια επιπλέον συσκευή που ζητούσαμε από τους ανθρώπους να κουβαλούν", σημειώνει ο Διευθυντής Μηχανικής Σχεδίασης Προϊόντων Jared Van Cleave, "οπότε διερευνήσαμε πώς να μετατρέψουμε το σφάλμα σε χαρακτηριστικό".

Τελικά, αυτό το ήθος απέδωσε τα μέγιστα, καθώς ο δίσκος συμπιέζει πολύ υπολογισμό (και ακόμη περισσότερο προσαρμοσμένο πυρίτιο σχεδιασμένο από τη Meta για την τεχνητή νοημοσύνη και τη μηχανική αντίληψη) σε ένα μικρό μέγεθος. Αυτό συνέβαλε καθοριστικά στο να περάσει το Orion από τη σφαίρα της επιστημονικής φαντασίας στην πραγματικότητα.

"Αν δεν είχατε το puck, δεν θα μπορούσατε να ζήσετε τις εμπειρίες που προσφέρει το Orion στον παράγοντα μορφής του -περίοδος", λέει ο Prasad. "Μια καλή εμπειρία AR απαιτεί πραγματικά υψηλές επιδόσεις: υψηλούς ρυθμούς καρέ, εξαιρετικά χαμηλή καθυστέρηση, λεπτομερή διαχείριση ασύρματων δικτύων και ενέργειας κ.λπ. Το puck και τα γυαλιά πρέπει να συν-σχεδιαστούν ώστε να συνεργάζονται πραγματικά στενά, όχι μόνο στο επίπεδο της εφαρμογής, αλλά και στο επίπεδο του λειτουργικού συστήματος, του υλικολογισμικού και του υλικού. Και ακόμη και αν κάποιος συν-σχεδιάσει ένα smartphone για να συνεργαστεί με γυαλιά AR, οι απαιτητικές απαιτήσεις επιδόσεων θα εξαντλούσαν την μπαταρία του τηλεφώνου και θα αφαιρούσαν υπολογιστική χωρητικότητα από τις περιπτώσεις χρήσης του τηλεφώνου. Από την άλλη πλευρά, το puck έχει τη δική του μπαταρία υψηλής χωρητικότητας, SoC υψηλής απόδοσης και έναν προσαρμοσμένο συν-επεξεργαστή AI σχεδιασμένο από την Meta, βελτιστοποιημένο για το Orion".

Φυσικά, το πακ δεν σχεδιάστηκε εν μία νυκτί. Απαιτήθηκαν χρόνια επαναληπτικής εργασίας.

"Δεν ξέραμε πώς οι άνθρωποι θα ήθελαν να αλληλεπιδράσουν με το Orion από την άποψη της εισόδου - δεν υπήρχε τίποτα που θα μπορούσαμε να χρησιμοποιήσουμε στην αγορά", λέει ο Product Manager Matt Resman. "Αν κοιτάξετε τα πρώτα πρωτότυπα γυαλιά μας, ήταν αυτά τα τεράστια ακουστικά που ζύγιζαν τρία ή τέσσερα κιλά. Και όταν προσπαθείς να δημιουργήσεις ένα προϊόν, είναι πολύ δύσκολο να κατανοήσεις την εμπειρία του χρήστη, αν δεν έχεις τον κατάλληλο παράγοντα μορφής. Με το puck, μπορέσαμε να δημιουργήσουμε πρωτότυπα πολύ γρήγορα και να αρχίσουμε να καταλαβαίνουμε πώς θα το χρησιμοποιούσαν οι άνθρωποι".

Με την κωδική ονομασία Omega τις πρώτες μέρες, το puck οραματίστηκε αρχικά από την τότε Oculus Research ως μια ζώνη σχήματος Ω που θα περνούσε γύρω από το λαιμό του χρήστη και θα ήταν συνδεδεμένη με τα γυαλιά...

... until new innovations from the Reality Labs wireless team, among others, let them cut the cord. That enabled a more portable form that could ride in a pocket or bag, which opened up a lot of possibilities.

“At that point, AR calling was still a primary use case,” explains Van Cleave. “The puck was where the holographic videos would be anchored. You’d set it on the table with its sensor bank facing you, imaging you, and then projecting the person you were talking to up from the puck’s surface for the call.”

“Orion connects people and brings us closer together,” says Resman. “One of the original ideas for the puck was to help create that sense of presence with other people and enable that richer form of communication.”

“It’s unlike anything you’ve ever seen. The device has the potential to create some really fun and unique interactions,” adds Industrial Designer Emron Henry. “The user experience feels a bit like releasing a genie from a bottle, with holograms seamlessly emerging from the device and dissolving back into it.”

As you can probably tell by now, there’s more capability inside the puck than what ultimately shipped as a feature. Beyond the sensors and cameras intended for AR calling, the puck has haptics and 6DOF sensors that could let it serve as a tracked controller for selecting and manipulating virtual objects and playing AR games. The team also explored adding capacitive and force touch so the puck could serve as a gamepad when held in either portrait or landscape orientation.
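
To make the controller idea concrete, here’s a minimal, hypothetical sketch. None of these names or thresholds come from Meta’s software; it simply illustrates how a 6DOF-tracked puck could drive object selection: the puck’s pose defines a pointing ray, and the virtual object closest to that ray within a small angular threshold gets picked.

```python
import math
from dataclasses import dataclass

# Hypothetical sketch, not Meta's SDK: a 6DOF pose defines a pointing ray,
# and we select whichever hologram lies closest to that ray.

@dataclass
class Pose6DOF:
    position: tuple   # (x, y, z) in meters, world frame
    forward: tuple    # unit vector the puck is pointing along

def select_target(pose: Pose6DOF, objects: dict, max_angle_deg: float = 5.0):
    """Return the id of the object nearest the pointing ray, or None if
    nothing falls within the angular threshold."""
    best_id, best_angle = None, max_angle_deg
    px, py, pz = pose.position
    fx, fy, fz = pose.forward
    for obj_id, (ox, oy, oz) in objects.items():
        dx, dy, dz = ox - px, oy - py, oz - pz
        norm = math.sqrt(dx * dx + dy * dy + dz * dz) or 1e-9
        # Angle between the puck's forward vector and the direction to the object
        cos_a = max(-1.0, min(1.0, (dx * fx + dy * fy + dz * fz) / norm))
        angle = math.degrees(math.acos(cos_a))
        if angle < best_angle:
            best_id, best_angle = obj_id, angle
    return best_id

# A puck at the origin pointing down +Z, with two holograms in the scene:
pose = Pose6DOF(position=(0, 0, 0), forward=(0, 0, 1))
scene = {"menu": (0.05, 0.0, 2.0), "pong_ball": (1.5, 0.0, 2.0)}
```

A capacitive or force-touch press could then confirm the highlighted selection, much like a mouse click.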

“We kept debating the need to carry this extra thing around,” says Van Cleave. “How do we make it more useful? There was a whole workstream around that. And at one point, the hypothesis was that AR gaming would be the killer use case.”

From very early on, we knew we had to explore the puck’s potential to do things phones can’t. We ultimately landed on eye gaze, EMG, and hand tracking for AR games like Stargazer and Pong, but in Orion’s early days we prototyped various demos and games that used the puck as a controller.


Renders from explorations of the puck as a 6DOF controller for AR games like Pong.

“Because it’s not a phone, it gave us a lot of design freedom,” adds Prasad. “It can be thicker, and I can make it more rounded so it fits comfortably in your hand as a controller. It’s pretty incredible that the puck is actually smaller than the average phone, yet it’s more powerful than a phone, because it has a Meta-designed co-processor for AI and machine perception.”

The team dug into how a quality AR game might differ from an MR game or a console game. That meant exploring different capabilities for an AR game controller, including joysticks, physical button layouts, trigger buttons, and more.

“We didn’t end up building any of those things,” Van Cleave notes. “We prototyped them, but we never went to full production. We wanted to keep it simple. We wanted to do those soft interfaces, but not build physical, mechanical buttons.”

And while the haptics and sensors weren’t enabled in the final product prototype, they played an integral role during development, letting teams file bugs simply by tapping the top of the device a few times to trigger a bug report.
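
As a rough illustration of how such a gesture might be detected (purely a sketch with invented thresholds, not Meta’s implementation), a triple tap shows up as three short spikes in accelerometer magnitude within a short window:

```python
# Hedged sketch: detect a triple-tap on the device from accelerometer
# magnitude spikes, the kind of gesture used to trigger a bug report.
# Sample rate, thresholds, and windows are illustrative, not Meta's values.

def detect_triple_tap(accel_mag, rate_hz=100, threshold=2.5, window_s=1.0):
    """accel_mag: acceleration magnitudes in g, sampled at rate_hz.
    Returns True if three spikes above `threshold` occur within any
    `window_s`-second window, with a short refractory gap between taps."""
    refractory = int(0.1 * rate_hz)   # ignore samples right after a tap
    taps, i = [], 0
    while i < len(accel_mag):
        if accel_mag[i] > threshold:
            taps.append(i)
            i += refractory           # skip the ringing from this tap
        else:
            i += 1
    window = int(window_s * rate_hz)
    return any(taps[k + 2] - taps[k] <= window for k in range(len(taps) - 2))
```

A signal that sits near 1 g with three brief 3 g spikes would register as a triple tap; a resting signal would not.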

As AI began to take center stage as a core use case, compute came into ever sharper focus. Ultimately, the puck houses Orion’s wireless connectivity, compute power, and battery capacity, a huge technical feat in its own right, which helps reduce the weight and form factor of the glasses while dissipating far more heat thanks to its surface area.

Throughout its evolution, one thing has stayed the same: There’s more to the puck than meets the untrained eye. Embracing unconventional ideas allowed our teams to explore, push boundaries, and build the future.

“We’re defining a category that doesn’t exist yet,” Henry notes. “As you’d expect with R&D, there were starts and stops along the way. How will users expect to interact with holograms? Would they prefer an AR remote, or are hand tracking, eye gaze, and EMG enough for input? What feels intuitive, low-friction, familiar, and useful?”

“Rather than seeing the puck as just a brick, we asked what else we could deliver to further differentiate it from phones and justify why you’d want to carry it with you,” Resman acknowledges. “Does its raw compute power and practical design, which helped unlock a glasses form factor you can realistically wear all day, ultimately deliver enough value? It’s our job to help answer those questions.”

“Early on, we shifted gears and focused on what we can do that a phone can’t,” adds Van Cleave. “Phones have to have the screen, and they have to have the particular physical button layout users expect. We don’t have those constraints. Our compute puck can be whatever we want it to be.”


For more information about Orion, check out these blog posts:


Orion: AR Glasses Have Arrived

TL;DR:

  • Today at Connect, Mark Zuckerberg unveiled Orion—our first pair of true AR glasses, previously codenamed Project Nazare.
  • With an industry-leading field of view, silicon carbide lenses, intricate waveguides, uLED projectors, and more, Orion is our most advanced and most polished product prototype to date.
  • Now, we’re continuing product optimizations and lowering costs as we drive toward a scalable consumer device that will revolutionize the way people interact with the world.

“We are building AR glasses.”

Five simple words, spoken five years ago. With them, we planted a flag on our vision of a future where we no longer have to make the false choice between a world of information at our fingertips and the physical world all around us.

And now, five years later, another five words that stand to once again change the game:

We have built AR glasses.

https://youtube.com/watch?v=2CJsnyS8u3c%3Fsi%3DTr77gOKEq3PnA7-k%26controls%3D0%3Fautoplay%3D0%26color%3Dwhite%26cc_lang_pref%3Den%26cc_load_policy%3D0%26enablejsapi%3D1%26frameborder%3D0%26hl%3Den%26rel%3D0%26origin%3Dhttps%253A%252F%252Fwww.meta.com

At Meta, our mission is simple: give people the power to build community and bring the world closer together. And at Reality Labs, we build tools that help people feel connected anytime, anywhere. That’s why we’re working to build the next computing platform that puts people at the center so they can be more present, connected, and empowered in the world.

Ray-Ban Meta glasses have demonstrated the power of giving people hands-free access to key parts of their digital lives from their physical ones. We can talk to a smart AI assistant, connect with friends, and capture the moments that matter—all without ever having to pull out a phone. These stylish glasses fit seamlessly into our everyday lives, and people absolutely love them.

Yet while Ray-Ban Meta opened up an entirely new category of display-less glasses super-charged by AI, the XR industry has long dreamt of true AR glasses—a product that combines the benefits of a large holographic display and personalized AI assistance in a comfortable, stylish, all-day wearable form factor.

And today, we’ve pushed that dream closer to reality with the unveiling of Orion, which we believe is the most advanced pair of AR glasses ever made. In fact, it might be the most challenging consumer electronics device produced since the smartphone. Orion is the result of breakthrough inventions in virtually every field of modern computing—building on the work we’ve been doing at Reality Labs for the last decade. It’s packed with entirely new technologies, including the most advanced AR display ever assembled and custom silicon that enables powerful AR experiences to run on a pair of glasses using a fraction of the power and weight of an MR headset.

Orion’s input system seamlessly combines voice, eye gaze, and hand tracking with an EMG wristband that lets you swipe, click, and scroll while keeping your arm resting comfortably by your side, letting you stay present in the world and with the people around you as you interact with rich digital content.

Beginning today at Connect and continuing throughout the year, we’re opening up access to our Orion product prototype for Meta employees and select, external audiences so our development team can learn, iterate, and build towards our consumer AR glasses product line, which we plan to begin shipping in the near future.

Why AR Glasses?

There are three primary reasons why AR glasses are key to unlocking the next great leap in human-oriented computing.

  1. They enable digital experiences that are unconstrained by the limits of a smartphone screen. With large holographic displays, you can use the physical world as your canvas, placing 2D and 3D content and experiences anywhere you want.
  2. They seamlessly integrate contextual AI that can sense and understand the world around you in order to anticipate and proactively address your needs.
  3. They’re lightweight and great for both indoor and outdoor use—and they let people see each other’s real face, real eyes, and real expressions.

That’s the north star our industry has been building towards: a product combining the convenience and immediacy of wearables with a large display, high-bandwidth input, and contextualized AI in a form factor that people feel comfortable wearing in their daily lives.

Compact Form Factor, Compounded Challenges

For years, we’ve faced a false choice—either virtual and mixed reality headsets that enable deep, immersive experiences in a bulky form factor or glasses that are ideal for all-day use but don’t offer rich visual apps and experiences given the lack of a large display and corresponding compute power.

But we want it all, without compromises. For years, we’ve been hard at work to take the incredible spatial experiences afforded by VR and MR headsets and miniaturizing the technology necessary to deliver those experiences into a pair of lightweight, stylish glasses. Nailing the form factor, delivering holographic displays, developing compelling AR experiences, and creating new human-computer interaction (HCI) paradigms—and doing it all in one cohesive product—is one of the most difficult challenges our industry has ever faced. It was so challenging that we thought we had less than a 10 percent chance of pulling it off successfully.

Until now.

A Groundbreaking Display in an Unparalleled Form Factor

At approximately 70 degrees, Orion has the largest field of view in the smallest AR glasses form factor to date. That field of view unlocks truly immersive use cases for Orion, from multitasking windows and big-screen entertainment to lifesize holograms of people—all digital content that can seamlessly blend with your view of the physical world.

For Orion, field of view was our holy grail. We were bumping up against the laws of physics and had to bend beams of light in ways they don’t naturally bend—and we had to do it in a power envelope measured in milliwatts.

Instead of glass, we made the lenses from silicon carbide—a novel application for AR glasses. Silicon carbide is incredibly lightweight, it doesn’t result in optical artifacts or stray light, and it has a high refractive index—all optical properties that are key to a large field of view. The waveguides themselves have really intricate and complex nano-scale 3D structures to diffract or spread light in the ways necessary to achieve this field of view. And the projectors are uLEDs—a new type of display technology that’s super small and extremely power efficient.

Orion is unmistakably a pair of glasses in both look and feel—complete with transparent lenses. Unlike MR headsets or other AR glasses today, you can still see each other’s real eyes and expressions, so you can be present and share the experience with the people around you. Dozens of innovations were required to get the industrial design down to a contemporary glasses form factor that you’d be comfortable wearing every day. Orion is a feat of miniaturization—the components are packed down to a fraction of a millimeter. And we managed to embed seven tiny cameras and sensors in the frame rims.

We had to maintain optical precision at one-tenth the thickness of a strand of human hair. And the system can detect tiny amounts of movement—like the frames expanding or contracting in various room temperatures—and then digitally correct for the necessary optical alignment, all within milliseconds. We made the frames out of magnesium—the same material used in F1 race cars and spacecraft—because it’s lightweight yet rigid, so it keeps the optical elements in alignment and efficiently conducts away heat.
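
To get a feel for the scale of such a correction, here’s a back-of-the-envelope sketch. The numbers are invented for illustration (this is not Orion’s calibration model), using magnesium’s linear thermal expansion coefficient of roughly 26 × 10⁻⁶ per kelvin:

```python
# Illustrative only: if a magnesium frame expands with temperature, the
# optical alignment drifts by microns; a renderer can compensate by
# shifting the displayed image the opposite way, measured in pixels.
# Frame size and pixel density below are assumptions, not Orion's specs.

MAG_EXPANSION_PER_K = 26e-6   # magnesium linear expansion coefficient, ~1/K

def pixel_correction(baseline_mm, delta_t_k, pixels_per_mm):
    """Pixel offset that cancels the drift caused by a frame of
    `baseline_mm` expanding over a temperature change of `delta_t_k`."""
    drift_mm = baseline_mm * MAG_EXPANSION_PER_K * delta_t_k
    return -drift_mm * pixels_per_mm   # shift the image against the drift

# A 140 mm frame warming by 10 K drifts ~36 µm; at 50 px/mm that's ~1.8 px.
offset = pixel_correction(baseline_mm=140, delta_t_k=10, pixels_per_mm=50)
```

Even these tiny, sub-hair-width drifts translate into whole pixels of misalignment, which is why the correction has to run continuously and within milliseconds.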

Heating and Cooling

Once we cracked the display (no pun intended) and overcame the physics problems, we had to navigate the challenge of really powerful compute alongside really low power consumption and the need for heat dissipation. Unlike today’s MR headsets, you can’t stuff a fan into a pair of glasses. So we had to get creative. A lot of the materials used to cool Orion are similar to those used by NASA to cool satellites in outer space.

We built highly specialized custom silicon that’s extremely power efficient and optimized for our AI, machine perception, and graphics algorithms. We built multiple custom chips and dozens of highly custom silicon IP blocks inside those chips. That lets us take the algorithms necessary for hand and eye tracking as well as simultaneous localization and mapping (SLAM) technology that normally takes hundreds of milliwatts of power—and thus generates a corresponding amount of heat—and shrink it down to just a few dozen milliwatts.

And yes, custom silicon continues to play a critical role in product development at Reality Labs, despite what you may have read elsewhere. 😎

Effortless EMG

Every new computing platform brings with it a paradigm shift in the way we interact with our devices. The invention of the mouse helped pave the way for the graphical user interfaces (GUIs) that dominate our world today, and smartphones didn’t start to really gain traction until the advent of the touch screen. And the same rule holds true for wearables.

We’ve spoken about our work with electromyography, or EMG, for years, driven by our belief that input for AR glasses needs to be fast, convenient, reliable, subtle, and socially acceptable. And now, that work is ready for prime time.

Orion’s input and interaction system seamlessly combines voice, eye gaze, and hand tracking with an EMG wristband that lets you swipe, click, and scroll with ease.

It works—and feels—like magic. Imagine taking a photo on your morning jog with a simple tap of your fingertips or navigating menus with barely perceptible movements of your hands. Our wristband combines a high-performance textile with embedded EMG sensors to detect the electrical signals generated by even the tiniest muscle movements. An on-device ML processor then interprets those EMG signals to produce input events that are transmitted wirelessly to the glasses. The system adapts to you, so it gets better and more capable at recognizing the most subtle gestures over time. And today, we’re sharing more about our support for external research focused on expanding the equity and accessibility potential of EMG wristbands.
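
A minimal, hypothetical sketch of that pipeline might look like the following. Real systems use learned models; the RMS-envelope feature, thresholds, and window sizes here are invented for illustration:

```python
import math

# Hedged sketch of an EMG input pipeline: raw samples -> moving RMS
# envelope -> rising-edge detector that emits discrete "click" events.
# Not Meta's wristband code; all parameters are illustrative.

def rms_envelope(samples, window=8):
    """Moving root-mean-square of raw EMG samples."""
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        out.append(math.sqrt(sum(x * x for x in chunk) / len(chunk)))
    return out

def classify(envelope, pinch_thresh=0.5):
    """Emit a 'click' event on each rising edge above the pinch threshold."""
    events, above = [], False
    for i, v in enumerate(envelope):
        if v > pinch_thresh and not above:
            events.append(("click", i))
        above = v > pinch_thresh
    return events
```

In a real system the event stream, rather than the raw signal, is what gets transmitted wirelessly to the glasses, which keeps the radio link low-bandwidth and low-latency.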

Meet the Wireless Compute Puck

True AR glasses must be wireless, and they must also be small. So we built a wireless compute puck for Orion. It takes some of the load off of the glasses, enabling longer battery life and a better form factor while maintaining low latency.

The glasses run all of the hand tracking, eye tracking, SLAM, and specialized AR world locking graphics algorithms while the app logic runs on the puck to keep the glasses as lightweight and compact as possible.
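
One way to picture that split (a hypothetical sketch; all names here are invented) is as a per-frame round trip: the glasses send a compact tracking message, and the puck returns updated app state plus a draw list for the glasses to world-lock and display:

```python
# Hypothetical sketch of the glasses/puck compute split described above.
# Perception stays on the glasses; app logic runs on the puck; only small
# messages cross the wireless link each frame. Names are invented.

def glasses_frame(sensor_data):
    """On-glasses work: tracking produces a compact pose message
    (stub values stand in for real tracking output)."""
    return {"head_pose": sensor_data["imu"], "hands": sensor_data["cams"]}

def puck_frame(tracking_msg, app_state):
    """On-puck work: advance app logic and return a draw list for the
    glasses to world-lock and render."""
    app_state["tick"] += 1
    anchor = tracking_msg["head_pose"]   # place UI relative to the head
    return app_state, [{"panel": "messages", "anchor": anchor}]

# One simulated frame of the glasses<->puck round trip:
state = {"tick": 0}
msg = glasses_frame({"imu": (0, 1.6, 0), "cams": "hand-blobs"})
state, draw_list = puck_frame(msg, state)
```

Keeping the messages small is what makes the wireless link viable at high frame rates: the heavy pixel work and perception stay local to each device.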

The puck has dual processors, including one custom designed right here at Meta, and it provides the compute power necessary for low-latency graphics rendering, AI, and some additional machine perception.

And since it’s small and sleek, you can comfortably toss the puck in a bag or your pocket and go about your business—no strings attached.

AR Experiences

Of course, as with any piece of hardware, it’s only as good as the things you can do with it. And while it’s still early days, the experiences afforded by Orion are an exciting glimpse of what’s to come.

https://youtube.com/watch?v=HkdSv3QPhNw%3F%26controls%3D0%3Fautoplay%3D0%26color%3Dwhite%26cc_lang_pref%3Den%26cc_load_policy%3D0%26enablejsapi%3D1%26frameborder%3D0%26hl%3Den%26rel%3D0%26origin%3Dhttps%253A%252F%252Fwww.meta.com

We’ve got our smart assistant, Meta AI, running on Orion. It understands what you’re looking at in the physical world and can help you with useful visualizations. Orion uses the same Llama model that powers AI experiences running on Ray-Ban Meta smart glasses today, plus custom research models to demonstrate potential use cases for future wearables development.

You can take a hands-free video call on the go to catch up with friends and family in real time, and you can stay connected on WhatsApp and Messenger to view and send messages. No need to pull out your phone, unlock it, find the right app, and let your friend know you’re running late for dinner—you can do it all through your glasses.

https://youtube.com/watch?v=el7lUVvu8Bo%3F%26controls%3D0%3Fautoplay%3D0%26color%3Dwhite%26cc_lang_pref%3Den%26cc_load_policy%3D0%26enablejsapi%3D1%26frameborder%3D0%26hl%3Den%26rel%3D0%26origin%3Dhttps%253A%252F%252Fwww.meta.com

You can play shared AR games with family on the other side of the country or with your friend on the other side of the couch. And Orion’s large display lets you multitask with multiple windows to get stuff done without having to lug around your laptop.

The experiences available on Orion today will help chart the roadmap for our consumer AR glasses line in the future. Our teams will continue to iterate and build new immersive social experiences, alongside our developer partners, and we can’t wait to share what comes next.

A Purposeful Product Prototype

While Orion won’t make its way into the hands of consumers, make no mistake: This is not a research prototype. It’s the most polished product prototype we’ve ever developed—and it’s truly representative of something that could ship to consumers. Rather than rushing to put it on shelves, we decided to focus on internal development first, which means we can keep building quickly and continue to push the boundaries of the technology and experiences.

And that means we’ll arrive at an even better consumer product faster.

What Comes Next

Two major obstacles have long stood in the way of mainstream consumer-grade AR glasses: the technological breakthroughs necessary to deliver a large display in a compact glasses form factor and the necessity of useful and compelling AR experiences that can run on those glasses. Orion is a huge milestone, delivering true AR experiences running on reasonably stylish hardware for the very first time.

Now that we’ve shared Orion with the world, we’re focused on a few things:

  • Tuning the AR display quality to make the visuals even sharper
  • Optimizing wherever we can to make the form factor even smaller
  • Building at scale to make them more affordable

In the next few years, you can expect to see new devices from us that build on our R&D efforts. And a number of Orion’s innovations have been extended to our current consumer products today as well as our future product roadmap. We’ve optimized some of our spatial perception algorithms, which are running on both Meta Quest 3S and Orion. While the eye gaze and subtle gestural input system was originally designed for Orion, we plan to use it in future products. And we’re exploring the use of EMG wristbands across future consumer products as well.

Orion isn’t just a window into the future—it’s a look at the very real possibilities within reach today. We built it in our pursuit of what we do best: helping people connect. From Ray-Ban Meta glasses to Orion, we’ve seen the good that can come from letting people stay more present and empowered in the physical world, even while tapping into all the added richness the digital world has to offer.

We believe you shouldn’t have to choose between the two. And with the next computing platform, you won’t have to.



EgoMimic: Georgia Tech PhD student uses Project Aria Research Glasses to help train humanoid robots

Today, we’re highlighting new research from Georgia Tech that helps train robots to perform basic everyday tasks using egocentric recordings from wearers of Meta’s Project Aria research glasses. Check out the video below, read the full story, or apply for your own Project Aria Research Kit.

Imagine having help completing everyday tasks in your home such as doing the laundry, washing dishes, and making repairs. We already use tools to help with these tasks, like washing machines, dishwashers, and electric drills. But what if you could have an even more powerful and flexible tool in the form of a humanoid robot that could learn from you and accelerate any number of physical projects on your to-do list?

Even if you had the available hardware system, teaching a robot to do everyday tasks can only be achieved through a slow and clunky data collection method called robot teleoperation. Until now. By using the Project Aria Research Kit, Professor Danfei Xu and the Robotic Learning and Reasoning Lab at Georgia Tech use the egocentric sensors on Aria glasses to create what they call “human data” for tasks that they want a humanoid robot to replicate. They use human data to dramatically reduce the amount of robot teleoperation data needed to train a robot’s policy—a breakthrough that could some day make humanoid robots capable of learning any number of tasks a human could demonstrate.

Kareer teleoperates the robot to capture co-training data for EgoMimic. Teleoperation is difficult to scale and requires significant human effort.

“Traditionally, collecting data for robotics means creating demonstration data,” says Simar Kareer, a PhD student in Georgia Tech’s School of Interactive Computing. “You operate the robot’s joints with a controller to move it and achieve the task you want, and you do this hundreds of times while recording sensor data, then train your models. This is slow and difficult. The only way to break that cycle is to detach the data collection from the robot itself.”

Today, robot policy models are trained with large amounts of targeted demonstration data specific to each narrow task at a high cost. Kareer hypothesizes that passively collected data from many researchers, like the data captured by Aria glasses, could instead be used to enable data creation for a much broader set of tasks to create more generally useful robots in the future.

Inspired by Project Aria and Ego-Exo4D, which includes a massive egocentric dataset of over 3K hours of video recordings of daily-life activities, Kareer developed EgoMimic, a new algorithmic framework that utilizes human data and robot data for humanoid robot development.

“When I looked at Ego4D, I saw a dataset that’s the same as all the large robot datasets we’re trying to collect, except it’s with humans,” Kareer explains. “You just wear a pair of glasses, and you go do things. It doesn’t need to come from the robot. It should come from something more scalable and passively generated, which is us.” In Kareer’s research, Aria glasses were used to create human data for co-training the EgoMimic framework.
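
The co-training idea can be sketched in a few lines. This is an illustration rather than the EgoMimic code, and the mixing ratio is an invented hyperparameter: each training batch mixes plentiful, passively collected human samples with scarce robot-teleoperation samples, so human data supplements rather than replaces robot data.

```python
import random

# Illustrative sketch of co-training batch construction, not EgoMimic's
# implementation: abundant human recordings are mixed with a small pool
# of robot teleoperation demos at a fixed (invented) ratio.

def mixed_batch(human_data, robot_data, batch_size=8, human_frac=0.75):
    """Sample a co-training batch: ~human_frac of it from human recordings,
    the rest from robot demonstrations."""
    n_human = round(batch_size * human_frac)
    batch = [("human", x) for x in random.choices(human_data, k=n_human)]
    batch += [("robot", x) for x in random.choices(robot_data, k=batch_size - n_human)]
    random.shuffle(batch)
    return batch

human = [f"aria_clip_{i}" for i in range(1000)]   # plentiful, passively collected
robot = [f"teleop_demo_{i}" for i in range(20)]   # expensive, scarce
batch = mixed_batch(human, robot)
```

The payoff described in the article, strong performance from just 90 minutes of Aria recordings, comes from exactly this asymmetry: the scarce robot data grounds the policy in the robot’s embodiment while the human data provides task diversity.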

Kareer creates co-training human data by recording with Aria glasses while folding a t-shirt.

Aria glasses aren’t just used for human data collection in Georgia Tech’s research. They’re also used as an integral component of the robot’s real-time operation setup. Aria glasses are mounted to their humanoid robot platform just like a pair of eyes and serve as an integrated sensor package that enables the robot to perceive its environment in real time. The Aria Client SDK is utilized to stream Aria’s sensor data directly into the robot’s policy, running on an attached PC, which in turn controls the robot’s actuation. Using Aria glasses for both the data collection and the real-time perception pipeline minimizes the domain gap between the human demonstrator and the robot, paving the way for scaled human data generation for future robotics task training.

Aria glasses mounted to the top of the robot provide the system with sensor data that allows the robot to perceive and interact with the space.

Thanks to EgoMimic, Kareer achieved a 400% increase in his robot’s performance across various tasks vs previous methods with just 90 minutes of Aria recordings. The robot was also able to successfully perform these tasks in previously unseen environments.

In the future, humanoid robots could be trained at scale using egocentric data in order to perform a variety of tasks in the same way humans do.

“We look at Aria as an investment in the research community,” says James Fort, a Reality Labs Research Product Manager at Meta. “The more that the egocentric research community standardizes, the more researchers will be able to collaborate. It’s really through scaling with the community like this that we can start to solve bigger problems around how things are going to work in the future.”

Kareer will present his paper on EgoMimic at the 2025 IEEE International Conference on Robotics and Automation (ICRA) in Atlanta.


Introducing Aria Gen 2: Unlocking New Research in Machine Perception, Contextual AI, Robotics, and More

Since its launch in 2020, Project Aria has enabled researchers across the world to advance the state of the art in machine perception and AI, through access to cutting-edge research hardware and open-source datasets, models, and tooling. Today, we’re excited to announce the next step in this journey: the introduction of Aria Gen 2 glasses. This next generation of hardware will unlock new possibilities across a wide range of research areas including machine perception, egocentric and contextual AI, and robotics.


For researchers looking to explore how AI systems can better understand the world from a human perspective, Aria Gen 2 glasses add a new set of capabilities to the Aria platform. They include a number of advances not found on any other device available today, and access to these breakthrough technologies will enable researchers to push the boundaries of what’s possible.

Compared to Aria Gen 1, Aria Gen 2’s unique value proposition includes:

  • State-of-the-art sensor suite: The upgraded sensor suite features an RGB camera, 6DOF SLAM cameras, eye tracking cameras, spatial microphones, IMUs, barometer, magnetometer, and GNSS. Compared to its predecessor, Aria Gen 1, the new generation introduces two innovative sensors embedded in the nosepad: a PPG sensor for measuring heart rate and a contact microphone to distinguish the wearer’s voice from that of bystanders.
  • Ultra low-power and on-device machine perception: SLAM, eye tracking, hand tracking, and speech recognition are all processed on-device using Meta’s custom silicon.
  • All-day usability: Aria Gen 2 glasses are capable of six to eight hours of continuous use, weigh about 75 grams, and have foldable arms for easy portability.
  • Interaction through audio: Users get audio feedback via best-in-class open-ear force-canceling speakers, enabling user-in-the-loop system prototyping.

Our decade-long journey to create the next computing platform has led to the development of these critical technologies. At Meta, teams at Reality Labs Research and the FAIR AI lab will use them to advance our long-term research vision. Making them available to academic and commercial research labs through Project Aria will further advance open research and public understanding of a key set of technologies that we believe will help shape the future of computing and AI.

The open research enabled by Project Aria since 2020 has already led to important work, including the creation of open-source tools in wide use across academia and industry. The Ego-Exo4D dataset, collected using the first generation of Aria glasses, has become a foundational tool across modern computer vision and the growing field of robotics. Researchers at Georgia Tech recently showed how the Aria Research Kit can help humanoid robots learn to assist people in the home, while teams at BMW used it to explore how to integrate augmented and virtual reality systems into smart vehicles.

And Aria is also enabling the development of new technologies for accessibility. The first-generation Aria glasses were utilized by Carnegie Mellon University in their NavCog project, which aimed to build technologies to assist blind and low-vision individuals with indoor navigation. Building on this foundation, the Aria Gen 2 glasses are now being leveraged by Envision, a company dedicated to creating solutions for people who are blind or have low vision. Envision is exploring the integration of its Ally AI assistant and spatial audio using the latest Aria Gen 2 glasses to enhance indoor navigation and accessibility experiences.


Envision used the on-device SLAM capabilities of Aria Gen 2, along with spatial audio delivered through the onboard speakers, to help blind and low-vision individuals seamlessly navigate indoor environments. This innovative use of the technology, still in the exploratory research phase, exemplifies how researchers can leverage Aria Gen 2 glasses to prototype AI experiences based on egocentric observations. The advanced sensors and on-device machine perception capabilities, including SLAM, eye tracking, hand tracking, and audio interaction, also make them ideal for data collection in research and robotics applications.
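
As a rough illustration of how a SLAM pose could drive such a navigation cue (an exploratory sketch, not Envision’s implementation), the bearing from the wearer to the next waypoint can be mapped to a simple stereo pan:

```python
import math

# Hypothetical sketch: turn a SLAM position/heading and the next waypoint
# into a stereo panning cue, so the wearer can follow the sound toward
# the target. The linear pan law and 90-degree scaling are invented.

def audio_cue(position, heading_rad, waypoint):
    """Return (azimuth_deg, left_gain, right_gain) for the waypoint
    relative to the wearer. Positive azimuth means target is to the right."""
    dx, dz = waypoint[0] - position[0], waypoint[1] - position[1]
    target = math.atan2(dx, dz)                # bearing in the world frame
    azimuth = math.degrees(target - heading_rad)
    azimuth = (azimuth + 180) % 360 - 180      # wrap into [-180, 180)
    pan = max(-1.0, min(1.0, azimuth / 90))    # simple linear pan law
    return azimuth, (1 - pan) / 2, (1 + pan) / 2

# Wearer at the origin facing +Z, waypoint 2 m ahead and 2 m to the right:
az, left, right = audio_cue((0.0, 0.0), 0.0, (2.0, 2.0))
```

A production system would use proper head-related transfer functions rather than a linear pan, but the pipeline shape is the same: pose in, spatialized cue out.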

Over the coming months, we’ll share more details about the timing of device availability to partners. Researchers interested in accessing Aria Gen 2 can sign up to receive updates. We’re excited to see how researchers will leverage Aria Gen 2 to pave the way for future innovations that will shape the next computing platform.


Ray-Ban Meta: Style and Technology in One Frame

Ray-Ban has been an iconic name in eyewear fashion for decades. But in collaboration with Meta (formerly Facebook), they’ve launched a product that pushes boundaries – the Ray-Ban Meta smart glasses. This fusion of classic design with modern technology offers a unique experience for both fashion enthusiasts and tech lovers.

At First Glance: Classic with a Twist

At first look, they seem like classic Ray-Bans – elegant black frames, premium craftsmanship, and a stylish eco-leather case. But take a closer look and you’ll realize they hide more than just style – they’re glasses with a built-in camera, microphone, speakers, and smartphone connectivity.

What Can Ray-Ban Meta Do?

  • Video and photo capture: A discreet camera on the frame lets you capture the world from your point of view – hands-free.
  • Music and calls: With built-in directional speakers and microphones, you can take calls or listen to music without earbuds.
  • Sync with the Meta View app: Everything you record syncs automatically to the app, ready to be shared on your socials.
  • Voice control: Thanks to voice assistant support, you can control your glasses with simple voice commands – ideal for driving, sports, or on-the-go use.

Who Are They For?

Ray-Ban Meta glasses are perfect for content creators, influencers, travelers, or anyone who wants to capture moments without pulling out their phone. They’re also great for business use – ideal for quick documentation, live streams, or mobile video calls.

Final Verdict: Smart Elegance Without Compromise

If you’re looking for eyewear that blends classic fashion with top-tier technology, Ray-Ban Meta is the clear choice. They’re not just a stylish accessory – they’re smart glasses that redefine how we see and capture the world around us.

Meta View Is Now the Meta AI App: What's New & What Stays the Same

A big update for Ray-Ban Meta glasses owners today: The Meta View app is now the Meta AI app on iOS and Android. Along with the new name, the AI glasses companion app is getting an upgrade, with new features that make your experience more fun, useful, and personal than ever. We're here to walk you through the changes, but rest assured that the core features you've come to know remain the same.

Glasses have emerged as the most exciting new hardware category of the AI era, with Ray-Ban Meta glasses leading the way in defining what's possible. We're excited to bring the same voice experiences that made our glasses so popular to more surfaces, like the Meta AI app.

Friendly & Familiar

While the Meta AI app has a new name and look, it's still the companion app for the Ray-Ban Meta glasses you know and love. The photos and videos you've captured with your glasses, your device settings, and the features you know from Meta View are all still available in the new app. We're also automatically migrating your existing media and device settings from Meta View to the Meta AI app, so you won't lose a thing. And just as we introduced in our last update, you can use the Meta AI app to pair and manage up to seven pairs of glasses.

An All-in-One Companion App

New Ray-Ban Meta glasses? The Meta AI app is your companion for setting up and managing your device. You can import, view, and share the photos and videos you captured with your AI glasses directly from the app. You'll also use the Meta AI app to configure your settings, update your software, and manage your privacy options. You can set up voice commands from the app so you can use your glasses hands-free, and even customize your camera settings.

By building Meta AI and everything it offers into the companion app, we're improving your Ray-Ban Meta glasses experience. You can get creative with photos from your glasses directly in the Meta AI app: after importing your photos, ask Meta AI in the app to add, remove, or change parts of the image.

Exploring a new city? Ask Meta AI to tell you more about a local landmark. You can access and manage saved memories to improve the personalized responses you get from Meta AI in the app and on your glasses over time. You can also start a conversation with Meta AI on your Ray-Ban Meta glasses, then open your history tab in the app to pick up where you left off.

New & Noteworthy Features in the App

The Meta AI app is designed to help you find inspiration and get the most out of your glasses.

The new Discover feed lets you see how other people are using AI to get things done. It's a great way to find new ideas: recycle a clever prompt, or remix it to make it your own. And as always, you're in control: your conversations with Meta AI aren't public unless you choose to share them to the feed.

While talking to AI with your voice isn't new, we're using Llama 4 in the Meta AI app to deliver responses that feel more personal, relevant, and conversational. The experience integrates with other features like image generation and editing, which can now all be done through a voice or text conversation with the AI assistant in the app.

We've also included a demo of a voice experience we're building, full-duplex speech technology, which you can toggle on and off to try in the Meta AI app. This demo doesn't have access to real-time information, so you may encounter technical issues or inconsistencies; we'll keep gathering feedback to improve it over time.

And to make multitasking easier, you can keep using Meta AI's voice features while doing other things on your phone, with a visible icon that lets you know when the microphone is in use.

Voice conversations in the Meta AI app, including the full-duplex demo, are available in the US, Canada, Australia, and New Zealand to start.* To learn more about how to manage your Meta AI app experience and switch between features, visit our Help Center.

Intelligence That's Personal to You

The Meta AI app uses Llama 4 to help you solve problems, move through your day more easily, and better understand the world around you. And with the ability to search across the web, it can help you get recommendations, dive deeper into a topic, and stay connected with your friends and family.

We're drawing on our decades of experience personalizing people's experiences on our platforms to make Meta AI more personal. You can tell Meta AI to remember certain details about you (like that you love travel and learning new languages), and it can pick up important details from context cues. Meta AI also delivers more relevant answers to your questions by drawing on information you've already chosen to share on Facebook and Instagram (if they're added to the same Accounts Center), such as your profile details and the content you like or engage with. Personalized responses are available starting today in the US and Canada.

Just Getting Started

This is the first version of the Meta AI app, and we plan to iterate on it and improve it over time. Meta AI is built to get to know you, so its answers are more helpful. It's easy to talk to, so interacting with it feels more seamless and natural. And it's more connected, so it can share things from the people and places you care about.

Try it today on iOS or Android, and learn more about the new and improved Meta AI.

*Voice conversations in New Zealand are available only in the app, not on Ray-Ban Meta glasses.

Fashion Forward: Amelia Gray Spotted in Ray-Ban Meta Glasses on the Met Gala Red Carpet

Haute couture took center stage last night on the first Monday in May. And you may have spotted some familiar fashion-forward tech on the proverbial red carpet. As part of her custom look, Amelia Gray sported a pair of Ray-Ban Meta glasses.

CREDIT: Getty Images Entertainment

The AI glasses—Shiny Black Skyler frames with Polarized Green sun lenses—were the perfect accessory to capture photos and videos throughout the day while also helping her get ready. From taking calls hands-free and listening to music or a podcast to getting answers from Meta AI on the fly, Ray-Ban Meta glasses make any pre-event routine anything but.

This isn’t the first time our AI glasses have gone high fashion. Earlier this year, Gray debuted our sold-out Ray-Ban Meta x Coperni Limited Edition frames on the runway at Paris Fashion Week. Ray-Ban Meta glasses are fashion’s most useful tech accessory, gaining traction among editors, stylists, and models from the pages of magazines to the runway—and now on the red carpet of fashion’s biggest night of the year.

“I’m still radiating from the night!” said Amelia Gray. “And to know that I will be able to relive so many of those special moments from my own perspective is truly surreal. It’s an entirely new way to experience and share these once-in-a-lifetime moments, and I couldn’t be more excited to be part of Ray-Ban Meta’s Met Gala debut. ”

Inside the event, Gray was joined by Charli XCX, Jeff Goldblum, Jenna Ortega, Kim Kardashian, Nicki Minaj, Rosé, Shaboozey, and Taraji P. Henson as guests at Instagram’s table along with Adam Mosseri and Eva Chen.

Photographer: Quil Lemons

And off the carpet, Instagram and Threads hosted a number of icons who are driving the cultural conversation forward at The Mark Hotel for our 5th annual Watch Party. More than 50 creators and pop culture and fashion commentators gathered to watch the red carpet arrivals and share real-time looks on Instagram and Threads.

CREDIT: Huy Luong

Styles may come and go, but fashion is forever. And the future of AI glasses is looking bright.

New Ray-Ban | Meta Smart Glasses Styles and Meta AI Updates

Takeaways

  • We’re expanding the Ray-Ban Meta smart glasses collection with new styles.
  • We’re adding video calling with WhatsApp and Messenger to share your view on a video call. 
  • We’re rolling out Meta AI with Vision, so you can ask your glasses about what you’re seeing and get helpful information — completely hands-free. 

Our second-generation smart glasses, in partnership with EssilorLuxottica, have been flying off the shelves — they’re selling out faster than we can make them. And just in time for sunglasses season, we’re expanding the Ray-Ban Meta smart glasses collection with new styles designed to fit more face shapes so you can find the perfect pair. We’re also adding new features, including updates to Meta AI, to make the glasses even more useful.

A Frame for Every Face

Looking for a vintage vibe? Our new Skyler frames feature a cat eye design inspired by an era of iconic jet-set style, designed to suit smaller faces. 

We’re also adding a new low bridge option for our Headliner frames. If your glasses tend to slide down your nose, sit too low on your face or press on your cheeks, this is the fit for you. 

There are hundreds of different custom frame and lens combinations on the Ray-Ban Remix platform, so you can mix and match to make the glasses your own on ray-ban.com. And our new styles are designed to be prescription lens compatible. Skyler and the new Headliner low bridge fit are available for pre-order now on meta.com and ray-ban.com. These new styles are available in 15 countries, including the US, Canada, Australia, and throughout Europe.

Photo of 6 different pairs of Ray-Ban Meta frames

We’re also introducing the first limited-edition Ray-Ban Meta smart glasses in an exclusive Scuderia Ferrari colorway for Miami 2024. Ray-Ban Meta for Scuderia Limited Edition brings together the legacy of Ferrari, timeless Ray-Ban design and cutting-edge tech from Meta, available April 24, 2024.

Share Your View on a Video Call

From a breathtaking vista on a hike to experiencing your kid’s first steps, there are some moments in life that are just meant to be shared. That’s why we’re adding the ability to share your view on a video call via WhatsApp and Messenger, completely hands-free. 

Image showing hands-free video calling on Ray-Ban Meta smart glasses - a father cooking with his child

At the grocery store and not sure which brand of kombucha to buy? Can’t tell if that pineapple is ripe? Now you can hop on a video call with your mom and get her advice based on what you see. Video calling on Ray-Ban Meta smart glasses is rolling out gradually, so if you don’t see the update right away, sit tight — it’s on the way!

Meta AI Makes Your Smart Glasses Even Smarter

From integrated audio to an ultra-wide 12 MP camera, Ray-Ban Meta smart glasses are jam-packed with tech. And in the US and Canada, you also get Meta AI — an intelligent assistant that helps you get things done, create and connect with the people and things you care about. Just say, “Hey Meta,” and ask away! You can control the glasses using voice commands, and thanks to Meta AI, even get access to real-time information.

We started testing a multimodal AI update in December, so you can ask your glasses about what you’re seeing, and they’ll give you smart, helpful answers or suggestions. That means you can do more with your glasses because now they can see what you see. Starting today, we’re rolling this functionality out to all Ray-Ban Meta smart glasses in the US and Canada in beta. 

Say you’re traveling and trying to read a menu in French. Your smart glasses can use their built-in camera and Meta AI to translate the text for you, giving you the info you need without having to pull out your phone or stare at a screen. 

Image showing a woman wearing Ray-Ban Meta smart glasses using Meta AI feature