
Frequently Asked Questions: Oakley | Meta

What are Oakley Meta glasses?

Oakley Meta is the next evolution of AI glasses. Oakley and Meta have joined forces to bring AI glasses to sports and everyday lifestyles, combining Oakley’s eye-catching style with Meta’s innovative wearable and AI technology. Built for life on the field, trail, or track, with everyday use cases, Oakley Meta is here to evolve sport and enhance performance.

What do Oakley Meta glasses do?

Designed for athletes and fans alike, Oakley Meta glasses combine innovative AI with bold, wearable design.

With Oakley Meta glasses, you can:

  • Capture high-quality video and photos hands-free with a built-in ultra-wide 12MP camera.
  • Listen to music, podcasts, and more through Bluetooth speakers seamlessly integrated into the frames
  • Make and take phone calls hands-free
  • Livestream your adventures, travels, or daily life
  • Use Meta AI for instant information and assistance—just say, “Hey Meta.”

How do Oakley Meta glasses compare to Ray-Ban Meta glasses?

Oakley Meta glasses are versatile smart glasses designed for athletes and for everyday wear on the field, trail, or track.

Compared to Ray-Ban Meta glasses, Oakley Meta glasses offer:

  • Photo capture with a 12MP camera
  • Higher-quality video capture in 3K resolution
  • Longer battery life, up to 8 hours of typical use
  • Oakley Prizm and Prizm Polarized lenses available for impact and UV protection

What styles are available in the Limited Edition?

The Limited Edition Oakley Meta HSTN celebrates 50 years of Oakley design, paired with Meta innovation. Featuring gold accents and 24K Prizm Polar lenses, it’s built for a new era of sport and style.

Are Oakley Meta glasses waterproof?

Oakley Meta glasses, along with the built-in Bluetooth speakers and camera, are water-resistant with an IPX4 rating—suitable for splashes, rain, and sweat. They are not designed to resist submersion or extended exposure to water or other liquids.

Keep the charging case away from liquids, as it is not water-resistant; exposure to liquids could damage the case or other property and cause injury.

Do Oakley Meta glasses have Meta AI?

Yes. Meta AI is built into every pair, helping athletes and everyday users get the most out of every moment. Just say ‘Hey Meta’ or tap the frames to get real-time answers, discover new activities, and stay connected.

What questions can I ask Meta AI?

Powered by Meta AI, you can use “Hey Meta” with your AI glasses to answer questions and get information in the moment. Learn more about your world by asking things like:

  • “Hey Meta, what are some good nearby hikes?”
  • “Hey Meta, when is high tide?”
  • “Hey Meta, is it going to snow tonight?”
  • “Hey Meta, play a song to go with my view.”
  • “Hey Meta, look at my fridge for smoothie ideas.”

Find out more about Meta AI.

Can I listen to music or podcasts on Oakley Meta Glasses?

Yes. Oakley Meta AI glasses come with discreet, open-ear Bluetooth speakers that do not block your hearing so you can retain your situational awareness and stay connected to the world around you. Connect your Bluetooth glasses to apps including Spotify, Apple Music, Amazon Music, iHeart, and Audible in the Meta AI app.

Say “Hey Meta” or tap the frames of the Bluetooth glasses to queue up your favorite hype music, mid-workout motivation, or victory song, hands-free.

Connected apps only available in select countries.

Do Oakley Meta glasses have a camera?

Yes. The ultra-wide 12MP camera built into the frames of Oakley Meta glasses takes high-quality photos and records videos in 3k for you to document the action from your unique point of view. The LED privacy light on your AI glasses lets people know when you’re taking a photo or capturing video to ensure others’ privacy is respected (if the LED is covered up, the camera will not function). Content is then synced to your smartphone via the Meta AI app.

Can I make phone calls with Oakley Meta glasses?

Yes. You can make and receive calls hands-free with crisp audio clarity.

Can I send texts with Oakley Meta glasses?

Yes. Send text, photos and videos, and voice messages on WhatsApp, Messenger, Instagram, and your phone with Oakley Meta glasses.

What do I need to use Oakley Meta glasses?

To operate Oakley Meta glasses, you need to connect your glasses with a smartphone and the Meta AI app (formerly known as the Meta View app).

Here’s the complete checklist:

  • Smartphone with a recently released operating system: Android 10 and above (with location services enabled) or iPhone 10 or above (running iOS 15.2 and above); review the complete list of compatible devices here
  • Wireless internet connection
  • USB-C charging cable and a plug (if charging from a power outlet and not directly from a USB port)
  • A valid Meta account
  • Meta AI app (downloadable from the Apple App Store or Google Play store)

Are the glasses compatible with iPhone and Android?

Yes, Oakley Meta glasses are compatible with:

  • iPhone 10 or above (running iOS 15.2 and above)
  • Android devices (running Android 10 and above)

Do I need internet to use the glasses?

While many features can be used offline, Meta AI functionality and features require internet connectivity.

What accessibility features are available?

Oakley Meta glasses support hands-free interaction for users with reduced vision, hearing, or mobility. Features include audio descriptions, hands-free call and message support, and access to the Be My Eyes volunteer network.

What is the battery life?

Your Oakley Meta glasses and charging case have different charge lifecycles.

HSTN glasses

  • Fully charged Oakley Meta HSTN glasses can last up to 5 hours of continuous audio playback, 8 hours of typical use and up to 19 hours on standby. This may vary with use and other factors
  • You can charge the AI glasses up to 80% in just 45 minutes
  • You can quickly charge the AI glasses to 50% in 22 minutes in the charging case

Charging case

  • A fully charged case provides up to 40 hours of charging for your Oakley Meta AI glasses. The number of hours varies according to how you’re using the features

Did Meta buy Oakley?

No, Meta did not buy Oakley. They are two separate, independent companies that collaborated to create the Oakley Meta glasses collection.

Where can I buy Oakley Meta glasses?

You can buy Oakley Meta glasses on Ivanzo.com.

Where can I find health and safety info?

You’ll find Oakley Meta health and safety information in your AI glasses’ Safety and Warranty document.

Where can I learn more about privacy and data?

Additional privacy or data information is provided in the Meta Help Center and our privacy page.

Where can I get more support or help?

For more information and customer care contact details, visit the Meta Help Center.

What is the recommended age for using the glasses?

Oakley Meta glasses are for ages 13 and up. You can find more information about the age of use by visiting the Safety and Warranty page.

What are the return and warranty policies?

Standard returns
Our standard return policy is 30 days from when your order ships. Once your Oakley Meta glasses are returned, they will be inspected and you will be refunded to your original payment method. Shipping costs are non-refundable.

If you purchased your Oakley Meta glasses on Meta.com, you can use our Find My Order page to identify your purchase and start a return.

If you purchased your AI glasses on Oakley.com, contact them to request a refund. Returns are subject to the original retailer’s return policy.

Warranty returns
If you believe your AI glasses are defective and the return window has passed, you can open a warranty claim with Oakley for repair or replacement. To open a claim, you will need the product serial number and proof of purchase. The warranty is valid for one or two years from the date of purchase, depending on the country of purchase.

What is open-ear audio?

Open-ear audio lets you hear your surroundings while still enjoying private audio through the glasses. The speakers are positioned just above your ears, directing sound toward you without completely blocking out ambient noise. It’s perfect for staying aware of your environment while listening on the go.

How do the speakers in the glasses work?

AI glasses connect to your phone over Bluetooth, letting you play music and audio content through the built-in directional speakers. You can also control playback and volume using the touchpad on the right temple or with voice commands.

Can other people hear the audio from my Meta glasses?

Audio from Oakley Meta glasses is directional and designed to be heard primarily by you. In quiet environments or at higher volumes, others nearby may hear faint sound, but it’s generally low-profile and not disruptive.


Introducing Oakley Meta Glasses: Amplifying Human Potential

Since 1975, Oakley has pioneered sport and culture through breakthrough technology and future-forward design. And now, Meta and Oakley are teaming up to deliver a brand-new category of Performance AI glasses.

Oakley Meta glasses are a new product line that will combine Oakley’s signature design DNA with Meta’s industry-leading technology to give you deeper insights into your physical capabilities and help you share your biggest wins—on and off the field. The line will launch in a new global campaign starring Team Oakley athletes: World Cup winner Kylian Mbappé and three-time Super Bowl MVP Patrick Mahomes.

“For 50 years, we’ve been pushing the boundaries of what is possible, obsessed with solving unsolved problems,” says Caio Amato, Global President of Oakley. “Together with Meta we are setting new bounds of reality—it is not only about pushing forward performance, but also about amplifying human potential as never before. And this is just the first chapter of a new era for sports.”

Glasses have emerged as the most exciting new hardware category of the AI era, hands-down (and hands-free). We’re proud to lead the market with Ray-Ban Meta glasses, which have sold millions of units since launch. And now, we’re expanding our partnership with EssilorLuxottica to build upon another iconic, global brand.

Let’s dig into what this super team has been cooking up.

Introducing Oakley Meta HSTN

Our first product for athletes and fans alike, Oakley Meta HSTN (pronounced HOW-stuhn) combines bold aesthetics with cutting-edge tech. Capture the action completely hands-free with the built-in camera and share your unique POV. Get pumped up with your favorite playlist, listen to podcasts, and more, thanks to powerful open-ear speakers seamlessly integrated into the frames. And with an IPX4 water resistance rating, you can push yourself to the outer limits of your potential.

“This marks a bold new chapter in our wearables journey. With Oakley’s iconic design legacy and Meta’s breakthroughs in AI and spatial computing, we’re setting a new standard for the industry,” notes Rocco Basilico, Chief Wearables Officer at EssilorLuxottica. “It’s part of a broader multi-brand, multi-technology strategy that reflects the scale of our ambition: to build a connected eyewear category that spans lifestyles, communities, and use cases. We’re combining design, utility, and emotion to deepen human connection and unlock new potential through the lens of every brand we touch. And there is far more to come.”

Oakley Meta HSTN is taking the game to the next level with features that represent the evolution of AI glasses:

  • More Battery Stamina: A fully charged pair of Oakley Meta HSTN glasses can last up to eight hours of typical use and up to 19 hours on standby. You can charge them up to 50% in just 20 minutes. The glasses also come with a charging case that can deliver up to 48 hours of charging on the go.
  • Higher Resolution Camera: Capture your activity and share your achievements in Ultra HD (3K) video.
  • Meta AI Meets Performance AI Glasses: With Meta AI, your personal AI assistant, built in, athletes can get more out of their Oakley Meta HSTNs right out of the box. For example, you can level up the competition in a whole new way when playing a round of golf. Need to know how the wind is going to affect your drive? Ask, “Hey Meta, how strong is the wind today?” and channel your inner Team Oakley Athlete J.R. Smith. Or, like Boo Johnson, record epic moments and post to Stories, hands-free, by saying, “Hey Meta, take a video.” Get answers to a range of questions, whether you’re improving your game or checking the surf conditions.

Design Details

Select pairs of Oakley Meta HSTN glasses come with Oakley® PRIZM™ Lens technology, one of the most advanced innovations in lens design. By decoding how the brain and eye process light, Oakley engineered a way to enhance vision and amplify contrast across changing light and weather conditions. This revolutionary technology fine-tunes the light spectrum, amplifying color while filtering out visual noise. Proprietary PRIZM™ dyes manipulate light at a molecular level, enhancing detail and transforming your surroundings into vivid, high-definition clarity. This allows subtle visual cues to come sharply into view, helping wearers see more, react quicker, and perform at their peak.

Of course, it’s good to have options. That’s why we’ve put together a lineup of six frame and lens color combos, all Rx-ready so you can pick the perfect pair:

  • Oakley Meta HSTN Warm Grey with PRIZM™ Ruby Lenses
  • Oakley Meta HSTN Black with PRIZM™ Polar Black Lenses
  • Oakley Meta HSTN Brown Smoke with PRIZM™ Polar Deep Water Lenses
  • Oakley Meta HSTN Black with Transitions® Amethyst Lenses
  • Oakley Meta HSTN Clear with Transitions® Grey Lenses
  • Oakley Meta HSTN Black with Clear Lenses

We’re kicking off with a Limited Edition Oakley Meta HSTN that honors 50 years of Oakley design, while charging into a new era of innovation, featuring gold accents and gold 24K PRIZM™ Polar lenses.

Product Availability

The Limited-Edition Oakley Meta HSTN will be available for preorder starting July 11 for $499 USD, with the rest of the collection starting at $399 USD dropping later this summer. Oakley Meta HSTN will be available in the US, Canada, UK, Ireland, France, Italy, Spain, Austria, Belgium, Australia, Germany, Sweden, Norway, Finland, and Denmark. We’re also working to bring Oakley Meta HSTN to Mexico, India, and the United Arab Emirates later this year.

Get a first glimpse of Oakley Meta HSTN in a global campaign featuring elite athletes from Team Oakley and beyond—from Gabriel Medina checking out the surf before carving lines to athlete J.R. Smith judging the wind on the golf course, Ishod Wair and Boo Johnson skating different spots and capturing it all, and Bicho Carrera operating on another altitude. The next evolution of AI glasses will also debut at marquee sporting events, starting with Fanatics Fest June 20 – 22, followed by UFC International Fight Week, June 25 – 27, with more to come this year.

Learn more about Oakley Meta HSTN and sign up to be among the first to know about availability here and at Oakley.com.

Oakley Meta glasses are the latest chapter of a multi-year partnership between Meta and EssilorLuxottica. Meta and Oakley’s work is rooted in years of joint development, ground-up engineering, and an understanding of what drives athletes. We can’t wait to see how they change the game.


Coming Soon! Limited Edition Oakley Meta HSTN Performance AI Glasses | Ivanzo.com

Game. Changer.

Preorder Oakley Meta HSTN Limited Edition on July 11 at 699 €. Full lineup arrives this summer, starting at 599 €.

Athletes have questions. Meta AI has answers.

Get hands-free access to answers, playlists and more with Oakley Meta HSTN.

JR Smith, Basketball World Champion and Collegiate Golfer

Bring home the highlights

Oakley Meta HSTN’s integrated camera lets you capture all the action as it happens, hands-free. Share your best shots to Facebook, go live on Instagram and connect with others in real time.

Never miss a beat

Built-in Bluetooth speakers let you enjoy your favorite music and podcasts right from your glasses. Stay in the zone for warmups, workouts or victory dances. Rock out with Spotify, Apple Music, Amazon Music and more.

Ishod Wair, Skateboard Icon

Stay close, no matter the distance

Connect like never before with Oakley Meta glasses. Send messages on WhatsApp, Messenger and more. Take a call right from your glasses. With open-ear speakers, you can stay engaged without losing touch with the world around you. Plus, video call and share your POV with the built-in camera.

Gabriel Medina, 3X World Surfing Champion

Oakley on the outside, Meta on the inside

Dive into the details. Tap the hotspots below to learn more about Oakley Meta HSTN’s features and technology.

You’re in control of your experience

Meta AI app on smartphone

Meta AI app

Manage your glasses and customize settings, import and share photos, and interact with your personalized Meta AI assistant on both your glasses and the app. Learn about new features and see how others are getting the most out of Meta AI through the feed.
The Meta AI app is required to support your glasses experience.

Close up of white Oakley Meta AI Glasses with Capture LED light featured

Capture courteously

The Capture LED light on the corner of Oakley Meta HSTN lights up to let others know when you’re capturing video, photos or going live. If you want to use the camera and the Capture LED light is covered or obscured, your glasses will notify you to clear it before you can begin.

Basketball player leaning against fence wearing Oakley Meta AI Glasses highlighting accessibility features

Accessibility

Oakley Meta glasses offer a range of helpful features for individuals with reduced vision, hearing, or mobility. Tasks like making audio or video calls, sending text messages and taking photos can all be done hands-free, just by using your voice. For those with reduced vision, Meta AI can provide detailed descriptions about the world around you or connect you with the Be My Eyes network, which offers assistance from sighted volunteers.


The Smartest Shades in Town: Oakley × Meta Glasses Are Coming 🌇🎧

Imagine this: You’re riding a city bike at sunset, music playing through your glasses, your hands firmly on the handlebars. You say “Hey Meta, take a photo,” and your glasses snap the perfect skyline shot — all without ever reaching for your phone.

On June 20, Meta and Oakley are set to release what could become the ultimate lifestyle-smart glasses combo — merging urban tech with outdoor spirit.

Not Just for Pros — Built for Everyday Explorers

While Oakley is known for performance eyewear worn by Olympians and X-Games champions, these new smart glasses seem to be for all of us — commuters, weekend hikers, rollerbladers, runners, and digital creators on the go.

Expected features include:

  • 📸 Hands-free camera for spontaneous moments
  • 🎧 Open-ear speakers for music and calls without blocking outside sounds
  • 🧠 Meta AI Assistant at your command
  • 🌐 Live streaming & instant sharing to Instagram, WhatsApp, Facebook

Whether you’re walking your dog or skating the boardwalk, everything you see — and how you see it — is now connected.

Urban-Ready Design, Sport-Grade Performance

Oakley brings the hardware muscle: light, curved lenses, anti-slip temples, breathable fit, and UV protection. Meta provides the brain: smart UI, gesture or voice control, and AI capabilities. Together? A pair of glasses that’s stylish enough for the café, strong enough for the trail.

You can expect a wrap-around look similar to Oakley’s “Sphaera” model — but with a slick, high-tech twist.


Use Cases You Didn’t Know You Needed

  • Record your morning commute POV style
  • Start a call with your team while walking in the park
  • Ask AI to translate a street sign while traveling
  • Control music during your jog with just your voice
  • Capture your workout form for later analysis

These glasses aren’t trying to replace your phone. They’re designed to free you from it — while keeping you connected.


Who Are They Really For?

These glasses might just hit the sweet spot for:

  • Urban creatives
  • Fitness-focused professionals
  • Cycling enthusiasts
  • Travel vloggers
  • Anyone tired of pulling out their phone every 30 seconds

It’s wearable tech that doesn’t ask you to change your habits — it adapts to them.


Countdown to the Future

Launching June 20, the Oakley × Meta smart glasses are shaping up to be more than just a gadget. They’re a statement: Technology should be wearable, natural, and invisible — until you need it.

So whether you’re headed to the city, the forest, or somewhere in between, you might just want to bring your new Oakleys with you.

Because your next adventure deserves more than just vision — it deserves intelligence.


Oakley × Meta: The Sports Smart Glasses Revolution Launches June 20 🎯

On June 20, 2025, Meta expands its wearable technology into the world of sports through a collaboration with Oakley, a brand synonymous with performance eyewear. After the massive success of the Ray-Ban Meta smart glasses, which became a cultural phenomenon among content creators and tech enthusiasts, Meta aims to conquer the active-lifestyle space with a new generation of smart sports glasses.

What We Know So Far

  • Launch date: June 20
  • Target audience: Athletes, cyclists, runners, skaters, outdoor adventurers
  • Product line: Likely based on Oakley’s Sphaera model, redesigned and extended with smart features, including a centrally mounted front camera

Expected Features and Design

Performance Meets Intelligence

We expect a rugged yet aerodynamic frame, true to Oakley’s sporting heritage but powered from within by Meta’s intelligent platform. Expected hardware includes a 12MP camera, directional microphones, open-ear speakers built into the temples, and support for “Hey Meta” voice commands.

Camera Placement

Unlike the Ray-Ban Meta glasses with their side-mounted cameras, the Oakley model may feature a center-mounted lens, ideal for POV video during high-speed activities such as cycling or trail running.

Battery and Durability

Given the sports focus, we expect longer battery life, IP-rated weather resistance, and a more durable frame. Pricing is expected to land around $360, somewhat above the Ray-Ban Meta’s $299.

Possible Advanced Features

  • Live POV streaming to Instagram or Facebook
  • AI enhancements such as real-time translation, QR code scanning, or workout data overlays
  • An LED privacy indicator to signal when recording is active

Expected Look and Feel

Picture a futuristic sport-shield design: wide, curved lenses for full coverage, a sturdy nose bridge, padded temples, and a visible camera module. Matte or metallic finishes will likely dominate, staying true to Oakley’s aggressive, functional styling.


Why It Matters

This hybrid of sports eyewear and smart technology could redefine wearable content creation. Imagine GoPro-level video from your eye line, but without the helmet mount. Athletes could track their form, share highlights in real time, or receive audio cues mid-performance. It isn’t just smart; it’s woven seamlessly into your run.


Final Thoughts

The new Oakley × Meta smart sports glasses are set to make a bold statement when they launch on June 20. If expectations hold, they will combine performance design with smart features in a way we haven’t seen before, built for people on the move and made for a hands-free digital future.


Beyond the Lenses: Oakley × Meta Are Set to Redefine Human Vision 🧠🕶️

When Meta collaborates with a brand like Oakley, it’s not just about smart glasses anymore — it’s about reinventing what it means to see and experience the world. On June 20, the tech world is bracing for what could become the most significant leap in wearable AR since the original iPhone moment.

This Isn’t Just About Style. It’s About Superpowers.

Oakley is synonymous with elite sports performance. Meta is synonymous with artificial intelligence and human-computer fusion. Now imagine:

  • Glasses that let you record your best downhill run in stunning POV.
  • Navigate a trail with real-time AR cues projected onto the lens.
  • Get instant stats about your surroundings — speed, temperature, even heart rate, via connected sensors.
  • Talk to your glasses and get instant responses from an AI assistant.

That’s not fashion. That’s augmentation.

Design: The Birth of the Sport-Tech Hybrid

Expect a sleek, wraparound frame built for speed, stability, and sweat. Think carbon or titanium-like textures. A discreet front camera sits flush in the middle, almost like a third eye. Inside: haptics, micro speakers, voice-activated UI, and gesture sensors.

These aren’t glasses for walking down Fifth Avenue — these are glasses for conquering mountains, trails, and marathons.

What Will They Actually Do?

While Meta hasn’t officially confirmed specs, leaks and patterns suggest:

  • Voice-activated camera (12MP or better)
  • Live streaming directly to social platforms
  • Fitness integration with Meta Quest Move or similar trackers
  • AI-driven enhancements, like:
    • Auto-captioning your videos
    • Object and environment recognition
    • Real-time translation for travelers

And possibly: adaptive lens tinting depending on light conditions or environment.


Is This the Start of a New Category?

Ray-Ban Meta proved there’s demand. But Oakley × Meta might finally answer the question: Can smart glasses be essential for real-life performance, not just fun?

The answer might come not from Silicon Valley, but from your next morning ride, trail run, or hike. When your sunglasses help you train better, remember more, or stay safer — why would you ever take them off?


Final Word

The Oakley × Meta smart glasses, launching June 20, could mark a paradigm shift. Not just tech on your face — but intelligence at your command, through a design trusted by Olympic-level athletes.

It’s not science fiction anymore. It’s just… wearable.


Watch Chris Pratt & Hemsworth Use AI Glasses in First of Two Ray-Ban Meta Ad Spots

T-minus six days until a certain Sunday known for its parade of clever, cheeky, and often over-the-top commercials—oh, and a pretty big football game, too. If you’ve been following along, you know that Chris Pratt and Chris Hemsworth are starring in two ad spots for Ray-Ban Meta where they’ll sport our iconic AI glasses. And today, the wait is (partially) over as we revealed one of the 30-second spots on the TODAY show.

https://youtube.com/watch?v=1oBPHKpj_zA%3F%26controls%3D0%3Fautoplay%3D0%26color%3Dwhite%26cc_lang_pref%3Den%26cc_load_policy%3D0%26enablejsapi%3D1%26frameborder%3D0%26hl%3Den%26rel%3D0%26origin%3Dhttps%253A%252F%252Fwww.meta.com

While the Chrises have undeniable charisma and on-screen chemistry, we decided to up the ante with the one and only Kris Jenner. Directed by Matthew Vaughn, the end result is, if we might be so bold, a bit of a masterpiece—not unlike the $6.2 million banana at its center. 🍌

Be sure to tune in this Sunday, February 9, to catch both commercials and see who brings home this year’s ring—if you’re into that sort of thing. 🏈

Read on for more details and some helpful FAQs.

The Big Game is all about peak performance, pageantry, and some good, old-fashioned rivalry. What better way to introduce Ray-Ban Meta glasses to new audiences than by tapping into that spirit with Chris Hemsworth and Chris Pratt?

One half of the famed Hollywood Chrises, both Pratt and Hemsworth feature heavily in the Marvel Cinematic Universe, or MCU, where their characters enjoy a healthy rivalry of their own—whose ship is it, really?

Plot Twist

In the ad spot unveiled today, we open on Chris Pratt using his AI glasses to identify a piece of fine art—specifically, Comedian by Maurizio Cattelan. A set of three, Comedian is a series of bananas duct taped to a wall, one of which sold at auction in November 2024 for a cool $6.2 million.

Pratt continues exploring the space, which we might assume is a fine art gallery or museum, and is joined by Chris Hemsworth, who disparages the snack bar while munching on—you guessed it—a banana.

Following a perfectly timed double-take, Pratt—horrified—looks back to find a piece of duct tape hanging from the wall, sans banana.

After Pratt informs Hemsworth of the value of said produce, the pair begin rifling through a refrigerator in search of a replacement banana.

Famed reality TV star Kris Jenner appears, and we learn that the Chrises aren’t in a museum or gallery after all—they’re in her home.

She asks, “Who eats art?” The Chrises then (perhaps not-so-helpfully) explain to her how to trigger Meta AI with her own pair of Ray-Ban Meta glasses by saying, “Hey Meta.” At which point, she responds with, “Hey Meta, call my lawyer.”

Fun Facts

This isn’t Meta’s first time running an ad during the Big Game. We previously ran a spot for Meta Quest 2 back in 2022.

Often starring opposite each other in the Marvel Cinematic Universe, Chris Pratt and Chris Hemsworth both have great comedic timing, making them a natural fit for this ad spot’s light-hearted rivalry.

A media mogul with her own undeniable star power, Kris Jenner—who’s shaped the careers of her children, the Kardashians and Jenners—delivers an unexpected cameo to round out the ad spot.

Both ad spots are directed by Matthew Vaughn, known for his work on Argylle, X-Men: First Class, and the Kingsman franchise.

To celebrate the Big Game, we’re creating a limited number of two special edition styles of Ray-Ban Meta AI glasses (not available for purchase): Matte Black Wayfarers with gold mirrored lenses and Matte Black Wayfarers with lenses that feature the participating teams’ colors. The exclusive shades come with a custom laser-etched case.

Top 5 Ways to Up Your Game During the Big Game with Ray-Ban Meta

If you’re tuning into the Big Game this Sunday, February 9, Ray-Ban Meta glasses are the perfect sidekick to enhance your viewing experience and secure your superfan status. Not only can you use them to capture your friends’ reactions to every touchdown, interception, and controversial call (maybe the refs could use a prescription pair of their own AI glasses, amirite?)—all without fumbling with your phone—you can also get real-time stats and other game info without looking away from the action on screen.

Here are our top 5 tips for using Ray-Ban Meta glasses to make the most of the Big Game:

  1. Discover themed cocktails and small bites to take your watch party to the next level. Hey Meta, I’m hosting a party for the Chiefs. What’s a themed drink and snack I can make?
  2. Go deep with stats and learn more about the Eagles and Chiefs. Hey Meta, who coaches the Eagles? Hey Meta, who wears jersey #87 on the Chiefs?
  3. Find local sports bars and pubs where you can watch the Big Game or head to for an after party. Hey Meta, are there Philadelphia-themed sports bars near me?
  4. If you’re new to football, you can learn the ins and outs of the game to help you follow along. Hey Meta, what’s a two-point conversion? Hey Meta, tell me about special teams.
  5. Last but not least, check out a fun Easter egg rolling out today. Just put on your AI glasses and ask: Hey Meta, who eats art?

Frequently Asked Questions

What are Ray-Ban Meta glasses?

Our AI glasses combine iconic style and cutting-edge technology to let people stay present in the moment and connected to the people and things they care about most.

How many frame options are there?

Ray-Ban Meta glasses come in classic Wayfarer, cat-eye Skyler, and bold Headliner styles. With a variety of colorways and lens options available, you can customize them to get the look that’s right for you.

When will the ad spots air?

Both commercials will air during the Big Game on Sunday, February 9. Mark your calendars.

Is that banana a real piece of art?

The one Chris Hemsworth ate? Probably not. 😉

But yes! Comedian is a real work of art (three, actually) by Maurizio Cattelan.


Updated January 28, 2025:

We’re getting closer to the Big Game (gaining yards, you might say), and we’re taking another look at our upcoming Ray-Ban Meta glasses ad campaign starring Chris Pratt and Chris Hemsworth.

https://youtube.com/watch?v=t5dsvCskq0Y%3F%26controls%3D0%3Fautoplay%3D0%26color%3Dwhite%26cc_lang_pref%3Den%26cc_load_policy%3D0%26enablejsapi%3D1%26frameborder%3D0%26hl%3Den%26rel%3D0%26origin%3Dhttps%253A%252F%252Fwww.meta.com

In the ad spots, the charismatic duo—each sporting a stylish pair of AI glasses—will show off the tech, getting real-time information like text translation and fossil identifications on the fly.

https://youtube.com/watch?v=lOftIbY4fFE%3F%26controls%3D0%3Fautoplay%3D0%26color%3Dwhite%26cc_lang_pref%3Den%26cc_load_policy%3D0%26enablejsapi%3D1%26frameborder%3D0%26hl%3Den%26rel%3D0%26origin%3Dhttps%253A%252F%252Fwww.meta.com

You might be wondering what the setting is for these vignettes. High-end art gallery? Some sort of postmodern museum? The home of an iconic reality TV star?

Time will tell…


Originally published January 23, 2025:

What do Chris Pratt and Chris Hemsworth have in common (besides starring roles in a certain cinematic universe (and in our hearts))? They’re teaming up to star in a new ad campaign for Ray-Ban Meta glasses, with two spots airing during the Big Game on February 9.

https://youtube.com/watch?v=bplLA4bBy84%3F%26controls%3D0%3Fautoplay%3D0%26color%3Dwhite%26cc_lang_pref%3Den%26cc_load_policy%3D0%26enablejsapi%3D1%26frameborder%3D0%26hl%3Den%26rel%3D0%26origin%3Dhttps%253A%252F%252Fwww.meta.com

We’re playing things pretty close to the vest for now, but we can tell you that the creative spots are directed by Matthew Vaughn, known for his work on Argylle, X-Men: First Class, and the Kingsman franchise.

With iconic Ray-Ban style, cutting-edge tech, and Meta AI* built in, our AI glasses let you capture the moments that matter, take calls with friends and family, listen to your favorite playlists and podcasts, livestream from your point of view, and more—all hands-free.

https://youtube.com/watch?v=009WQaCtzfw%3F%26controls%3D0%3Fautoplay%3D0%26color%3Dwhite%26cc_lang_pref%3Den%26cc_load_policy%3D0%26enablejsapi%3D1%26frameborder%3D0%26hl%3Den%26rel%3D0%26origin%3Dhttps%253A%252F%252Fwww.meta.com

The talent. The teasers. The tech.

What does it all mean? Tune in during the world’s biggest American football game to find out.


From Zero to One: How Our Custom Silicon and Chips Are Revolutionizing Augmented Reality

In 2017, Reality Labs Chief Scientist Michael Abrash, backed by Meta Founder & CEO Mark Zuckerberg, established a new, secret team within what was then Oculus Research to build the foundation of the next computing platform. Their herculean task: Create a custom silicon solution that could support the unique demands of future augmented reality glasses—a technical feat that required reimagining every component for an all-day wearable AR glasses form factor that simply didn’t exist yet.

Front angle of Orion AR glasses prototype.

Compact Form Factor, Considerable Problem Space

Growing from just a few researchers to hundreds of people on the product side, the custom silicon team was built on the premise that AR glasses couldn’t rely on the silicon available in today’s smartphones. And it’s a premise borne out by the several custom chips inside Orion, our first true AR glasses product prototype.

“Building the ship as it sails out of the harbor—that was exactly what we were doing,” says Advanced Technologies Strategy Director Jeremy Snodgrass. “We had to scale up what was a modest team while building these chips simultaneously. It was fascinating to see the leadership bring new hires on board while developing a culture that values agility. Nothing was someone else’s problem. You’d pick up an oar and start rowing, even if it wasn’t exactly what you were hired to do. It was a very, very exciting time.”

“Almost everything about Orion was new to the world in a lot of ways,” agrees Display Architecture Technical Director Mike Yee. “Some of the ideas had been out there, but no one had taken on the research project to actually build an all-day wearable pair of AR glasses.”

The team had to deliver a compelling AR experience with as little power consumption as possible. The glasses form factor can only dissipate so much heat, and it can only accommodate so much battery capacity. As a result, the experiences it’s possible to deliver on the glasses are completely dependent upon the silicon. In other words, if you hold constant the thermal and battery capacities, the only way to deliver a given experience is to optimize the silicon.
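To make that constraint concrete, here is a rough back-of-the-envelope power budget. The battery capacity, voltage, and runtime figures below are purely illustrative assumptions, not Orion’s actual specifications:

```latex
% Illustrative numbers only -- hypothetical battery capacity, voltage, and runtime.
E = C \cdot V = 0.25\,\mathrm{Ah} \times 3.85\,\mathrm{V} \approx 0.96\,\mathrm{Wh},
\qquad
P_{\mathrm{avg}} = \frac{E}{t} = \frac{0.96\,\mathrm{Wh}}{2\,\mathrm{h}} \approx 0.48\,\mathrm{W}
```

Under assumptions like these, the display, compute, radios, and sensors all share an average budget of well under a watt, which is why the silicon is the main lever left to pull.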

“Designing a new architecture for Orion required the team to not only push existing technologies like wireless and displays aggressively but also to take risks on new technologies,” says Director of Product Management Neeraj Choubey. “For instance, the team developed a machine learning (ML) accelerator without a clear use case at the time, driven by the strong conviction that ML would become increasingly important in Meta’s products. On Orion, every ML accelerator is utilized, and in some cases, they’re oversubscribed, serving functions such as eye tracking and hand tracking. Similarly, the team developed custom compression protocols to reduce bandwidth and power consumption as data moved from the compute puck to the display. Developing custom silicon to achieve Orion’s form factor goals required both a tolerance for high ambiguity and meticulous attention to detail to deliver an incredibly complex system architecture.”

“The biggest challenge we had was delivering the 3D world-locked rendered graphics, along with spatialized audio that’s rendered such that it appears to be emanating from a virtual object,” notes Snodgrass. “We had to fit all these electronics into the thermal capacity and physical space, and then run it on the battery so that it doesn’t get too hot. And we had to do all that in an actual glasses form factor—not a big visor like you typically see in the category.”

“We knew how to deliver the necessary compute power to bring our vision for Orion to life, but we faced a daunting task: reducing power consumption by a factor of 100,” says Director of SoC Solutions Robert Shearer. “This required us to push the boundaries of silicon design, embracing methodologies from diverse corners of the industry—from IoT to high-performance computing—and inventing new approaches to bridge the gaps. Our industry partners thought we were crazy, and perhaps they weren’t entirely wrong. But that’s exactly what it took: a willingness to challenge conventional wisdom and rethink everything. Building a computer that seamlessly merges the virtual and physical worlds demands a profound understanding of context, far surpassing what existing compute platforms can offer. In essence, we’ve reinvented the way computers interact with humans, which meant reimagining how we build silicon from the ground up.”

Orion exterior components.

The Magic of MicroLEDs

There were moments when things fell behind schedule or a seemingly insurmountable technical challenge arose when it was hard to maintain momentum and morale. But the team was resilient, finding paths around obstacles—or simply knocking them down.

Take, for example, Orion’s display. The silicon team was responsible for the silicon in the display projector that sits in the glasses’ corners.

“For those projectors, there was an open question as to whether we could get the microLEDs in an array at high enough efficiency and brightness to deploy a wide field of view display,” Snodgrass says. “There was tremendous doubt that we could—that it was possible in the time frame we were considering—because this was very nascent technology.”

“We realized very early on that we had to rethink a lot of the paradigms of product development,” adds Yee. “The amount of light that you need to make a usable display is quite a bit brighter for AR glasses because, as a wearable display, you’re competing with the sun. So we need energy levels that rival that—or at least that’s the goal. We’re not quite there yet, but that’s a big piece of it. And that means you need light sources for a display that are capable of that and you need circuitry that can control that. And at the same time, you need to make it tiny.”

Custom silicon driving Orion’s µLEDs.

While microLEDs seemed the most suitable light source for the projectors, silicon helped unlock their potential.

“With displays, we talk about pixel pitches, which are the distances between the center points of adjacent pixels,” Yee explains. “For TVs, those distances are hundreds of microns. In your phone, there are many, many tens of microns. And we needed to get that down into the single digits. The only known semiconductor manufacturing that could get there was silicon.”
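As a rough illustration of those pitch regimes, the panel widths and resolutions below are generic, assumed values rather than actual product specifications:

```latex
% Illustrative examples only -- assumed panel widths and horizontal resolutions.
p = \frac{\text{active width}}{\text{horizontal pixels}}:\quad
\frac{1.21\,\mathrm{m}}{3840} \approx 315\,\mu\mathrm{m}\ \text{(55-inch 4K TV)},\quad
\frac{65\,\mathrm{mm}}{1170} \approx 56\,\mu\mathrm{m}\ \text{(6-inch phone)},\quad
\frac{5\,\mathrm{mm}}{1000} \approx 5\,\mu\mathrm{m}\ \text{(mm-scale projector panel)}
```

Those three regimes line up with the “hundreds of microns,” “tens of microns,” and “single digits” that Yee describes.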

Complicating the work was the fact that the back surface of the display had to be a piece of silicon, and no one in the world had been designing silicon for microLEDs.

“At the time, the research teams that were out there had all repurposed liquid crystal on silicon displays to put microLEDs on,” says Yee. “Nobody had ever designed a backplane for microLEDs before. And we faced a pretty unique challenge because it’s an optical component. It has to be flat. You can’t scratch it. It has to have all these characteristics because when you look through the waveguides, through the projectors, you’re literally looking at the top surface of the piece of silicon.”

The silicon team developed a complex series of test platforms for these microLED displays that involved coordinating closely with our global suppliers. The microLEDs have a global footprint, originating in one spot and then transferred to another site where they were put onto a wafer. The wafers were then shipped off to be cut into a particular shape, followed by a trip to the US to be bonded together with another wafer, then shipped back across the globe for the actual module to be built and tested. It was an enormously complicated process, and the silicon team developed test vehicles to prove out each step.

The team also needed to find a way to deliver power to microLED displays in the tiny volume of the glasses’ corners. Our analog team developed a custom power management chip that fit inside that volume.

“Power delivery is crucial for such small form factor wearable devices, where battery size is limited and space is at a premium,” notes Director of Analog and Mixed Signal Systems Jihong Ren. “Our customized Power Management IC solution leverages cutting-edge technologies to optimize power efficiency for our specific workload at the system level, all while fitting within the available space constraints. Achieving this optimal solution required close interdisciplinary collaboration with our mechanical, electrical, SoC, μLED, and thermal teams, ensuring a seamless integration of all components and maximizing overall performance.”

“That was an amazing feat of not just engineering but also organizational management: bringing a team together, working across time zones and with all these different vendors,” adds Snodgrass. “In some ways, managing all that organizationally was just as much of a challenge as meeting the technical specs.”

“Not only is the design custom, the entire fabrication process is customized,” adds Yee. “We’re fortunate to have some wonderful partners in the industry that helped make that happen. They see long-term potential in AR displays as a whole and certainly Meta’s vision for it. So they’ve been willing to partner with us to make those customizations and optimizations happen to enable that display.”

Orion AR glasses.

Iteration Meets Acceleration

There was a tight feedback loop between the silicon team and the brilliant minds in Reality Labs Research and XR Tech developing algorithms. The latter teams would provide these algorithms, which the former would translate into hardware, removing the overhead of general-purpose CPU-running software. That meant that the algorithms would run at lower power, but it also meant the silicon team was locked in. Once the algorithms were hardened, they could no longer make changes.

“Say XR Tech was developing a certain discipline algorithmically,” explains Silicon Accelerators Architecture and Algorithms Director Ohad Meitav. “They own the algorithmic stack and its performance. My team, in collaboration with them, would then decide how to accelerate the algorithm, how to harden parts of the algorithm, and how to actually put it into the hardware in a way that runs super efficiently. Then XR Tech would adapt their software stack to account for the hardware. It’s a very iterative process.”

Another success story is the silicon team’s collaboration with Reality Labs Research to come up with a novel reprojection algorithm.

“We needed the reprojection algorithm to support a variety of different distortions and corrections,” notes Silicon Architect Steve Clohset. “The algorithm RL-R developed that we ended up using is not used in general computing. And to this day, it’s proving to be a pretty powerful tool.”
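Meta hasn’t published the details of that reprojection algorithm, so the following is only a minimal sketch of the general technique: a conventional rotational reprojection (a late-stage warp) that re-warps an already rendered frame to a slightly newer head orientation using a homography. The resolution, field of view, and function names are illustrative assumptions, and Orion’s actual algorithm supports a much wider set of distortions and corrections, as noted above.

```python
import numpy as np

def intrinsics(width, height, fov_deg):
    """Pinhole camera matrix for a display with the given resolution and horizontal FOV."""
    f = (width / 2.0) / np.tan(np.radians(fov_deg) / 2.0)
    return np.array([[f, 0.0, width / 2.0],
                     [0.0, f, height / 2.0],
                     [0.0, 0.0, 1.0]])

def rotation_y(deg):
    """Rotation about the vertical axis, standing in for a small late head yaw."""
    c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def reproject(frame, K, R_delta):
    """Warp a rendered frame to a new head orientation.

    For a purely rotational pose change, the old and new image planes are
    related by the homography H = K @ R_delta @ inv(K). Each output pixel is
    looked up in the source frame via the inverse mapping (nearest-neighbour
    sampling, for brevity).
    """
    h, w = frame.shape[:2]
    H_inv = np.linalg.inv(K @ R_delta @ np.linalg.inv(K))
    ys, xs = np.mgrid[0:h, 0:w]
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    src = H_inv @ pts
    src /= src[2]                                    # back to pixel coordinates
    sx = np.clip(np.round(src[0]).astype(int), 0, w - 1)
    sy = np.clip(np.round(src[1]).astype(int), 0, h - 1)
    return frame[sy, sx].reshape(h, w, *frame.shape[2:])

if __name__ == "__main__":
    K = intrinsics(640, 480, fov_deg=70.0)            # illustrative resolution and FOV
    frame = np.random.rand(480, 640, 3)               # stand-in for a rendered eye buffer
    warped = reproject(frame, K, rotation_y(2.0))     # correct for 2 degrees of late yaw
    print(warped.shape)                               # (480, 640, 3)
```

The appeal of hardening a step like this, as described above, is that the per-pixel work is simple and regular, so moving it out of general-purpose software lets it run at much lower power.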

Once the algorithms were hardened and the hardware was optimized, the silicon bring-up team put the custom chips through their paces.

“Orion’s custom silicon chipset is packed with complexity,” says Senior Director of End-to-End System and Infrastructure Liping Guo. “Bringing-up and validating standalone chips and the interoperabilities across them within a short period of time is incredibly challenging. Luckily, we’re operating within a vertically integrated environment where Reality Labs owns the entire stack—from the silicon and low-level firmware to the operating system, software, and the upper-layer experiences. We took full advantage of this, working closely with our cross-functional partners, and shift-left our cross-stack integration at the silicon validation stage. Orion was our pilot run for this methodology—we built our shift-left muscles and created a strong foundation for Reality Labs to reap the full benefits of custom silicon in the future.”

And after bring-up, it was time to optimize for software.

“There’s an iterative process where you start with a wholly unoptimized software stack, just to get everything booted up and running,” says Snodgrass. “And then, one by one, you go through the subsystems and start to optimize the software for that specific hardware—including reducing the amount of memory that the software uses. The hardware could be beautifully designed, but you won’t achieve the theoretical power efficiency unless you invest as much or more time getting the software to take full advantage of the hardware. So that’s the story of Orion: hardware and software optimized to the hilt. Leave no picojoule or milliwatt behind.”

And while Orion may be a prototype, the work that went into it has considerable potential to influence Meta’s roadmap.

“We look at the silicon IPs that we’re building as platforms in the sense that these are valued IPs that we’ll improve upon from one generation or one product to another,” adds Meitav. “All of the computer vision and graphics algorithms aren’t just built for Orion. They’ll go on to inform future products.”

Our silicon work involves creating novel solutions while also working closely with partners. In fact, the silicon team’s influence extends beyond Orion to both Ray-Ban Meta glasses and Meta Quest headsets today—even though they both use third-party chips. The silicon team regularly shares their work with the mixed reality team, showing what’s possible in terms of power efficiency. The MR team then shares those findings with partners like Qualcomm to help inform future chip designs. And because the silicon team uses the same off-the-shelf digital signal processors (DSPs) as Ray-Ban Meta glasses, they’ve been able to share lessons learned and best practices when implementing and writing code for those DSPs to help improve the audio experiences available on our AI glasses.

And that knowledge sharing goes both ways: The MR team has given the silicon team insights on things like Asynchronous TimeWarp and Application SpaceWarp in production.

“What a person does in production is far more interesting than something that we could do with a prototype,” says Clohset. “We tried to integrate what they were doing with warpings as much as possible.”

Ambiguity < Ambition

Because Orion was truly zero to one, the teams involved had to deal with an inordinate amount of ambiguity by necessity.

“With Orion, I can’t overstate how complicated the ambiguity made things,” says Clohset. “When you make a computer, for example, you generally have a good idea of what the display will be. But we didn’t know what the waveguide would end up being, so we had to go sample different waveguides and come up with a mechanism that could handle the worst-case scenario because we didn’t know where things would land. One optimization here would convolve with all these other choices, and you would end up with this matrix of all these different things that you had to support and try to validate because you didn’t know where the product would land in six months.”

It’s important to note that Orion isn’t just a pair of AR glasses—it’s a three-part constellation of hardware. A lot of the processing happens on the compute puck, which necessitates a strong connection between it and the glasses. Add the surface EMG wristband into the loop, and the system architecture becomes even more complicated.

“That was enormously challenging for the teams to solve, and all of it just works,” says Snodgrass. “It was an amazing collaboration between the silicon team, the wireless team, and software teams all across the organization.”

Orion’s compute puck.

“With Orion, we built out a whole team with a broad diversity of engineers, and they were able to design a brand-new pipeline,” adds Clohset. “It’s a pipeline that handles six degrees of freedom movement of objects in 3D space. It uses its own custom display driver. We had to do really unique image quality corrections. And I think the challenge for us was that, because this was a zero-to-one project, there was no existing spec to tweak and improve. Everything here is completely new, so we had latitude to do everything.”

Much like the puck has some dormant features hidden beneath the surface, the custom silicon punches above its weight, too. While Orion doesn’t allow the user to take photos with its RGB cameras, the silicon is capable of supporting it, as well as codec avatars. And just as the puck helped unlock a true glasses form factor by offloading much of the compute, Orion’s custom silicon proved a necessary piece of the AR puzzle.

“To bring a zero-to-one experience like AR glasses to life, you need custom silicon—full stop,” Snodgrass explains. “Over time, if there’s a market, then silicon vendors will develop products to meet the demand. But for zero-to-one, you can’t just take something that exists off the shelf that’s intended for a different product and fit it into a new mold. You have to invest in something custom. And to bring those zero-to-one experiences to life, you need broad collaboration between software partners, industrial designers, mechanical engineers, and more.”

“By breaking free from traditional mental models, we’ve created something truly remarkable,” adds Shearer. “We believe that this computing platform represents the future of technology—one that will revolutionize the way we live, work, and interact with each other. We’re excited to be at the forefront of this innovation, pushing the boundaries of what’s possible and helping to shape the course of history.”


For more information about Orion, check out these blog posts:


Accelerating the Future

When we first began giving demos of Orion early this year I was reminded of a line that you hear a lot at Meta—in fact it was even in our first letter to prospective shareholders back in 2012: Code wins arguments. We probably learned as much about this product space from a few months of real-life demos as we did from the years of work it took to make them. There is just no substitute for actually building something, putting it in people’s hands, and learning from how they react to it.

Orion wasn’t our only example of that this year. Our mixed reality hardware and AI glasses have both reached a new level of quality and accessibility. The stability of those platforms allows our software developers to move much faster on everything from operating systems to new AI features. This is how I see the metaverse starting to come into greater focus and why I’m so confident that the coming year will be the most important one in the history of Reality Labs.

https://youtube.com/watch?v=xwMNZb6UE1o%3F%26controls%3D0%3Fautoplay%3D0%26color%3Dwhite%26cc_lang_pref%3Den%26cc_load_policy%3D0%26enablejsapi%3D1%26frameborder%3D0%26hl%3Den%26rel%3D0%26origin%3Dhttps%253A%252F%252Fwww.meta.com

2024 was the year AI glasses hit their stride. When we first started making smart glasses in 2021 we thought they could be a nice first step toward the AR glasses we eventually wanted to build. While mixed reality headsets are on track to becoming a general purpose computing platform much like today’s PCs, we saw glasses as the natural evolution of today’s mobile computing platforms. So we wanted to begin learning from the real world as soon as possible.

The biggest thing we’ve learned is that glasses are by far the best form factor for a truly AI-native device. In fact they might be the first hardware category to be completely defined by AI from the beginning. For many people, glasses are the place where an AI assistant makes the most sense, especially when it’s a multimodal system that can truly understand the world around you.

We’re right at the beginning of the S-curve for this entire product category, and there are endless opportunities ahead. One of the things I’m most excited about for 2025 is the evolution of AI assistants into tools that don’t just respond to a prompt when you ask for help but can become a proactive helper as you go about your day. At Connect we showed how Live AI on glasses can become more of a real-time participant when you’re getting things done. As this feature begins rolling out in early access this month we’ll see the first step toward a new kind of personalized AI assistant.

It won’t be the last. We’re currently in the middle of an industry-wide push to make AI-native hardware. You’re seeing phone and PC makers scramble to rebuild their products to put AI assistants at their core. But I think a much bigger opportunity is to make devices that are AI-native from the start, and I’m confident that glasses are going to be the first to get there. Meta’s Chief Research Scientist Michael Abrash has been talking about the potential of a personalized, context-aware AI assistant on glasses for many years, and building the technologies that make it possible has been a huge focus of our research teams.

Mixed reality has been another place where having the right product in the hands of a lot of people has been a major accelerant for progress. We’ve seen Meta Quest 3 get better month after month as we continue to iterate on the core system: passthrough, multitasking, spatial user interfaces, and more. All these gains extended to Quest 3S the moment it launched. This meant the $299 Quest 3S was in many respects a better headset on day 1 than the $499 Quest 3 was when it first launched in 2023. And the entire Quest 3 family keeps getting better with every update.

In 2025 we’ll see the next iteration of this as Quest 3S brings a lot more people into mixed reality for the first time. While Quest 3 has been a hit among people excited to own the very best device on the market, Quest 3S is expected to be heavily gifted in 2024. Sales were strong over the Black Friday weekend, and we’re expecting a surge of people activating their new headsets over the holiday break.

This will continue a trend that took shape over the last year: growth in new users who are seeking out a wider range of things to do with their headsets. We want these new users to see the magic of MR and stick around for the long run, which is why we’re funding developers to build the new types of apps and games they’re looking for.

There’s been a significant influx of younger people in particular, and they have been gravitating toward social and competitive multiplayer games, as well as freemium content. And titles like Skydance’s BEHEMOTH, Batman: Arkham Shadow (which just won Best AR/VR game at The Game Awards!), and Metro Awakening showed once again that some of the best new games our industry produces are now only possible on today’s headsets.

As the number of people in MR grows, the quality of the social experience it can deliver is growing in tandem. This is at the heart of what Meta is trying to achieve with Reality Labs and where the greatest potential of the metaverse will be unlocked: “the chance to create the most social platform ever” is how we described it when we first began working on it. We took two steps forward on this front in 2024: first with a broad set of improvements to Horizon Worlds, including its expansion to mobile, and second with the next-generation Meta Avatars system that lets people represent themselves across our apps and headsets. And as the visual quality and overall experience with these systems improve, more people are getting their first glimpse of a social metaverse. We’re seeing similar trends with new Quest 3S users spending more time in Horizon Worlds, making it a Top 3 immersive app for Quest 3S, and with people continuing to create new Meta Avatars across mobile and MR.

Getting mixed reality into the mainstream has helped illuminate what will come next. One of the first trends we discovered after the launch of Quest 3 last year was people using mixed reality to watch videos while multitasking in their homes, doing the dishes or vacuuming their living rooms. This was an early signal that people love having a big virtual screen they can take anywhere and place in the physical world around them. We’ve seen that trend take off with all sorts of entertainment experiences growing fast across the whole Quest 3 family. New features like YouTube Co-Watch show how much potential there is for a whole new kind of social entertainment experience in the metaverse.

That’s why James Cameron, one of the most technologically innovative storytellers of our lifetimes, is now working to help more filmmakers and creators produce great 3D content for Meta Quest. While 3D films have been produced for decades, there’s never been a way to view them that’s quite as good as an MR headset, and next year more people will own a headset than ever before. “We’re at a true, historic inflection point,” Jim said when we launched our new partnership this month.

This is happening alongside a larger shift Mark Rabkin shared at Connect this year: Our vision for Horizon OS is to build a new kind of general purpose computing platform capable of running every kind of software, supporting every kind of user, and open to every kind of creator and developer. Our recent releases for 2D/3D multi-tasking, panel positioning, better hand tracking, Windows Remote Desktop integration, and Open Store have started to build momentum along this new path. Horizon OS is on track to be the first platform that supports the full spectrum of experiences from immersive VR to 2D screens, mobile apps, and virtual desktops—and the developer community is central to that success.

The next big step toward the metaverse will be combining AI glasses with the kind of true augmented reality experience we revealed this year with Orion. It’s not often that you get a glimpse of the future and see a totally new technology that shows you where things are heading. The people who saw Orion immediately understood what it meant for the future, much like the people who saw the first personal computers taking shape at Xerox PARC in the 1970s (“within ten minutes it was obvious to me that all computers would work like this someday,” Steve Jobs later said of his 1979 demo of the Xerox Alto).

Being able to put people in a time machine and show them how the next computing platform will look was a highlight of 2024—and of my career so far. But the real impact of Orion will be in the products we ship next and the ways it helps us better understand what people love about AR glasses and what needs to get better. We spent years working on user research, product planning exercises, and experimental studies trying to understand how AR glasses should work, and that work is what enabled us to build Orion. But the pace of progress will be much more rapid from here on out now that we have a real product to build our intuition around.

This has been the lesson time and again over the last decade at Reality Labs. The most important thing you can do when you’re trying to invent the future is to ship things and learn from how real people use them. They won’t always be immediate smash hits, but they’ll always teach you something. And when you land on things that really hit the mark, like mixed reality on Quest 3 or AI on glasses, that’s when you put your foot on the gas. This is what will make 2025 such a special year: With the right devices on the market, people experiencing them for the first time, and developers discovering all the opportunities ahead, it’s time to accelerate.


Boz to the Future Episode 22: The Future of Orion and Wearables With Guest Alex Himel

Welcome back for another episode of Boz To The Future, a podcast from Reality Labs. In today’s episode, our host, Meta CTO and Head of Reality Labs Andrew “Boz” Bosworth, is joined by Meta’s VP of Wearables Alex Himel.

Himel’s team works on some of the industry’s hardest technical problems, all in the service of helping to build the next computing platform. Like Bosworth, he’s a tenured leader at Meta, having filled numerous leadership roles across the business in his 15+ years at the company.

Bosworth and Himel cover a range of topics, including wearables, augmented reality, and how AI is accelerating and improving the kinds of experiences embedded in Ray-Ban Meta glasses and future products.

They go deep on Orion and the positive reception it’s received since Mark Zuckerberg announced it at Connect. Just as Steve Jobs recognized the Alto as a turning point in computing, Bosworth asserted that, while we may disagree about the details, it’s hard to argue against this as the future. Himel and Bosworth go on to talk about the tremendous value of prototyping and building, noting how much more they’ve learned about Orion and its input paradigms in three weeks of using it than in the years spent building it.

They also talk about the crucial role of AI in wearables and how it’s becoming increasingly useful in everyday life, especially now that you can ask Meta AI about things you’re seeing and use it to set visual reminders.

Himel also talks about his deep experience in a different kind of stack: sandwiches. He spent seven years working at the famous Lange’s Little Store in Chappaqua, NY, where he grew up, and he and Bosworth get into what makes a great sandwich, where tomatoes really go, and which bread is superior.

You can tune in to Boz to the Future on Apple Podcasts, Spotify, and wherever you listen to podcasts.

You can follow Himel on Threads. You can follow Bosworth on Instagram, Twitter/X, and Threads @boztank.