Meta Connect 2025: VR still underwhelms; will smart glasses alternatively thrive?

Does an integrated display, coupled with a clear view of what’s around you IRL (versus VR headsets), get smart glasses closer to being “smartphone killers”?


For at least as long as Meta’s been selling conventional “smart” glasses (with partner EssilorLuxottica, whose eyewear brands include the well-known Oakley and Ray-Ban), rumors have suggested that the two companies would sooner or later augment them with lens-integrated displays. The idea wasn’t far-fetched; after all, Google Glass had one (standalone, in that case) way back in early 2013:

Meta founder and CEO Mark Zuckerberg poured fuel on the rumor fire when, last September, he demoed the company’s chunky but impressive Orion prototype:

And when Meta briefly, “accidentally” (call me skeptical, but I always wonder how much these situations really are corporate mess-ups versus intentional leaks) published a promo clip last week for (among other things) a display-inclusive variant of its Meta Ray-Ban AI glasses, we pretty much already had our confirmation ahead of last Wednesday evening’s keynote, in the middle of the 2025 edition of the company’s yearly Connect conference:

Yes, dear readers, as of this year, I’ve added yet another (at least) periodic tech-company event to my ongoing coverage suite, as various companies’ technology and product announcements align ever more closely with my editorial “beat” and associated readers’ interests.

But before I dive fully into the details of those revolutionary display-inclusive smart glasses, and in the spirit of crawling-before-walking-before-running (and hopefully not stumbling at any point), I’ll begin with the more modest evolutionary news that also broke at (and ahead of) Connect 2025.

Smart glasses get sporty

In the midst of my pseudo-teardown of a transparent set of Meta Ray-Ban AI Glasses, published earlier this summer:

I summarized the company’s smart glasses product-announcement cadence up to that point. The first-generation Stories, introduced in September 2021:

was, I wrote, “fundamentally a content capture and playback device (plus a fancy Bluetooth headset to a wirelessly tethered smartphone), containing an integrated still and video camera, stereo speakers, and a three-microphone (for ambient noise suppression purposes) array.”

The second-generation AI Glasses, unveiled two-plus years later in October 2023, which I own (two sets of them, in fact, both Transitions-lens equipped):

make advancements on these fundamental fronts… They’re also now moisture (albeit not dust) resistant, with an IPX4 rating, for example. But the key advancement, at least to this “tech-head”, is their revolutionary AI-powered “smarts” (hence the product name), enabled by the combination of Qualcomm’s Snapdragon AR1 Gen 1 SoC, Meta’s deep learning models running both resident on the glasses and in the “cloud”, and speedy bidirectional glasses-to-cloud connectivity. AI features include real-time Live Translation between languages, plus AI View, which visually identifies and audibly provides additional information about objects around the wearer.
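
Meta hasn’t documented how queries get split between the resident models and the cloud, but conceptually a hybrid arrangement like the one described above tends to look something like the following sketch. All names, thresholds, and logic here are my own hypothetical placeholders, not Meta’s actual software stack:

```python
# Conceptual sketch of a hybrid on-device/cloud AI dispatch loop; purely
# illustrative, with hypothetical names and thresholds (not Meta's software).
from dataclasses import dataclass

@dataclass
class LocalResult:
    answer: str
    confidence: float       # 0.0 (unsure) .. 1.0 (certain)

def run_resident_model(query: str) -> LocalResult:
    """Stand-in for a small model running on the glasses' SoC: fast and
    private, but limited in knowledge and context."""
    return LocalResult(answer=f"[on-device guess for: {query}]", confidence=0.4)

def run_cloud_model(query: str) -> str:
    """Stand-in for a larger cloud-hosted model reached over the wireless
    link: slower and connectivity-dependent, but far more capable."""
    return f"[cloud answer for: {query}]"

CONFIDENCE_CUTOFF = 0.8     # hypothetical escalation threshold

def answer(query: str, online: bool = True) -> str:
    """Try the resident model first; escalate to the cloud only when needed."""
    local = run_resident_model(query)
    if local.confidence >= CONFIDENCE_CUTOFF or not online:
        return local.answer              # good enough, or no connectivity
    return run_cloud_model(query)        # escalate the harder cases

if __name__ == "__main__":
    print(answer("What kind of flower am I looking at?"))
```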

And back in June (when that piece was published; I actually wrote it in early May), I was already teasing what was to come:

Next-gen glasses due later this year will supposedly also integrate diminutive displays.

More recently, on June 20 (just three days before my earlier coverage appeared in EDN, in fact), Meta and EssilorLuxottica released the sports-styled, Oakley-branded HSTN, a new member of the AI Glasses product line:

Battery life was nearly 2x longer: up to eight hours under typical use, and 19 hours in standby. The glasses charged to 50% in only 20 minutes. The battery case now delivered up to 48 operating hours’ worth of charging capacity, versus 36 previously. The camera, still located in the left endpiece, now captured up to 3K-resolution video (albeit the same 12 Mpixel still images as before). And the price tag was also boosted: $499 for the initial limited-edition version, followed by more mainstream $399 variants.

A precursor retrofit and sports-tailored expansion

Fast forward to last week, and the most modest news coming from the partnership is that the Oakley HSTN enhancements have been retrofitted to the Ray-Ban styles, with one further improvement: 1080p video can now be captured at up to 60 fps in the Gen 2 versions. Cosmetically, they look unchanged from their Gen 1 precursors. And speaking of looks, trust me when I tell you that I don’t look nearly as cool as any of these folks do when donning them:

Meta and EssilorLuxottica have also expanded the Oakley-branded AI Glasses series beyond the initial HSTN style to the Vanguard line, in the process moving the camera to the center, above the nosepiece, but otherwise sticking with the same bill-of-materials list, and therefore specs, as the Ray-Ban Gen 2s:

And all of these, along with (via a welcome retrofit) the Gen 1 Ray-Ban AI Glasses I own, will support a coming-soon feature called conversation focus, which “uses the glasses’ open-ear speakers to amplify the voice of the person you’re talking to, helping distinguish it from ambient background noise in cafes and restaurants, parks, and other busy places.”
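
Meta hasn’t said how conversation focus is actually implemented, but the multi-microphone array mentioned earlier is the classic enabler of direction-selective pickup via beamforming. Here’s a minimal delay-and-sum sketch of the general idea; the array geometry, spacing, and sample rate are my assumptions for illustration only, and Meta’s real processing is surely far more sophisticated:

```python
# Minimal delay-and-sum beamformer: illustrates how a multi-microphone array
# can emphasize sound from one direction (e.g., the person in front of you).
# All parameters are assumptions for illustration; not Meta's implementation.
import numpy as np

FS = 16_000                  # sample rate, Hz (assumed)
MIC_SPACING_M = 0.05         # spacing of a hypothetical linear 3-mic array, m
SPEED_OF_SOUND = 343.0       # m/s

def delay_and_sum(mic_signals: np.ndarray, steer_deg: float) -> np.ndarray:
    """Steer a linear array toward steer_deg (0 = straight ahead).

    mic_signals: array of shape (num_mics, num_samples).
    Returns the beamformed mono signal.
    """
    num_mics, num_samples = mic_signals.shape
    # Arrival-time differences of a plane wave from steer_deg, relative to mic 0
    arrival_s = (np.arange(num_mics) * MIC_SPACING_M *
                 np.sin(np.radians(steer_deg)) / SPEED_OF_SOUND)
    # Compensating delays that time-align all channels for that direction
    comp = np.round((arrival_s.max() - arrival_s) * FS).astype(int)

    aligned = np.zeros_like(mic_signals, dtype=float)
    for m in range(num_mics):
        d = comp[m]
        aligned[m, d:] = mic_signals[m, :num_samples - d]
    # Coherent sum reinforces the steered direction, attenuates off-axis sound
    return aligned.mean(axis=0)

# Example: three placeholder channels, steered straight ahead
rng = np.random.default_rng(0)
mics = rng.standard_normal((3, FS))          # 1 second of stand-in audio
focused = delay_and_sum(mics, steer_deg=0.0)
```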

AI on display

And finally, what you’ve all been waiting for: the newest, priciest (starting at $799) Meta Ray-Ban Display model:

Unlike last year’s Orion prototype, they’re not full AR; the display area is restricted to a 20-degree region in the lower-right portion of the right eyepiece, with 600×600 pixel resolution and a 30 Hz refresh rate. But with 42 pixels per degree (PPD) of density, it’s still capable of rendering crisp, albeit terse, information; keep in mind how close to the user’s right eyeball it is. And thanks to its coupling with Transitions lenses, early reviewer feedback suggests that it’s discernible even in bright sunlight.
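
Those resolution and density figures reconcile, by the way, if the 20-degree field of view is measured diagonally; that’s my assumption, since Meta doesn’t spell it out. Here’s the quick back-of-the-envelope check:

```python
import math

# Published Meta Ray-Ban Display specs (as quoted above)
pixels_per_axis = 600      # 600 x 600 display
diag_fov_deg = 20.0        # ASSUMPTION: the 20-degree figure is the diagonal FOV

# For a square display, per-axis FOV = diagonal FOV / sqrt(2)
axis_fov_deg = diag_fov_deg / math.sqrt(2)       # ~14.1 degrees
ppd = pixels_per_axis / axis_fov_deg             # ~42.4 pixels per degree

print(f"{axis_fov_deg:.1f} deg per axis -> {ppd:.1f} PPD")
# If the 20 degrees were instead per-axis, density would be only 600/20 = 30 PPD,
# so the diagonal interpretation best matches the quoted 42 PPD figure.
```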

Equally interesting is its interface scheme. While I assume that you can still control them using your voice, this time Meta and EssilorLuxottica have transitioned away from the right-arm touchpad to a gesture-discerning wristband (which comes in two color options):

based on very cool (IMHO) surface EMG (electromyography) technology:

Again, the initial reviewer feedback that I’ve seen has been overwhelmingly positive. I’m guessing that, at least in this case (Meta’s press release makes it clear that Orion-style full AR glasses with two-hand gesture interface support are still under active development), the company went with the wristband approach both because it’s more discreet in use and to optimize battery life. An always-active front camera, after all, would clobber battery life well beyond what the display already seemingly does; Meta claims six hours of “mixed-use” battery life.