Meta Connect 2025: $799 AR glasses, neural band, and the race Apple can't afford to lose
Sep 18, 2025
Key Points
- Meta's $799 Ray-Ban smart glasses with neural wristband represent a decisive hardware move Apple cannot ignore; Meta is subsidizing aggressively to establish platform dominance before competitors ship consumer products.
- The neural band distributes processing load to the wrist, enabling all-day wearability and opening a monetizable layer between consumer intent and merchant fulfillment through agentic commerce.
- Apple faces a timeline gap: its first display glasses won't ship until 2027 at the earliest, while Meta will iterate twice more and saturate the market, backed by a billion-dollar capital advantage Apple cannot match.
Summary
Meta's Ray-Ban smart glasses with display, priced at $799, arrive with a neural wristband for input, real-time AI translation, and video calling. The glasses weigh 69 grams, lighter than the previous Orion prototype, and distribute processing across the body rather than concentrating weight on the bridge of the nose. This design choice is critical for all-day wearability.
The $799 price point surprised observers including Ben Thompson and Mark Gurman. Orion reportedly cost $10,000 to manufacture, making Meta's ability to hit this price on a consumer product with comparable technology a significant engineering achievement. Meta is likely willing to subsidize hardware—a $200-per-unit loss on a million-unit run costs $200 million, manageable for Meta's cash position—because the real value lies in owning the next computing platform, not hardware margins.
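The subsidy arithmetic is simple enough to sanity-check in a few lines. A minimal sketch, using the article's illustrative figures (the $200 per-unit loss and one-million-unit run are hypotheticals, not disclosed Meta numbers):

```python
def total_subsidy(loss_per_unit: int, units: int) -> int:
    """Total cash cost of selling hardware below manufacturing cost."""
    return loss_per_unit * units

# Article's hypothetical: $200 loss per unit on a capped 1M-unit run.
cost = total_subsidy(loss_per_unit=200, units=1_000_000)
print(f"${cost:,}")  # $200,000,000
```

Even doubling both figures keeps the total under $1 billion per generation, which is the scale the article argues Meta can absorb to own the platform layer.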
Demo failures and product confidence
Zuckerberg's botched WhatsApp video call demo on stage generated immediate commentary, but the failure itself matters less than the confidence required to attempt it. Meta executives conducted the demos successfully months before the keynote. The on-stage failures likely stemmed from simultaneous live streaming taxing the same hardware subsystems. The company is willing to risk public failure in service of authenticity, a contrast to Apple's move toward fully pre-recorded demos and Google's practice of speeding up LLM response times in real-time video demos.
Three structural insights
Wearables are underhyped relative to the AI hype cycle. While everyone fixates on large language models and AGI narratives, Meta is executing in a space with lower expectations and more tangible near-term returns. The glasses work—cameras function, AI translation runs in real time, battery lasts an hour under continuous load—without requiring breakthrough capability.
Personal super intelligence is emerging as an agentic commerce play. The neural band lets users point at products and order them. Integration with Shopify and brand direct-to-consumer flows is likely coming. Sam Altman already showed a ChatGPT dashboard with an Orders tab. This is extremely monetizable for whoever controls the intermediary layer between consumer intent and merchant fulfillment.
Cinema will be the killer app for VR. James Cameron, who has tested Quest hardware, validates the thesis that watching Avatar in 3D plays better in VR than on a television. Meta is working toward a Quest 4 with display fidelity at or beyond Apple Vision Pro's level, stripped of Vision Pro's external screen, tethered battery pack, and other components Meta considers unnecessary. A $500 VR cinema device competing directly with 65-inch TVs is plausible within the same price envelope.
The neural band
The wristband input device is the event's most overlooked announcement, second in importance only to the glasses themselves. Removing sensors from the face and placing them on the wrist distributes physical load and eliminates the need for dedicated hand-tracking cameras on the headset. Input gestures, captured by the wristband's sensors, are familiar iPhone-like motions requiring a minimal learning curve. Long-term, Meta plans to open the neural band to third-party developers and potentially make it platform-agnostic, though that direction remains unclear.
Voice input will dominate over handwriting. Users already dictate routinely into phones, and low-volume, near-whisper speech capture works reliably. Handwriting on a wristband may account for perhaps 20% of total input, useful for brief confirmations like yes or no and quick emoji selection, but it is unlikely to become primary.
Apple's timeline problem
Mark Gurman reports Apple will likely announce its first non-display smart glasses in late 2026 or early 2027, with display glasses still years away. Apple is developing camera-equipped glasses with tighter Bluetooth integration—essentially better-built Ray-Ban competitors powered by Siri and Apple Intelligence. Siri remains unreliable as a query interface, with users still opening ChatGPT manually, and Apple Intelligence's feature rollout has been delayed and fragmented. Meanwhile, Meta will iterate twice or more before Apple ships its first consumer product.
Meta can absorb billion-dollar hardware subsidies to saturate the market and establish software moats through integration with Instagram, WhatsApp, and an ecosystem of apps. Google is talking about display glasses too, but execution matters more than announcements.
Hardware as loss leader
Meta has no need to make money on hardware margins. By capping production at one million units, subsidizing aggressively, and cycling to the next generation every 12-18 months, the company avoids both supply constraints and margin pressure. When V2 ships—lighter, thinner, faster display—the price creeps up gradually. The consumer never sees a dramatic price jump; instead, they perceive continuous improvement. Apple, by contrast, needs hardware to be profitable and typically reaches consumer price points only after years of iteration. In a race where first-to-scale matters, Meta's capital advantage is decisive.