A year ago, AI glasses still sounded like the latest XR side-quest. Today, Meta’s Ray-Ban Display Glasses are quietly forcing hard capacity decisions in waveguides, microdisplays, and optical engines, reshaping near-eye display roadmaps all the way out to 2030.
TrendForce’s latest near-eye research paints a clear before-and-after picture of Meta’s program. Meta initially treated Ray-Ban Display Glasses as a cautious market test, placing orders for around 80,000 units of Lumus geometric waveguide optics. Within six months of launch, Meta had lifted those waveguide orders by 87.5% to roughly 150,000 units, as channel sell-through outpaced internal forecasts. Orders for other key components, including OmniVision’s LCoS near-eye microdisplays, GoerOptics’ optical engine assemblies, and Schott’s waveguide-related glass parts, climbed in parallel. Upstream display and assembly partners are now re-tuning production lines and yields for higher volumes, aligning around Meta’s phased target of shipping a cumulative 500,000 Ray-Ban Display units over roughly the next two years.
TrendForce has responded by revising its AR glasses outlook upward. Global AR glasses shipments are now forecast at about 950,000 units in 2026, a 53% year-on-year increase, with Meta as the primary driver. With more major vendors entering the space, TrendForce projects AR glasses shipments to reach around 32.11 million units in 2030. For near-eye displays, this is the inflection point between an extended pilot and a real category.
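Taking the figures above as given (the 80,000-to-150,000 waveguide order revision, and TrendForce's 950,000-unit 2026 and 32.11-million-unit 2030 forecasts), a quick back-of-the-envelope calculation shows what growth rates they imply; the derived percentages below are illustrative arithmetic, not additional TrendForce data:

```python
# Sanity-check the growth figures cited in the article.
# Inputs are the reported numbers; the derived rates are illustrative.

initial_orders = 80_000      # initial Lumus waveguide order
revised_orders = 150_000     # revised order within six months of launch
order_increase = revised_orders / initial_orders - 1
print(f"Waveguide order increase: {order_increase:.1%}")  # 87.5%, matching the cited figure

units_2026 = 0.95e6          # forecast AR glasses shipments, 2026
units_2030 = 32.11e6         # forecast AR glasses shipments, 2030
years = 2030 - 2026
cagr = (units_2030 / units_2026) ** (1 / years) - 1
print(f"Implied 2026-2030 CAGR: {cagr:.0%}")  # roughly 141% per year
```

The implied compound growth rate, on the order of 140% per year from 2026 to 2030, is the scale of ramp that waveguide and microdisplay suppliers would need to plan capacity against.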
If Meta’s internal caution defined the first half-year, external coverage suggests that today’s constraint is supply, not demand. Reports indicate that Meta has paused or delayed an international rollout of Ray-Ban Display Glasses to markets including the UK, France, Italy, and Canada, citing extremely limited inventory and long waitlists. Meta is prioritizing the US market while production catches up, effectively rationing availability to match waveguide and module capacity. At the same time, financial and supply-chain reporting indicates that Meta and EssilorLuxottica have discussed scenarios in which annual Ray-Ban smart glasses output could roughly double toward the 20-million-unit range if supply and demand both justify it. Those numbers sit well above TrendForce’s current AR glasses forecast path, underlining the gap between conservative industry baselines and what a single aggressive platform owner might eventually try to pull through the stack.
Technically, Ray-Ban Display marks a shift away from the full-field 3D AR ideal toward a lighter-weight, HUD-centric near-eye model, with direct implications for display choices. Meta’s own CES 2026 communications describe Ray-Ban Display as its most advanced AI glasses yet, combining a monocular, right-eye display with Meta AI for live translation, navigation, visual query, and creator-oriented teleprompter modes. Reports describe a private near-eye image with under 2% light leakage, driven by an optical engine built around LCoS and a Lumus geometric waveguide, housed in a Ray-Ban-style frame. A surface-EMG Meta Neural Band wrist device adds fine-grained input via subtle finger motions and air-typing, sidestepping the need for large touch surfaces or mid-air gesture cameras. Early reviews frame the value as almost magical precisely because the display is glanceable and context-aware rather than immersive, an always-on, near-eye AI HUD rather than a full mixed-reality headset. That fits with TrendForce’s categorization of Ray-Ban Display as AR/AI glasses, where display technology is being optimized for readability, power, and social acceptability, not for wide-FOV, stereoscopic 3D. The key takeaway for the display industry is that near-eye display success here is tied to invisibility as much as to image quality: the less the hardware calls attention to itself, the better.
Behind the product sits a long-term platform bet. Meta and EssilorLuxottica extended their smart glasses agreement in 2024 into a long-term partnership running into the next decade, covering multiple generations of Ray-Ban branded and other smart eyewear. EssilorLuxottica has told investors that smart eyewear is a strategic growth vector, with wearable visual technology expected to define a new category in the optical market. Mark Zuckerberg has repeatedly said that Meta wants glasses to become the next major computing platform, with fashion and comfort as non-negotiable requirements rather than afterthoughts. This aligns with TrendForce’s observation that, as the novelty effect fades, buyers will increasingly prioritize comfort, pricing, and a robust content and AI ecosystem. Brands that can fuse those elements with credible display and optics roadmaps, rather than just ship a tech demo, are best placed to capture the multi-decade growth TrendForce projects.
TrendForce situates Meta’s progress within a broader competitive wave. Samsung and Google are both preparing AR glasses launches, and other consumer electronics players are quietly lining up their own AI glasses concepts. Smart glasses shipments more than doubled year-on-year into 2025, with AI-enabled models dominating the mix. For the display supply chain, the message is simple: Meta’s Ray-Ban Display Glasses are no longer a one-off curiosity. They are the leading edge of a multi-vendor, multi-platform near-eye market that is already stressing waveguide and microdisplay manufacturing.
For panel, optics, and module players, this story translates into three concrete imperatives. First, yield and scalability: waveguides, LCoS, and compact optical engines must move from boutique volumes to hundreds of thousands and then millions of units without catastrophic yield loss. Second, BOM discipline: as Meta and others push toward more affordable SKUs, near-eye display solutions must sustain ASPs that fit within mass-market eyewear price bands, not headset-class premiums. Third, ecosystem mindset: display performance will be judged in the context of AI features, latency, and comfort, not just nits or pixels-per-degree, reshaping how premium is defined in near-eye applications.
Meta’s Ray-Ban Display Glasses have effectively turned near-eye displays from a technology showcase into a supply-constrained business with a visible multi-year ramp. The task now is to map where in the stack that ramp intersects individual roadmaps, and how fast suppliers can move before the next generation of AI glasses redefines the baseline again.
