
ZDNET’s key takeaways
- Lumus’ ZOE prototype pushes past a 70-degree field of view and feels like a wraparound display.
- The updated Z-30 prioritizes everyday readability.
- Z-30 2.0 targets thinner, lighter waveguides to support slimmer frames that look more normal.
Lumus grabbed attention at CES 2026, showcasing waveguide breakthroughs that push field of view and thinness further than I've seen before in a smart glasses form factor. I spent hands-on time with each of its big announcements, including a fragile prototype held together by tape at the edges.
The company is building directly on its biggest success to date: supplying the waveguide technology inside Meta's Ray-Ban Display glasses and proving that geometric waveguides work at consumer scale with standard glass. At CES, Lumus showcased a ZOE prototype with more than a 70-degree field of view, an optimized Z-30 with 40% more brightness, and a Z-30 2.0 preview that's 40% thinner. David Goldman, VP of marketing, walked me through each demo with clear enthusiasm about the progress Lumus is making.
Also: CES 2026: 7 biggest news stories across TVs, laptops, and other weird gadgets you missed
Meta Ray-Ban Display uses Lumus 20-degree waveguide lenses that deliver 5,000 nits of brightness to compete with bright daylight, helping to validate consumer appetite and expectations for AR glasses. "The feedback on the display side has been incredible," Goldman said. As evidenced by my time on the show floor, the success of the displays inside Meta's glasses is driving other companies to chase similar form factors and solutions.
Z-30 impressions
We started the demos with the optimized Z-30, an 11-gram optical engine that hits 8,000 nits per watt of efficiency. Test patterns blanketed the 30-degree field of view with sharp text against white backgrounds. I could easily read dense 8.5-point fonts from Alice in Wonderland and saw no distortions or color shifting. A Muppets demo floated transparently with vivid colors that really popped, and Goldman confirmed from his own experience that his eyes remained clearly visible through the lens, even while he watched Avatar in 3D.
Also: CES 2026: These 7 smart glasses caught our eye – and you can buy this pair now
The 1:1 aspect ratio made images feel significantly larger despite a modest increase in field of view compared to Meta's 20 degrees. The Z-30 2.0 preview has 40% thinner glass and is 30% lighter, potentially reducing manufacturing steps while increasing yield from raw materials. This tech targets all-day notification, navigation, and translation use cases, where lightweight comfort matters most. I was impressed by how vibrant and clear these lenses looked while also being incredibly thin and light.
ZOE breakthrough tech
The star of the show was the 70-degree ZOE prototype. In fact, Goldman handed the sole surviving unit to me with extreme care, as the company had already lost two. “This is the earliest prototype that we’re showing publicly of 70 degrees,” he explained. A cyberpunk airship immediately created wraparound immersion, filling my vision. I was taken aback by how much of my view was covered with such a simple and standard eyeglasses form factor.
Also: I wore the world’s first HDR10 smart glasses, and they can easily replace my home TV
The test patterns delivered sharp clarity and vivid colors at 1080p resolution, appearing to me to be even crisper than the Z-30 despite lower brightness. The prototype uses basic glass and optics, proving that a wide field of view can work without exotic materials that can drive up costs. ZOE targets immersive spatial entertainment, multi-app productivity, gaming, and even defense applications where soldiers need maximum situational awareness.
I could easily see myself using this prototype to relax and watch a 3D movie. Though the display is see-through, I was struck by just how much of the real world the on-display content obscured. At first, I was concerned about safety issues that might arise when displays occupy this much of your view of the real world. On further thought, though, I began to see the benefits of that wide coverage, particularly as systems become more capable of embedding imagery into and around real-world objects. Not every pixel needs to be lit at all times, and a wider field of view like this means a broader canvas on which AR elements can operate.
I asked Goldman about prescription lenses, since 70% of people need vision correction. Lumus waveguides can bond prescription lenses directly to the glass, skipping the air gaps that let dust and moisture sneak into competing designs. Its mirror-based approach keeps light traveling straight, which holds onto true colors and saves battery compared to diffractive rivals. AR coatings drop forward leakage to almost zero, so bystanders are unlikely to see glowing rainbows that give away what is being shown inside the glasses.
Also: These smart glasses beat the Meta Ray-Bans for me with useful features and a cheaper price
Lumus now covers everything from 20-degree glances to its early work on 70-degree immersion, meaning manufacturers can pick the tech that fits each use case and form factor. Some of these demos felt endearingly fragile, but the optics performed beautifully, leaving me genuinely excited about how capable normal-looking AR glasses are becoming.