That spatial awareness is fundamental to the whole augmented-reality experience. Snap engineers know this as well as anyone, since the company built seemingly cheeky but technologically impressive AR filters in Snapchat long before giants like Apple and Google introduced their AR frameworks for phones. Does Snap’s AR tech port well to glasses? Yes and no.
The glasses themselves are a stark interpretation of “wearable” technology. The extra-wide frames horizontally dwarfed my face when I tried them on at a spacious house in Silicon Valley in late April. I felt and heard my eyelashes brushing up against the lenses like mini squeegees. Where Snap’s earlier Spectacles were playful, with round frames and colorful rims around the camera lenses, these hard-angled specs are purposeful. (My editor thought they looked cool in the selfie I sent him; I personally wouldn’t wear them as anything other than a statement of Snap’s newest thing.)
XR developer Don Allen Stevenson III wears the new Spectacles.
Photograph: Phuc Pham
AR creator Clay Weishaar.
Photograph: Phuc Pham
“Our vision was to create a device that’s expressive, thought-provoking, and maintains a lightweight sunglasses form factor,” Lauryn Morris, a product strategy manager at Snap, told me over a video call. Thought-provoking, sure—I’m still thinking about the best way to describe them—but at 4.7 ounces, they’re more than double the weight of your standard Ray-Ban Wayfarers.
The weight is one of the many trade-offs of AR glasses; they’re packed with tech. The lenses are stereo color displays that automatically adjust for brightness, up to 2,000 nits. The imagery that appears in front of the wearer’s eyes is generated by dual optical waveguides, and there are two RGB cameras built into the specs to capture the peripheral world. Add to that four built-in mics for voice control, a pair of stereo speakers for spatial audio, and a touchpad on the right temple for navigating app interfaces. The glasses are capable of inside-out tracking—which means they’ll “see” your hands as you gesture through the air and interpret their movements—but none of the earliest AR Lenses for Spectacles use this function yet.
Snap didn’t build these AR specs entirely from scratch (although, according to Spiegel, they were conceived of years ago, back when the first Spectacles were being mapped out). They’re built on Qualcomm’s XR1 platform, a dedicated system-on-a-chip and a series of reference designs for “extended reality” glasses. Snap is touting its custom Spatial Engine as a unique piece of technology, software that fuses together all of the positioning information being sucked in by the glasses to make apps feel realistic.
But those trade-offs, the compromises every glasses maker seems to make in this awkward stage of AR, are part of the reality of augmented reality. The 26.3-degree diagonal field of view on the glasses is smaller than the FOV on other head-up displays, such as Magic Leap and Microsoft HoloLens, and the touchpad needs some fine-tuning. Fortunately, there’s also a voice control option, which performs well when it works.
The battery on this developer device lasts for just 30 minutes. Snap’s thoughtfully designed carrying case doubles as a portable charger, but good luck wearing them even that long: In the short hour I wore the Spectacles, I saw three warnings that the glasses had overheated. Also, there are no physical volume buttons on the specs, so you’ll have to thumb through the Snap smartphone app to control the sound levels.
The new Spectacles are not for sale just yet. Snap is making them available to AR developers who apply to be part of its early testing program.
Photograph: Natalja Kent