In a demo, a game developer showed me a game his company had created for Spectacles. It tracks how far you walk and overlays a gamified grid on your environment. As you walk, you collect coins that accumulate along your route. RPG-style enemies also appear occasionally, which you can fight with an AR sword that you wield by waving your hand in real life. However, you must hold the sword directly in front of you to keep it within that narrow field of view, which means walking with your arm stiff and extended. The pitch is that you can play this game while out walking, which seems like a good way to accidentally hit someone on the sidewalk or get hurt chasing a coin into traffic.
Snap encourages users to avoid AR that blocks their vision at times when they shouldn't be distracted, and to pay attention to their surroundings. But there are currently no safeguards in Spectacles that pop up a warning when something is in your path, or that prevent people from wearing the glasses while driving or operating heavy machinery.
People have been seriously injured while absentmindedly playing Pokémon Go, but Snap says this is a different use case. Holding your phone directly in front of you to catch a rare Snorlax is a problem because you're blocking your view with a device. Spectacles let you see the real world at all times, even through the images projected in front of you. That said, I found that having a hologram in the middle of my vision can definitely be distracting. When I tried the walking game, my eyes focused more on the little cartoon collectibles floating around than on the ground in front of me.
This may not be a problem as long as Spectacles are only in the hands of a few developers. But Snap is moving quickly, and it wants to appeal to a broader range of buyers, likely in an effort to develop its technology before its rivals can claim the AR prize.
After all, Meta’s AR efforts seem to be further along than Snap’s: lighter frames, more robust AI on the back end, and a slightly less ugly look. But there are some key differences in how the two companies are trying to advance their burgeoning technology. Meta’s Orion glasses are actually run by three devices: the glasses on your face, a wristband with a gesture sensor, and a large disk, about the size of a portable charger, that does most of the processing for the device’s software. Snap’s glasses, by contrast, are packaged into a single device. That makes them larger and heavier than Meta’s, but it also means users won’t have to lug around extra gear when the glasses finally reach the real world.
“We think it’s interesting that one of the biggest players in virtual reality agrees with us that the future is immersive, transparent, wearable AR,” says Myers. “The glasses are quite different from the Orion prototype. They’re unique because they’re truly immersive AR glasses that are available now, and developers are already creating amazing experiences in Lens Studio. The glasses are completely self-contained, requiring no additional pucks or other devices, and are built on a foundation of proven, commercialized technology that can be produced at scale.”
Snap’s goal is to make Spectacles intuitive, easy to use, and easy to carry. It will take a while to get there, but the glasses are well on their way on all three counts. They just need to lose some weight. Maybe add some color. And keep people out of traffic.