Prior to WWDC, there were reports and analyst predictions that Apple would eventually ship its own version of Google Glass as an AR accessory. Then Apple introduced ARKit as a tool to help developers create augmented reality experiences through iOS apps starting with iOS 11. Now what was previously a rather wild rumor is starting to make more sense…
Bloomberg published a detailed report earlier this year on Apple’s plans for AR and smart glasses, saying that augmented reality features would come first to the iPhone and later to smart glasses.
Evidence of potential AR glasses hardware in testing was later discovered through an accident report that was unintentionally shared. While the glasses product is still reportedly over a year away from debuting, Apple’s ARKit introduction and other recent activity may hint at how Apple’s AR glasses could succeed in a way that Google Glass did not.
It could start with design. Take Apple Watch, for example: Apple showed it can give a wearable gadget a traditional look and create something people will actually wear. Google Glass raised privacy concerns with its camera, but its cyborg look certainly didn’t help acceptance either.
Apple Watch blended a gadget with a traditional object in part by launching with a huge variety of styles through case colors and straps. Apple’s AR glasses could do this with frame styles to achieve a similar effect.
Apple Watch isn’t an add-on to your current watch, however; it replaces it entirely. The same would need to be true for Apple’s AR glasses, which means prescription lenses would be necessary from the start.
Appearances aside, Apple’s approach seems to be establishing the need for AR glasses before ever shipping the product. Google Glass faced design challenges and lacked a clear purpose outside of niche uses and trivial tinkering.
With ARKit, Apple is changing augmented reality from something mysterious and conceptual into something anyone with an iPhone or iPad will be able to understand and see applied. The feature hasn’t officially shipped yet, and we’re already seeing real-world examples in the wild.
Developers are already using ARKit to get experience developing around AR, and customers will discover examples of AR in iPhone apps. The next step is wanting an experience fully optimized for AR. The iPhone is portable and the iPad is immersive, but a wearable heads-up display is likely the ideal form factor.
While Google Glass ultimately may have been ahead of its time, it was Google’s order of operations that probably hurt the product the most. Google introduced the futuristic gadget first and asked developers to build the experiences afterward. Apple is giving developers the tools to build the experiences first, on a platform that’s already huge. That may show us why we should care about AR in the first place, and therefore want a better form factor for it.
I still think there’s a possibility that AR glasses won’t be a mainstream product (wearing something on your face is a bigger ask than wearing something on your wrist), but there is real potential for a hardware accessory optimized for ARKit, with glasses as the form factor. Apple’s AR glasses could ultimately target specific markets and industries, similar to how the Apple Pencil appeals primarily to designers, illustrators, and people who use the iPad for heavy productivity.
If ARKit does push Apple into shipping glasses, I can see the convenience of staying connected to the data we already get from our smartphones and watches without being taken out of our environment.
For example, I’ve thought about the benefit of an Apple version of Google Glass when walking through the airport with coffee in one hand and luggage in the other. I get an iMessage that I need to see, but glancing at my wrist means stopping and setting my coffee down, or risking a spill. A totally hands-free, visual solution would be the most convenient option here, and it does feel like the future.
Scott Forstall hinted at the very need for something like Apple AR glasses earlier this week when he was asked about AR, VR, and their future, saying that both technologies are interesting but will require the right form factor to take off.
Does anyone think that form factor is anything other than a heads-up display? And could that go mainstream without blending in with what we currently know as glasses?