It’s always inspiring to watch an Apple presentation. The unique synthesis of cutting-edge technology, ethereal design and pop-culture coolness … it’s hard not to go into fangirl mode. These guys are masters of the art. An Apple presentation is the experience of opening presents at Christmas combined with watching a chess grandmaster completely dominate the game.
But lurking beneath the surface of the Apple presentation were a number of hints that point to an interesting future. Nearly every big technical leap had to do with chips, and many of the products announced looked like a step towards AR. The iPhone X today may be fairly incremental, but it’s a starting point for what the iPhone will evolve into over the next ten years.
Almost completely glossed over in the presentation was the fact that Apple has created its own GPU. This is an essential building block for any AR or VR headset it might release down the line, and it’s significant because Apple has historically lagged behind Microsoft in graphics.
The other big chip mentioned in the presentation was the one used to do the machine learning behind face recognition. A good portion of the iPhone X demo was about FaceID, which uses face recognition to unlock the phone. The more interesting application, however, was using face data to animate emojis. This is clearly a testbed for future augmented reality software.
A big change with the iPhone X was its new bar of sensors — infrared camera, flood illuminator, proximity sensor, ambient light sensor, front camera, dot projector, speaker and microphone. They’re being used for FaceID now, but really they’ll be useful for the SLAM (simultaneous localization and mapping) that AR software needs to accurately construct a model of the real world. It’s sort of like Google Tango — though it’s unclear whether this sensor bar will stay on the iPhone or be incorporated into a future headset.
Apple Watch and AirPods seem to have been promoted to first-class products, moved from ‘experimental’ to ‘scaling’ mode. It’s interesting that the new Apple Watch has built-in cellular — that means any future headset could access cellular functionality over Bluetooth via the watch, allowing the headset itself to be lighter. It’s likely Apple will want to unbundle more of the cumbersome features of an AR headset into other wearables.
There was one explicit AR gaming demo in the Apple presentation, and it actually wasn’t that impressive. Mobile AR is likely going to be a test bed for developers until they get the real thing. When, sometime in the next few years, Apple releases a headset, there will already be a community of developers who know how to use ARKit.
The thing I liked most about this presentation was that it tied together a bunch of hardware and software trends that have been building in tech over the past few years: wearables, sensors, machine learning, graphics and VR.
We’re in a bit of a lull in the tech cycle, and this presentation finally gave us an answer to what much of the past few years in tech has been leading up to. I’ve been waiting for a tech trend to get excited about again, and now I feel that trend is AR.