ARKit Review: The first big step

We should probably stop calling these things in our pockets "phones" at this point.

Our world has become dominated by swipes and taps. The next generation of iPhone users will see it as their first computer, the thing they use to learn and laugh and share, in the same way the previous generation sat down at a desk and reached out to a mouse and keyboard. And just as mouse and keyboard input has become a secondary form of input for many of our day-to-day computing tasks, gazing down at flat representations of data will eventually feel like a less convenient way to interact.

By enabling tens of millions of people to have and share in hundreds of new kinds of experiences overnight, Apple clearly aims to steer us toward that future with ARKit. And like swiping or tapping, there will be no single button flip that pushes people toward this new world all at once. Instead, what we're looking at is phase one of a long plan for changing the way we interact with these things we call phones.

Explaining Augmented Reality through ARKit

In a weird way, this "review" of ARKit is a little self-defeating. Augmented Reality isn't cool or useful because it is Augmented Reality, and it's not successful just because it exists. Augmented Reality is most successful when it fades into the background and simply works; the most popular examples of this in recent memory are the Niantic games Ingress and Pokémon Go. Those games aren't popular because they are made with Augmented Reality technologies; they are popular because each encourages you to use your phone as more than just a screen with buttons on it.

ARKit as a collection of technologies encourages developers to think about the rest of the world around the phone. Instead of a big game map you have to pinch and swipe around on, what happens when the whole dining room table is the map, and you see it all by moving yourself around the table? What if you didn't have to go looking for a tape measure when trying to size up a broken piece of fence, because you have your phone and it can measure objects in the real world for you? Neither of these concepts is unique in the world of Augmented Reality, but through ARKit, developers know exactly what the iPhone and iPad are capable of and can build within those boundaries instead of testing around them.

The whole idea here is for Augmented Reality to be the How instead of the Why, and currently that means showing users the benefits of using their phone in new ways. ARKit apps largely fall into three basic interaction methods (sketched in code after this list):

  • Point me at something: Apple's software uses the camera to find a flat surface, and when one has been discovered you can "put" something on that surface.

  • Leave something in the real world: You can "put" something in the world and have it stay right where you left it. You can then walk around this thing and see every angle of it as though it were really there.

  • Adding to the real world: Your phone is aware of every motion made, and can use this information to either add or replace things in the real world for you to see, encouraging you to move around and create something new.
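
To make the first two of those interactions concrete, here is a minimal sketch of what they look like against ARKit's actual iOS 11 API. ARWorldTrackingConfiguration, ARSCNView, and plane hit tests are the real framework pieces; the view controller and tap handler are illustrative stand-ins, not any shipping app's code:

```swift
import UIKit
import ARKit

class ARViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self
        // "Point me at something": ask ARKit to hunt for flat, horizontal surfaces.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // ARKit calls this once it has discovered a surface.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        // A real-world surface now exists that content can be "put" on.
    }

    // "Leave something in the real world": anchor content at the tapped point
    // on a detected plane; world tracking keeps it there as you walk around it.
    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)
        guard let hit = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first else { return }
        sceneView.session.add(anchor: ARAnchor(transform: hit.worldTransform))
    }
}
```

The third interaction falls out of the same session: the configuration above keeps tracking every motion the phone makes, so anything you anchor stays registered to the real world as you move around and build on it.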


These options create a vibrant, healthy mix of apps on the App Store. Some apps go all in on ARKit, making new experiences that blend all of these ideas together. Some take a single concept and add it to an existing app to offer a new way to use the existing features.

The whole point is that ARKit, as a name, shouldn't be something users are actively looking for much longer, because these features will simply become ubiquitous. In fact, if you've used an iPhone 8 Plus camera, you've already used Augmented Reality without knowing it. Portrait Lighting, especially the versions of it that remove the world around you and replace it with a flat black background, is a great example of invisible Augmented Reality.

Depth, Sensors, and iPhones

Not all iPhones are created equal, and this raises a lot of questions about how different the Augmented Reality experience is when you use the iPhone 8, iPhone 8 Plus, and iPhone X. The iPhone 8 is the only one of these three phones with a single camera on the back and none of the depth-sensing goodies like Portrait Lighting. How much of the Augmented Reality experience do you really gain or lose with each of these phones?

Right now, the answer is not very much. Some of the obvious differences you already know, like Animoji and Portrait Lighting selfies on the iPhone X; the depth sensors on the front of that phone enable things you simply can't do with a single sensor. For the time being, very few apps in the App Store use the depth features afforded by an iPhone 8 Plus or iPhone X. That will likely change as developers do more with ARKit, but even then, the vast majority of ARKit apps are going to run exactly the same on all of these phones.

This is largely true of the iPhone 7 series as well. The previous generation may not be as capable when it comes to processing power, and so may take a moment or two longer to detect a surface, but playing games feels the same unless you're holding an iPhone 7 right next to an iPhone 8 and looking for differences.

This is especially true of accuracy in Augmented Reality. You won't be able to measure something better with the dual cameras, and if something goes wrong and the phone "forgets" the position of an Augmented Reality object, it goes wrong the same on the iPhone 8 and iPhone 8 Plus. Apple says the iPhone X has been custom tuned for ARKit, so it's possible that phone will be better in small ways, but we won't know for sure until the iPhone X is here.

A marathon, not a sprint

How do you measure the future? Is it through the excitement of those you allow to peek at what might come, or is it by accomplishing something previously considered impossible? Does the latter actually matter if you don't have the former? These are a small sample of the questions flying through my mind when I think about Augmented Reality.

ARKit is great because so many more people are already using it, not because it is anywhere near as complete as competing platforms. In practice, standalone ARKit apps have some fairly glaring flaws. Augmented Reality objects placed in the world frequently drift away from where they were placed. Most of the measuring apps I tested either couldn't produce the same measurement for the same object twice or didn't measure in context. Each of these experiences is affected by lighting, how steady your hand is, and what the surfaces around you are made of. Carpet, for example, is frequently difficult for ARKit to recognize as a flat surface.
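
To be fair to those apps, the flaw is in the tracking rather than the math: a tape-measure app essentially boils down to two plane hit tests and one distance calculation, roughly like this sketch (the helper names here are mine, not from any app I tested):

```swift
import UIKit
import ARKit
import simd

// Illustrative helper: the world-space position where a screen tap meets a detected plane.
func tapPosition(_ point: CGPoint, in sceneView: ARSCNView) -> simd_float3? {
    guard let hit = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first else { return nil }
    let t = hit.worldTransform.columns.3
    return simd_float3(t.x, t.y, t.z)
}

// Distance in meters between two tapped points. The inconsistency I saw comes
// from the hit tests landing in slightly different spots each time, not from
// this arithmetic.
func distanceInMeters(from a: simd_float3, to b: simd_float3) -> Float {
    return simd_distance(a, b)
}
```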

When it does work, Augmented Reality on the iPhone is a portal to another world. It's pretty similar to your world, but there are some extras that really make you wish you could move there. More than anything, I hope the next step for Apple is to make this new world something I can share with others. I want to be able to drop a virtual beacon in a crowded theater and have my friends find me by lifting their phones and seeing that beacon. I want to be able to collaborate with others in this Augmented Reality, and ARKit feels like the best option we have right now for making that something a lot of people will actually want to do.
