Introducing ARKit

Apple has just released an augmented reality (AR) framework. It will let millions of app developers build AR experiences on iOS for hundreds of millions of devices, making ARKit + iOS the biggest AR platform out there. Even Pokémon GO will use it.

Visual Inertial Odometry

The magic that powers ARKit is called Visual Inertial Odometry, or VIO from now on. ARKit uses VIO to accurately track the world around the device and create an engaging experience. VIO combines information from the iOS device's motion-sensing hardware with computer-vision analysis of the scene visible to the device's camera. These two inputs let the device sense how it moves within a room with a high degree of accuracy, and without needing to calibrate anything. It just works!

This technology fulfills the basic requirement for any AR experience, and the defining feature of ARKit: the ability to create and track a correspondence between the real world the user inhabits and a virtual space where you place visual content. When your app displays that content together with a live camera image, the user experiences augmented reality: the illusion that your virtual content is part of the real world. Read more in Understanding Augmented Reality, Apple's Developer Reference guide to concepts, features, and best practices for building great AR experiences.

Scene Understanding and Lighting Estimation

With ARKit, iPhone and iPad can analyze the scene presented by the camera view and find horizontal planes in the room. ARKit detects horizontal planes such as tables and floors, and can also track and place objects on smaller feature points. ARKit additionally uses the camera sensor to estimate the total amount of light available in a scene and applies a matching amount of lighting to virtual objects, as the sketch below illustrates (plane detection shows up in the session examples further down).
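For the lighting side, here is a minimal sketch of reading a frame's light estimate and applying it to a SceneKit light. The applyLighting helper is a hypothetical name of my own, not an ARKit API:

```swift
import ARKit
import SceneKit

// Sketch: each captured ARFrame carries an optional light estimate for the scene.
// applyLighting is a hypothetical helper, not part of ARKit.
func applyLighting(from frame: ARFrame, to light: SCNLight) {
    if let estimate = frame.lightEstimate {
        // ambientIntensity is expressed in lumens; about 1000 is neutral lighting.
        light.intensity = estimate.ambientIntensity
    }
}
```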

Every AR experience built with ARKit requires a single ARSession object. If you use an ARSCNView or ARSKView object to easily build the visual part of your AR experience, the view object includes an ARSession instance. If you build your own renderer for AR content, you’ll need to instantiate and maintain an ARSession object yourself.
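For instance, a minimal sketch of running and pausing the view's built-in session, assuming a sceneView outlet wired up in a storyboard, might look like this:

```swift
import UIKit
import ARKit

class ViewController: UIViewController {
    // Assumed to be connected to an ARSCNView in the storyboard.
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // The view already owns an ARSession; just run it with a configuration.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause tracking while the view is off screen.
        sceneView.session.pause()
    }
}
```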

To receive captured video frame images and tracking state from an AR session, implement the ARSessionDelegate protocol. In essence, it helps you keep track of the real world by handing you ARFrame objects, each encapsulating an image of the world plus tracking information, including ARAnchor objects, which represent positions and orientations in the real world. One built-in anchor type is ARPlaneAnchor, which you receive when the session detects a flat surface such as a table or the ground.
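A minimal sketch of such a delegate, assuming you run your own session (the SessionObserver name is mine, not Apple's):

```swift
import ARKit

// Sketch of an ARSessionDelegate; SessionObserver is a hypothetical class name.
class SessionObserver: NSObject, ARSessionDelegate {

    // Called for every captured video frame. The ARFrame bundles the camera
    // image, the tracked camera transform, and the current set of anchors.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let transform = frame.camera.transform // 4x4 pose of the device
        _ = transform // e.g. place virtual content relative to this pose
    }

    // Called when anchors are added, e.g. a newly detected horizontal plane.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors {
            print("Detected a plane, extent: \(plane.extent)")
        }
    }
}
```

Assign an instance to your session's delegate property before calling run, and keep a strong reference to it, since the session holds its delegate weakly.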

Read more by exploring the documentation.

High Performance Hardware

Because ARKit runs on the Apple A9 and A10 processors, you get breakthrough performance for fast scene understanding and for building detailed, compelling virtual content on top of real-world scenes.

How to get it? 

If you have a paid membership, go here and download Xcode 9, or request access to Swift Playgrounds 2, which supports Swift 4 and the iOS 11 SDK, including ARKit and access to the camera.
