ARKit and Surface Measurement

August 1, 2017


At WWDC 2017, Apple announced that developers would be able to access the new ARKit framework. The framework is designed to facilitate building augmented reality experiences on iOS devices. It utilizes Apple's A9 and A10 chips and will presumably work with the increasingly sophisticated sensors being developed for upcoming iPhone releases.

Digging into the documentation for the beta version of ARKit, Apple explains how to place objects within a 3D augmented reality context.
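As a rough sketch of what that placement looks like, the snippet below hit-tests a tapped screen point against a detected plane and drops a small sphere node at the intersection. The function name `placeMarker` and the assumption that a running `ARSCNView` is available are ours, not Apple's; the `hitTest(_:types:)` and `worldTransform` APIs are from the 2017-era ARKit beta.

```swift
import ARKit
import SceneKit

// Sketch only: assumes an ARSCNView (sceneView) is already running an
// ARWorldTrackingConfiguration with plane detection enabled.
// Hit-test a tapped point against a detected plane and drop a sphere there.
func placeMarker(at screenPoint: CGPoint, in sceneView: ARSCNView) -> SCNNode? {
    let results = sceneView.hitTest(screenPoint, types: .existingPlaneUsingExtent)
    guard let hit = results.first else { return nil }

    let sphere = SCNSphere(radius: 0.005)               // 5 mm marker
    sphere.firstMaterial?.diffuse.contents = UIColor.red
    let node = SCNNode(geometry: sphere)

    // The hit's worldTransform encodes the 3D pose; the last column
    // holds the x, y, z translation in meters.
    let t = hit.worldTransform.columns.3
    node.position = SCNVector3(t.x, t.y, t.z)
    sceneView.scene.rootNode.addChildNode(node)
    return node
}
```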

Once we place those objects or planes, we can read the x, y, z coordinates of each node we created. Then, with a little bit of vector math, we can compute the distance between two points on a two-dimensional plane.
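The vector math is just the Euclidean distance formula. A minimal, self-contained sketch (using a plain `Point3` struct in place of `SCNNode.position`, plus the meters-to-inches conversion used later in the demo):

```swift
import Foundation

// Stand-in for an ARKit node position: x, y, z in meters.
struct Point3 {
    let x: Double, y: Double, z: Double
}

// Euclidean distance between two points, in meters.
func distance(_ a: Point3, _ b: Point3) -> Double {
    let dx = b.x - a.x, dy = b.y - a.y, dz = b.z - a.z
    return (dx * dx + dy * dy + dz * dz).squareRoot()
}

// ARKit reports meters; convert for a tape-measure comparison.
func metersToInches(_ meters: Double) -> Double {
    return meters * 39.3701
}

let a = Point3(x: 0, y: 0, z: 0)
let b = Point3(x: 1.95, y: 0, z: 0)
print(metersToInches(distance(a, b)))   // ≈ 76.77 inches
```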

The Demo

A video of the demo we built is featured here. ARKit is calibrated in meters, so we need to convert that to inches. According to the app, the long side of the rug measured 1.95 meters, or 76.77 inches. The actual distance as measured by the tape measure is 77 inches. Being off by less than a quarter of an inch is shockingly accurate, and the small error could potentially be attributed to where I dropped the circles in the scene.

Looking forward to testing out additional ARKit functionality as it evolves.