iPhone X Depth Data Experiments

Made some updates to this post to reflect the release of the iPhone XS…

The iPhone features a front-facing “TrueDepth” camera utilizing an infrared emitter that projects over 30,000 dots. An infrared sensor reads the reflected dot pattern, allowing the system to extrapolate distance and therefore depth. This structured-light technique is a close cousin of time of flight (ToF), in which the time it takes light to bounce back is measured directly; ToF is the same phenomenon that allows laser measurement to work and bats to fly. iOS uses the data to accurately depth-map your face, allowing for a secure and accurate method of facial recognition. In addition, iOS utilizes TrueDepth data in portrait mode to generate an accurate depth of field.

By contrast, the rear-facing cameras do no active infrared depth sensing; dual-camera phones use another phenomenon, the “parallax effect,” between the two lenses to extrapolate depth of field. Single-lens phones like the iPhone XR go a step further, post-processing the image with machine learning algorithms to estimate a depth-of-field map purely from pixel data. So if you’re wondering why portrait mode selfies come out significantly better than rear-facing shots, it has a lot to do with the TrueDepth camera data. If you have a single-lens iPhone XR, you will also notice that the rear-facing camera in portrait mode will only let you take portraits of faces at fairly shallow angles; by contrast, the front-facing and dual cameras can take photos of inanimate objects as well as people.

Since the release of iOS 12, developers not only have access to ARKit, but we now also have access to pixel buffer data. This means we can visualize, manipulate, and use the raw depth data. I ran a few experiments to visualize some of the available data.
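As a rough sketch of what that access looks like, the disparity map embedded in a still portrait photo can be pulled out with the auxiliary image data APIs (the helper name and URL parameter below are illustrative, not from the experiments themselves):

import AVFoundation
import ImageIO

// Hypothetical helper: extract the disparity map that iOS embeds
// in portrait photos as auxiliary image data.
func depthData(fromPortraitPhotoAt url: URL) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any]
    else { return nil }
    return try? AVDepthData(fromDictionaryRepresentation: info)
}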

I am using SceneKit to visualize the point cloud data extracted from still portrait mode photos. Notice the subtle difference between the front-facing (TrueDepth) and rear-facing cameras. Surprisingly, the TrueDepth data seems to be less accurate than the dual camera data. This may have something to do with the way the point cloud data is being captured.
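The conversion from depth map to point cloud is conceptually simple: walk the buffer and emit one vertex per pixel. A minimal sketch, assuming a Float32 depth buffer (the z scale of 100 is an arbitrary value chosen only to spread the cloud out, not a figure from the experiments):

import SceneKit
import CoreVideo

// Build a SceneKit point cloud from a Float32 depth/disparity buffer.
// (Convert with AVDepthData.converting(toDepthDataType:) first if needed.)
func pointCloudGeometry(from depthMap: CVPixelBuffer) -> SCNGeometry {
    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
    let base = CVPixelBufferGetBaseAddress(depthMap)!

    var points: [SCNVector3] = []
    for y in 0..<height {
        let row = base.advanced(by: y * rowBytes).assumingMemoryBound(to: Float32.self)
        for x in 0..<width {
            let depth = row[x]
            guard depth.isFinite else { continue } // skip noisy holes
            // Negate y so the cloud isn't upside down; push z back by depth.
            points.append(SCNVector3(Float(x), Float(-y), -depth * 100))
        }
    }

    let source = SCNGeometrySource(vertices: points)
    let element = SCNGeometryElement(data: nil, primitiveType: .point,
                                     primitiveCount: points.count, bytesPerIndex: 0)
    return SCNGeometry(sources: [source], elements: [element])
}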

Here I am taking raw depth data from the video buffer and displaying it in a view as shades of gray. Raw depth data is a bit noisy, so you can see the black artifacts in the video. In this instance you can clearly see that the depth data output from the TrueDepth sensor is superior to that of the dual camera setup.
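Rendering the buffer as grayscale is nearly free, since Core Image treats a single-channel float buffer as shades of gray. A minimal sketch, assuming an AVDepthData instance from the capture callback:

import AVFoundation
import CoreImage

// Render a depth/disparity buffer as a grayscale image. Invalid (NaN)
// pixels are what show up as the black artifacts mentioned above.
func grayscaleImage(from depthData: AVDepthData) -> CIImage {
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DisparityFloat32)
    return CIImage(cvPixelBuffer: converted.depthDataMap)
}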

Finally, we can use the built-in depth filtering functions in AVFoundation to smooth out and filter noise. With the depth filter on, we can capture a fairly accurate depth map of the scene. Additionally, we can apply a histogram to the output to calibrate our depth map.
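The filtering itself is a one-liner on the capture side; the histogram step can be approximated with a simple min/max stretch. A sketch under those assumptions (session setup omitted, and the normalize helper is a stand-in for a full histogram calibration):

import AVFoundation

// Enable the built-in depth filter, which interpolates noisy or
// missing pixels before the data reaches your delegate.
let depthOutput = AVCaptureDepthDataOutput()
depthOutput.isFilteringEnabled = true
// session.addOutput(depthOutput) once inputs are configured

// Stand-in for the histogram step: stretch depth values to 0...1.
func normalize(_ values: [Float]) -> [Float] {
    guard let lo = values.min(), let hi = values.max(), hi > lo else { return values }
    return values.map { ($0 - lo) / (hi - lo) }
}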

AR.js

Mobile web-based AR is here! Well, kinda… A working demo of web-based augmented reality using iOS 11’s WebRTC-enabled browsers.

AR.js is a solution for efficiently doing augmented reality on the web, available on GitHub. Let’s take a detailed look at what it is:

  • Very fast: it runs efficiently even on mobile phones
  • Web-based: a pure web solution, so no installation is required; full JavaScript, based on three.js + jsartoolkit5
  • Open source: completely open source and free of charge
  • Standards: it works on any phone with WebGL and WebRTC

The goal is to make it easy for people to do augmented reality: AR that can be easily experienced on today’s phones and easily designed using web technology. The AR.js project is about enabling exactly that. Now anybody with a modern phone can enjoy open-source AR, free of charge, cross-platform, and without installation.

ARKit and Surface Measurement

ARKit

At WWDC 2017, Apple announced that developers would be able to access features of the new ARKit framework. The framework is designed to facilitate building augmented reality experiences on iOS devices. It utilizes Apple’s A9 and A10 chips and will presumably work with the increasingly sophisticated sensors being developed for upcoming iPhone releases.

Digging into some of the documentation around the beta version of ARKit, Apple provides guidance on how to place objects within a 3D augmented reality context.

Once we place those objects or planes, we can get the x, y, z coordinates of each node we created. Then, with a little bit of vector math, we can compute the distance between the two points on a two-dimensional plane.
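A minimal sketch of that vector math, assuming two nodes have already been placed (the function names are illustrative); since ARKit works in meters, a conversion helper for the readout is included as well:

import SceneKit

// Straight-line distance between two placed nodes, in meters.
func distance(from a: SCNNode, to b: SCNNode) -> Float {
    let dx = b.position.x - a.position.x
    let dy = b.position.y - a.position.y
    let dz = b.position.z - a.position.z
    return sqrtf(dx * dx + dy * dy + dz * dz)
}

// ARKit is calibrated in meters; 1 meter = 39.3701 inches.
func metersToInches(_ meters: Float) -> Float {
    return meters * 39.3701
}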

The Demo

A video of the demo we built is featured here. ARKit is calibrated in meters, so we need to convert to inches. According to the app, the long side of the rug measured 1.95 meters, or 76.77 inches. The actual distance as measured by the tape measure is 77 inches. Being off by less than a quarter of an inch is shockingly accurate, and the difference could potentially be attributed to where I dropped the circles in the scene.

Looking forward to testing out additional ARKit functionality as it evolves.

User Experience Vetting in Advertising and Technology

The term UX in its current incarnation was coined by Don Norman, author of “The Design of Everyday Things,” in the mid-1990s during his time at Apple, where his job title, “User Experience Architect,” marked the first professional use of the term. In 2007, the release of the first iPhone and its revolutionary capacitive touchscreen made the physical keyboards of other phones obsolete and provided a user experience far superior to that of any contemporary phone. This inadvertently led to the current business focus on user experience, and today it has become one of the fastest growing professional segments within advertising and technology. One of the dangers that comes with such rapid adoption is that the vetting process for what makes a good UX professional is not clearly understood. The discipline of user experience design goes far beyond proficiency in wireframing software; the individuals who want to craft the user’s experience also need a breadth of aesthetic and technical knowledge to truly make informed decisions.


Unity + Kinect + Oculus DK2 = Real VR

We have had the Oculus DK2 for a few months now and have been messing around with it a bit in Unity 5. Unity 5 added support for the Rift and makes it pretty simple to take any Unity project and render it stereoscopically with accelerometer control. To enable it, go to File > Build Settings > Player Settings and look in the settings box on the right. You will see two checkboxes, “Stereoscopic Rendering” and “Virtual Reality Supported”; make sure they are selected. Assuming the DK2 is installed correctly, everything should work.

For our experiments we were interested in creating a true virtual reality environment. By combining the new Microsoft Kinect sensor (we could unfortunately only get it to work on PC), OpenNI, Unity, and the Oculus, we were able to create a Unity simulation where the skeleton of the character mimics the skeleton of the user, allowing them to interact kinematically with physically simulated objects. In addition, by adding skeletal gesture recognition to “look at” controls, we can create a virtually interactive UI much like the ever-popular “Minority Report” interface.

What is Localist ?

Localist started as an idea that a few friends and I had while sitting in a coffee shop one day. The place was playing Spotify, and it was a band we had never heard of. My friend Sean remarked that there should be a way to listen to bands that are playing in your area so that people can be exposed to new music. After some brief research into the various APIs, their limitations, and what information was available, we decided it was a good enough idea for each of us to dedicate 10 to 20 hours a week.

Over the next three months, using Express and Mongo on the backend and Handlebars with Bootstrap and a bunch of custom CSS on the front end, we built a product that aggregated shows from Jambase, collected band and genre information from Echonest, and created Spotify playlists based on all the information collected. A playlist is automatically generated on a weekly basis for each venue and for each genre; people can also log in with their Spotify credentials and create custom playlists based on venues they follow, genres they like, or distance from where they live.

After three months and some serious code sprints we were ready to launch and collect some serious venture capital money, when we got the bad news: Spotify decided to create the same service themselves and release it. Unfortunately for us, Spotify never made mention of working on this on their blog. I suppose it’s the inherent risk of creating a service that is based on someone else’s technology. We have kept the site up, although the auto-creation of new events has been disabled to save bandwidth. Take a look.

Projection Mapping On 3D Models

A little experiment in Unity: taking a model of a face and triggering different morph shapes. The entire thing is then projected onto a low-poly 3D model I created and printed out with the MakerBot. The result looks pretty good; I just have to remove all of the rotation and tilt from the morph shapes (movement of the head ruins the effect).

Rube Goldberg Snack Machine

A lot of work, particularly experiential work, never makes it through the conceptual or prototype stage to see the light of day. A lot of what we do day to day is concept and feasibility studies, along with some prototypes as proof of concept. Occasionally, when the client has the budget and the vision, the project becomes a reality, much like the drinkable billboard in the Coke Zero campaign. I wanted to share some concept art I did for a recent pitch because, although it did not become a reality, it was a really exciting idea. The concept was a Rube Goldberg inspired Chex Mix machine that would allow users to customize their own Chex Mix and watch as the ingredients combine in a playful and complex manner.

Installing MongoDB and Node on Raspberry PI

There are numerous reasons one would want to run Node and Mongo on a Pi: a mini web server, home automation control, etc. With the improved specification of the Raspberry Pi 2, the argument becomes even more compelling. There are plenty of blog postings about how to install different versions of Node and Mongo on the Raspberry Pi, but I have found one combination to be the best: RickP’s mongopi and Node version 0.10.42. The process is a bit more involved than using apt-get, but in the end it’s a proven, stable combo that works on the Pi (in my opinion). Here are the instructions.

First, install all the dependencies you will need to build everything:
sudo apt-get update
sudo apt-get install git-core scons build-essential libpcre++-dev libboost-dev libboost-program-options-dev libboost-thread-dev libboost-filesystem-dev
wget http://nodejs.org/dist/v0.10.42/node-v0.10.42.tar.gz
tar xvf node-v0.10.42.tar.gz
cd node-v0.10.42
./configure
make
sudo make install
Now grab a cup of coffee, because this will take a while. Once that is done you can install Mongo:

git clone git://github.com/RickP/mongopi.git
cd mongopi
scons
sudo scons --prefix=/opt/mongo install
scons -c

If you thought Node took a while, buckle in; you may want to issue this command before you go out for the day so that it’s done by the time you get back.

Add Mongo to your path:

PATH=$PATH:/opt/mongo/bin
export PATH

At this point you have a choice: you can run Mongo as root, or create a new user. If you are the only one messing with the Pi, just do it as root (pi@raspberrypi); if not, add a new user:

sudo useradd mongodb

sudo mkdir /var/lib/mongodb
sudo chown mongodb:mongodb /var/lib/mongodb

Now set the paths and start Mongo:

sudo mkdir /etc/mongodb/

sudo sh -c 'echo "dbpath=/var/lib/mongodb" > /etc/mongodb/mongodb.conf'

cd /etc/init.d
sudo wget -O mongodb https://gist.github.com/ni-c/fd4df404bda6e87fb718/raw/36d45897cd943fbd6d071c096eb4b71b37d0fcbb/mongodb.sh

sudo chmod +x mongodb
sudo update-rc.d mongodb defaults

sudo service mongodb start

Boom, you’re ready to start developing your Node app on your Raspberry Pi.

Vertical Gardens

For the local community garden (of which I am a member), horizontal space is severely limited, so I was asked to design and build a series of vertical gardens that would be modular, lightweight, low cost, and eco friendly. To complicate matters, we are not allowed to set any permanent concrete footings in the garden. The design I came up with created 40′ worth of vertical garden using wooden L shapes that are tied to the ground with 3′ long sections of rebar. The bottom portion is a large planter that also acts as a counterweight to keep the structure upright, and the top portion can be configured as a vertical growing surface or a multi-shelf area that can house individual round pots or long rectangular pots. I’ll be uploading the plans and cut list soon in case anyone wants to build one.