ARKit Remote: Now With Face Tracking!
By Jimmy Alamparambil, Unity Blog
At Unity, we strive to make your job as a developer easier and more efficient. Since the release of Apple's ARKit in mid-2017, we have been working hard to streamline AR development for ARKit with our ARKit plugin and the ARKit Remote. The ARKit Remote allows developers to iterate on ARKit experiences right inside the Unity Editor, without building to the device each time. Today we are happy to announce that you can access ARKit Remote functionality for Face Tracking on iPhone X by downloading or updating the ARKit plugin for Unity.
Build ARKit Remote
To use ARKit Remote for Face Tracking, you will first need to build the ARKit Remote scene as an app to your iPhone X. You will need an iPhone X because it is currently the only device with the front-facing TrueDepth camera required for Face Tracking. Follow these steps to build the app to the device:
1. Open the "Assets/UnityARKitPlugin/ARKitRemote/UnityARKitRemote" scene.
2. Select the "Assets/UnityARKitPlugin/Resources/UnityARKitPlugin/ARKitSettings" file and enable the "ARKit Uses Facetracking" checkbox.
3. Open PlayerSettings (in the menu: Edit/Project Settings/Player) and make sure you have some text in the "Camera Usage Description" entry.
4. Open BuildSettings (in the menu: File/Build Settings...) and check the Development Build checkbox.
5. Build this scene to your iPhone X via Xcode, as you would normally build an app.
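Because Face Tracking depends on the TrueDepth camera, it can help to guard against unsupported devices in code. Here is a minimal sketch using the plugin's `ARKitFaceTrackingConfiguration` struct; the `IsSupported` property follows the pattern used by the plugin's other session configurations and is an assumption here:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS; // Unity ARKit plugin namespace

public class FaceTrackingSupportCheck : MonoBehaviour
{
    void Start()
    {
        // IsSupported is expected to be true only on devices with a
        // TrueDepth camera (iPhone X at the time of writing).
        ARKitFaceTrackingConfiguration config = new ARKitFaceTrackingConfiguration();
        if (config.IsSupported)
        {
            Debug.Log("Face Tracking is supported on this device.");
        }
        else
        {
            Debug.Log("Face Tracking requires the front-facing TrueDepth camera.");
        }
    }
}
```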
Here's a video of the steps needed for building the ARKit Remote.
Connect Editor to ARKit Remote
The steps in the previous section need only be done once to build ARKit Remote to your device. The following steps can be used over and over again to iterate on the ARKit Face Tracking in the editor:
1. Connect the iPhone X to your Mac development machine via USB.
2. Start up the ARKit Remote app on the device. You should see a "Waiting for connection.." screen.
3. In the Unity Editor, connect to your iPhone X by opening the Console window and selecting the iPhone X connected via USB from its connected-player menu.
4. Load up one of the Face Tracking examples in the project, e.g. "Assets/UnityARKitPlugin/Examples/FaceTracking/FaceAnchorScene", and press Play in the Editor.
5. You should see a green screen with a button on top that says "Start ARKit Face Tracking Session." Press it, and your front camera video feed should appear in the Editor's Game window. With your face in view, the device will also send ARKit Face Tracking data to the Editor.
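Once the remote session is running, face tracking data arrives through the same plugin events your scene would receive when running on the device. A sketch of subscribing to them, assuming the static face anchor events on `UnityARSessionNativeInterface` as used in the plugin's FaceTracking examples:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

public class FaceAnchorLogger : MonoBehaviour
{
    void OnEnable()
    {
        // The plugin raises these events for face anchors, whether the data
        // comes from the device directly or is relayed via ARKit Remote.
        UnityARSessionNativeInterface.ARFaceAnchorAddedEvent += OnFaceAdded;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += OnFaceUpdated;
    }

    void OnDisable()
    {
        UnityARSessionNativeInterface.ARFaceAnchorAddedEvent -= OnFaceAdded;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= OnFaceUpdated;
    }

    void OnFaceAdded(ARFaceAnchor anchor)
    {
        Debug.Log("Face anchor added.");
    }

    void OnFaceUpdated(ARFaceAnchor anchor)
    {
        // anchor carries the face pose, face mesh geometry, and
        // blendshape coefficients on each update.
        Debug.Log("Face anchor updated.");
    }
}
```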
Here is a video that demonstrates the connection steps:
Play with ARKit Face Tracking Data
Once you have connected your ARKit Face Tracking scene to ARKit Remote, all the Face Tracking data (face anchor, face mesh, blendshapes, directional lighting) is sent from device to Editor. You can then manipulate that data in the Editor to affect the scene immediately. Here are a couple of videos to demonstrate this:
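For example, the blendshape coefficients arrive as a string-to-float dictionary, with each coefficient ranging from 0 (neutral) to 1 (fully expressed). The sketch below reads the jaw-open coefficient on each anchor update; the key name "jawOpen" mirrors ARKit's ARBlendShapeLocationJawOpen and is an assumption here:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

public class JawOpenReader : MonoBehaviour
{
    // Latest jaw-open coefficient: 0 (closed) to 1 (fully open).
    public float jawOpen;

    void OnEnable()
    {
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += OnFaceUpdated;
    }

    void OnDisable()
    {
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= OnFaceUpdated;
    }

    void OnFaceUpdated(ARFaceAnchor anchor)
    {
        // blendShapes maps ARKit blendshape names to coefficients;
        // "jawOpen" is assumed to match ARKit's key naming.
        float value;
        if (anchor.blendShapes.TryGetValue("jawOpen", out value))
        {
            jawOpen = value;
        }
    }
}
```

A value like this can then drive anything in your scene (a character's jaw bone, a material parameter, an animation blend) while you iterate in the Editor.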
New, Streamlined ARKit Remote Workflow!
As part of adding Face Tracking functionality to the ARKit Remote, we also made it much easier to work with ARKit Remote without altering your original ARKit scene in the Unity Editor. Previously, you had to add a special GameObject to your scene to connect it to the ARKit Remote. Now, the plugin detects when you initialize an ARKit configuration from the Editor and automatically adds the RemoteConnection GameObject to your scene at runtime.
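In practice this means an ordinary session start is all your scene needs; when pressed Play in the Editor, the same call is expected to set up the remote connection for you. A sketch modeled on the plugin's FaceTracking example (the configuration fields shown are assumptions based on that example):

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

public class StartFaceTrackingSession : MonoBehaviour
{
    void Start()
    {
        ARKitFaceTrackingConfiguration config = new ARKitFaceTrackingConfiguration();
        config.alignment = UnityARAlignment.UnityARAlignmentGravity;
        config.enableLightEstimation = true;

        if (config.IsSupported)
        {
            // On device this starts the session directly; in the Editor the
            // plugin now adds the RemoteConnection GameObject automatically,
            // so no extra setup object is needed in the scene.
            UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfig(config);
        }
    }
}
```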
Have fun playing around with ARKit Face Tracking in the Unity Editor!