AR Face Tracking Tutorial for iOS: Getting Started | Ray Wenderlich

In this tutorial, you’ll learn how to use AR Face Tracking to track your face using a TrueDepth camera, overlay emoji on your tracked face, and manipulate the emoji based on facial expressions you make.

This is a companion discussion topic for the original entry at

Very useful and easy to follow tutorial.
I was wondering if there is a list of ARFaceGeometry feature indices available somewhere, as I would like to play with more than just the nose, mouth, and eyes.

Hi @kashif_izhar

I haven’t found a list anywhere. To figure out the points I used in the tutorial, I added a tap gesture to the view controller and cycled through the vertex indices on each tap. At each index, I placed an emoji at that point so I could see it visually, and printed out the index of the point.
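The trick above can be sketched roughly like this; the class and property names here are illustrative, not from the tutorial's source, but `ARFaceAnchor.geometry.vertices` is the real ARKit API that exposes the face mesh points:

```swift
import ARKit
import SceneKit

/// Steps through the ARFaceGeometry vertices one tap at a time,
/// moving a small marker node so you can see which index is which.
class FaceIndexExplorer {
    private var currentIndex = 0
    let markerNode = SCNNode(geometry: SCNSphere(radius: 0.002))

    /// Call from a UITapGestureRecognizer handler, passing the current face anchor.
    func stepToNextVertex(on faceAnchor: ARFaceAnchor) {
        let vertices = faceAnchor.geometry.vertices
        currentIndex = (currentIndex + 1) % vertices.count
        let v = vertices[currentIndex]
        markerNode.position = SCNVector3(v.x, v.y, v.z)
        // Note the printed index once the marker lands on the feature you want.
        print("Vertex index:", currentIndex)
    }
}
```

Attach `markerNode` as a child of the face node so its position is in face-anchor coordinates, then tap until the marker sits on the feature you care about.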

I’m getting an error:
guard let device = sceneView.device else { return nil } => Value of type ‘ARSCNView?’ has no member ‘device’
How do I solve it?

Hi @istart

I think that’s happening because you’re trying to run the app in the simulator. The app needs to be run on a device. AR face tracking will not work in the simulator.
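A simple way to guard against this, assuming you do the check in `viewDidLoad` (the placement is an assumption, but `ARFaceTrackingConfiguration.isSupported` is the real ARKit API for this):

```swift
import ARKit

// Face tracking needs a TrueDepth camera; this is false in the simulator
// and on devices without one.
guard ARFaceTrackingConfiguration.isSupported else {
    fatalError("Face tracking requires a device with a TrueDepth front camera.")
}
```

Checking this up front gives a clear failure message instead of a confusing compile-time or runtime error later.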

Can I run this application on my iPhone 7, or do we need an iPhone X?

@loman78 You can only run the app on an iPhone X at this point.

Thank you a lot for your great tutorial. It was really helpful for me. By the way, don’t we need to calibrate for each person’s face for accuracy? When I checked another document, there was a calibration function in the Unreal ARKit facial tracking example. Do you have any idea? Thanks in advance.

@toojuice Can you please help with this when you get a chance? Thank you - much appreciated! :]

@venister I don’t know of any manual calibration that needs to be done. I assume ARKit is doing some automatic calibration behind the scenes.

Thank you so much for your reply. I also hope that ARKit is doing the job automatically.

This tutorial is more than six months old so questions are no longer supported at the moment for it. Thank you!