In this tutorial, you’ll learn how to use AR Face Tracking to track your face using a TrueDepth camera, overlay emoji on your tracked face, and manipulate the emoji based on facial expressions you make.
A very useful and easy-to-follow tutorial.
I was wondering if there is a list of ARFaceGeometry feature indices available somewhere, as I would like to play with more than just the nose, mouth, and eyes.
I haven’t found a list anywhere. To figure out the points I used in the tutorial, I added a tap gesture to the view controller and cycled through the points on each tap: I had it add an emoji at the current point so I could see it visually, and printed out the index of the point.
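A minimal sketch of that index-hunting trick. This is not the tutorial's exact code: the `FaceIndexExplorer` class and the sphere marker are illustrative (the tutorial uses an `EmojiNode` instead), and you would call `handleTap` from your own tap-gesture handler with the current `ARFaceAnchor`.

```swift
import ARKit
import SceneKit

// Debugging helper: each tap moves a small marker to the next
// ARFaceGeometry vertex and prints its index, so you can visually
// discover which index corresponds to which facial feature.
class FaceIndexExplorer {
    var currentIndex = 0
    // A tiny sphere is easy to spot against the face mesh.
    let markerNode = SCNNode(geometry: SCNSphere(radius: 0.002))

    // Call this from your tap-gesture handler, passing the face anchor
    // from the current ARSession frame.
    func handleTap(faceAnchor: ARFaceAnchor) {
        let vertices = faceAnchor.geometry.vertices
        guard !vertices.isEmpty else { return }

        // Place the marker at the current vertex (coordinates are in
        // the face anchor's local space, so add markerNode as a child
        // of the face node).
        let vertex = vertices[currentIndex]
        markerNode.position = SCNVector3(vertex.x, vertex.y, vertex.z)
        print("ARFaceGeometry vertex index: \(currentIndex)")

        // Cycle to the next vertex on the next tap.
        currentIndex = (currentIndex + 1) % vertices.count
    }
}
```

Since the mesh has over a thousand vertices, you may want to step in larger increments once you know the rough region you care about.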
Hi,
I’m getting an error:
guard let device = sceneView.device else { return nil } => Value of type ‘ARSCNView?’ has no member ‘device’
How do I solve it?
I think that’s happening because you’re trying to run the app in the simulator. The app needs to run on a real device; AR face tracking will not work in the simulator.
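As a defensive measure, you can also check for face-tracking support at runtime before starting the session. This is a sketch assuming the tutorial's `sceneView` outlet; the error message wording is illustrative.

```swift
import ARKit

// Face tracking requires a TrueDepth camera, so bail out gracefully on
// unsupported hardware (including the simulator) before running the session.
guard ARFaceTrackingConfiguration.isSupported else {
    fatalError("Face tracking requires a device with a TrueDepth camera.")
}
let configuration = ARFaceTrackingConfiguration()
sceneView.session.run(configuration)
```

In a shipping app you would show an alert instead of calling `fatalError`, but the `isSupported` check is the key part.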
Thank you for your great tutorial; it was really helpful for me. By the way, don’t we need to calibrate for each person to get good accuracy? In another document I saw a calibration function in Unreal Engine’s ARKit facial tracking example. Do you have any ideas? Thanks in advance.