How to create a Face Tracker AR application in Xcode?

Good day, everyone. I'll be talking about creating an Augmented Reality app from the ground up that watches your face and reports your expression based on what you do in front of the camera.

AR Application for Face Tracking

So, presuming you're familiar with Xcode, we'll start by opening Xcode, clicking New Project, and choosing the Augmented Reality App template.

This gives us some pre-built code that helps us get an AR app running quickly. Your project will now contain quite a few files, but don't worry, we won't need to change most of them. Let's start with our storyboard by selecting the Main.storyboard file in the project.

In that file you can see an iPhone with a blank view. We want to add a label that will reflect the expression on our face. To do so, click the "+" symbol at the top right, pick Label, and drag and drop it onto the phone screen.
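By the way, if you'd rather skip the storyboard and set the label up in code, a rough equivalent looks like the sketch below. This is just an illustration of the same step, not part of the tutorial's storyboard flow:

// Purely illustrative: a programmatic version of the storyboard label.
// Put this in viewDidLoad if you go this route.
let label = UILabel()
label.translatesAutoresizingMaskIntoConstraints = false
label.textAlignment = .center
view.addSubview(label)
NSLayoutConstraint.activate([
    label.centerXAnchor.constraint(equalTo: view.centerXAnchor),
    label.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor, constant: -20)
])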

Now that we've added a label to our screen, we need to reference it in our ViewController.swift file. Open ViewController.swift, add the code below, and connect the outlet to the label in the storyboard.

@IBOutlet var label: UILabel!   // shows the detected expression
var action = ""                 // the current expression text
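For context, the top of ViewController.swift should now look roughly like this. I'm assuming you started from Xcode's Augmented Reality App template, which already declares the sceneView outlet; if not, add and connect it yourself:

import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!   // comes with the AR template
    @IBOutlet var label: UILabel!         // the label we just added
    var action = ""                       // expression text shown to the user

    // the lifecycle and delegate methods below go inside this class
}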

You'll find a viewDidLoad function in that file; it's the first of our methods to run, called once when the view controller's view is loaded into memory. Add the code below to viewDidLoad, and make your class conform to ARSCNViewDelegate.

override func viewDidLoad() {
    super.viewDidLoad()

    // Receive delegate callbacks from the AR scene view
    sceneView.delegate = self

    // Show FPS and timing information at the bottom of the view
    sceneView.showsStatistics = true

    // Face tracking needs a TrueDepth camera; stop if it's unavailable
    guard ARFaceTrackingConfiguration.isSupported else {
        fatalError("Face tracking is not supported on this device")
    }
}
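One practical note: the session needs camera access, so make sure your Info.plist contains an NSCameraUsageDescription entry. And if you'd rather not crash on unsupported devices, here's a gentler alternative to fatalError (just a sketch; the tutorial code above keeps the hard stop):

// Sketch: warn the user instead of crashing when face tracking is unavailable.
guard ARFaceTrackingConfiguration.isSupported else {
    let alert = UIAlertController(
        title: "Unsupported device",
        message: "Face tracking requires a device with a TrueDepth camera.",
        preferredStyle: .alert)
    alert.addAction(UIAlertAction(title: "OK", style: .default))
    present(alert, animated: true)
    return
}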

Now that the configuration is complete, we'll add the viewWillAppear method, which runs every time the view is about to appear, meaning every time you navigate back to this screen. The code for viewWillAppear is as follows.

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    // Create a session configuration
    let configuration = ARFaceTrackingConfiguration()

    // Run the view's session
    sceneView.session.run(configuration)
}

We also need to do something when the view disappears, so here is the code for the viewWillDisappear function.

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)

    // Pause the view's session
    sceneView.session.pause()
}

Next, we'll write a new function to check our expressions. Conveniently, ARKit ships with a set of pre-built blend shapes that describe facial movements, and all we have to do is set our action text when those expressions occur, so let's get started.

func expression(anchor: ARFaceAnchor) {
    // Pull out the blend-shape coefficients we care about
    let mouthSmileLeft = anchor.blendShapes[.mouthSmileLeft]
    let mouthSmileRight = anchor.blendShapes[.mouthSmileRight]
    let cheekPuff = anchor.blendShapes[.cheekPuff]
    let tongueOut = anchor.blendShapes[.tongueOut]
    let jawLeft = anchor.blendShapes[.jawLeft]
    let eyeSquintLeft = anchor.blendShapes[.eyeSquintLeft]
}

These are some of ARKit's pre-built expressions. What should happen when someone makes one of them? Let's add the code for that in the same function.

self.action = "Waiting..." if ((mouthSmileLeft?.decimalValue ?? 0.0) + (mouthSmileRight?.decimalValue ?? 0.0)) > 0.9 { self.action = "You are smiling. " } if cheekPuff?.decimalValue ?? 0.0 > 0.1 { self.action = "Your cheeks are puffed. " } if tongueOut?.decimalValue ?? 0.0 > 0.1 { self.action = "Don't stick your tongue out! " } if jawLeft?.decimalValue ?? 0.0 > 0.1 { self.action = "You mouth is weird!" } if eyeSquintLeft?.decimalValue ?? 0.0 > 0.1 { self.action = "Are you flirting?" }

The last thing we need to do is detect a face and decide what happens when one appears in front of the camera, so we'll add two renderer delegate methods that call the expression function we wrote earlier.

func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    // Create a wireframe mesh to overlay on the detected face
    let faceMesh = ARSCNFaceGeometry(device: sceneView.device!)
    let node = SCNNode(geometry: faceMesh)
    node.geometry?.firstMaterial?.fillMode = .lines
    return node
}

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    // Keep the mesh in sync with the face and re-check the expression
    if let faceAnchor = anchor as? ARFaceAnchor,
       let faceGeometry = node.geometry as? ARSCNFaceGeometry {
        faceGeometry.update(from: faceAnchor.geometry)
        expression(anchor: faceAnchor)

        // UI updates must happen on the main thread
        DispatchQueue.main.async {
            self.label.text = self.action
        }
    }
}
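If you want the app to literally say the expression out loud rather than just show it on the label, a minimal sketch with AVSpeechSynthesizer could look like this (my own optional extra, not part of the tutorial's code):

import AVFoundation

// Sketch: speak the detected expression aloud.
let synthesizer = AVSpeechSynthesizer()   // keep this as a property so it isn't deallocated mid-speech

func speak(_ text: String) {
    guard !synthesizer.isSpeaking else { return }   // avoid piling up utterances
    synthesizer.speak(AVSpeechUtterance(string: text))
}

// Then call speak(self.action) from the DispatchQueue.main.async block above.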

That's it, your augmented reality face-tracking app is complete. Here's a link to the full code, which you can inspect and play with as you see fit. Github Repo