Instructions

For best results, read this article together with the readme.md file in the annotated version of the code repository.

ARKit series of articles directory

Official Code address

At its 2017 conference, Apple demonstrated an ARKit demo called InteractiveContentwithARKit. That's right: the chameleon!

This example demonstrates the following concepts:

  • How to place an interactive CG object (a chameleon) and interact with it.
  • How to trigger and control animations of objects based on the user’s movement and approach.
  • How to use shaders to adjust the appearance of virtual objects.

The whole project is very simple, with only a few main files.

Extensions.swift is just a small utility class, and ViewController.swift contains only a few tap handlers and the render loop. Which functions exist and when they are called is also described in the readme.md file.

We’ll focus on a few interesting method implementations in Chameleon.swift.

preloadAnimations(): loading and playing animations

Animation playback is very simple, just find the node and add animation:

// anim is an SCNAnimation
contentRootNode.childNodes[0].addAnimation(anim, forKey: anim.keyPath)

So where does the animation come from? It is loaded by name from a .dae file. A .dae file is a scene file, i.e. an SCNScene; its rootNode is traversed to find the corresponding animation player by its animation key.

static func fromFile(named name: String, inDirectory: String) -> SCNAnimation? {
    let animScene = SCNScene(named: name, inDirectory: inDirectory)
    var animation: SCNAnimation?
    // Iterate over the child nodes
    animScene?.rootNode.enumerateChildNodes({ (child, stop) in
        if !child.animationKeys.isEmpty {
            // Find the corresponding player according to the key
            let player = child.animationPlayer(forKey: child.animationKeys[0])
            animation = player?.animation
            stop.initialize(to: true)
        }
    })
    animation?.keyPath = name
    return animation
}

Is that it? It's not that simple. In the turn animation, the SCNAnimation only gives the chameleon the visual turning motion; the node's transform doesn't actually change. So playTurnAnimation(_ animation: SCNAnimation) also uses an SCNTransaction to actually change the node's transform.
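A minimal sketch of that pattern (the names and angle handling here are illustrative, not the sample's exact implementation): add the SCNAnimation for the visual turn, then commit the matching rotation to the node's model transform inside an SCNTransaction, so the node really ends up turned when the animation finishes.

```swift
import SceneKit

// Hypothetical sketch: play a turn animation AND update the node's real transform.
// Without the transaction, the node would snap back once the animation ends.
func playTurn(_ animation: SCNAnimation, on node: SCNNode, byRadians angle: Float) {
    // 1. The animation only changes what is rendered, not the model transform
    node.addAnimation(animation, forKey: animation.keyPath)

    // 2. Commit the real rotation over the same duration, so the model
    //    transform and the presentation agree when the animation finishes
    SCNTransaction.begin()
    SCNTransaction.animationDuration = animation.duration
    node.simdEulerAngles.y += angle
    SCNTransaction.commit()
}
```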

relativePositionToHead(pointOfViewPosition: simd_float3): finding the angle between head and camera

In this method, the interesting part is the angle between the head and the camera:

// Convert the camera viewpoint coordinates from the world coordinate to the head coordinate
let cameraPosLocal = head.simdConvertPosition(pointOfViewPosition, from: nil)
// The projection of the camera viewpoint coordinates on the plane of 'head' (y value is equal to the y value of 'head')
let cameraPosLocalComponentX = simd_float3(cameraPosLocal.x, head.position.y, cameraPosLocal.z)
let dist = simd_length(cameraPosLocal - head.simdPosition)

// Use inverse trigonometric functions to compute the angles, then convert from radians to degrees
let xAngle = acos(simd_dot(simd_normalize(head.simdPosition), simd_normalize(cameraPosLocalComponentX))) * 180 / Float.pi
let yAngle = asin(cameraPosLocal.y / dist) * 180 / Float.pi

let selfToUserDistance = simd_length(pointOfViewPosition - jaw.simdWorldPosition)

// Then, according to the included Angle and distance, determine the animation to play in other functions
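The two formulas above can be checked in plain Swift. This sketch uses hand-rolled 3-vector helpers instead of simd so it is self-contained, and the positions are made-up illustrative values:

```swift
import Foundation

// Minimal stand-ins for the simd calls used above
typealias Vec3 = (x: Float, y: Float, z: Float)

func dot(_ a: Vec3, _ b: Vec3) -> Float { a.x * b.x + a.y * b.y + a.z * b.z }
func length(_ v: Vec3) -> Float { sqrt(dot(v, v)) }
func normalize(_ v: Vec3) -> Vec3 {
    let l = length(v)
    return (v.x / l, v.y / l, v.z / l)
}

// Camera position already converted into the head's local space (illustrative)
let cameraPosLocal: Vec3 = (1, 1, 0)
let headPosition: Vec3 = (1, 0, 0)   // illustrative head position

// Project the camera position onto the head's horizontal plane
let componentX: Vec3 = (cameraPosLocal.x, headPosition.y, cameraPosLocal.z)
let dist = length((cameraPosLocal.x - headPosition.x,
                   cameraPosLocal.y - headPosition.y,
                   cameraPosLocal.z - headPosition.z))

// Same acos/asin formulas as above, converted from radians to degrees
let xAngle = acos(dot(normalize(headPosition), normalize(componentX))) * 180 / Float.pi
let yAngle = asin(cameraPosLocal.y / dist) * 180 / Float.pi

print(xAngle, yAngle) // camera directly ahead horizontally, 45° up in this setup? No: here it is 0° and 90°
```

With the camera straight above the head's horizontal direction, xAngle comes out 0° and yAngle 90°, matching the geometric intuition.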

openCloseMouthAndShootTongue(): triggering other events during an animation

At its core this is a simple CAKeyframeAnimation that rotates the jaw to open the mouth.

// Rotate about the X-axis
let animation = CAKeyframeAnimation(keyPath: "eulerAngles.x")
animation.duration = 4.0
animation.keyTimes = [0.0, 0.05, 0.75, 1.0]
animation.values = [0, -0.4, -0.4, 0]
animation.timingFunctions = [
CAMediaTimingFunction(name: kCAMediaTimingFunctionEaseOut),
CAMediaTimingFunction(name: kCAMediaTimingFunctionLinear),
CAMediaTimingFunction(name: kCAMediaTimingFunctionEaseInEaseOut)
	]
// What is this? Closure callbacks that fire when the animation reaches given progress points
animation.animationEvents = [startShootEvent, endShootEvent, mouthClosedEvent]

mouthAnimationState = .mouthMoving
// Add the animation and play it
jaw.addAnimation(animation, forKey: "open close mouth")

But the tongue animation needs to fire after the mouth opens, so animationEvents are added to fire different callbacks at different keyTimes:

let startShootEvent = SCNAnimationEvent(keyTime: 0.07) { (_, _, _) in
	self.mouthAnimationState = .shootingTongue
}
let endShootEvent = SCNAnimationEvent(keyTime: 0.65) { (_, _, _) in
	self.mouthAnimationState = .pullingBackTongue
}
let mouthClosedEvent = SCNAnimationEvent(keyTime: 0.99) { (_, _, _) in
	self.mouthAnimationState = .mouthClosed
	self.readyToShootCounter = -100
}

After self.mouthAnimationState changes to .shootingTongue, reactToDidApplyConstraints(in sceneView: ARSCNView) is called on every frame; it in turn calls updateTongue(forTarget target: simd_float3), which starts moving the tongueTip node:

currentTonguePosition = startPos + intermediatePos
// Convert `currentTonguePosition` from world space into the presentation space of tongueTip's parent, and assign the converted position to `tongueTip`
tongueTip.simdPosition = tongueTip.parent!.presentation.simdConvertPosition(currentTonguePosition, from: nil)

setupConstraints(): constraints and gimbal lock

Each eye gets an SCNLookAtConstraint with isGimbalLockEnabled set to true, to prevent the Euler angles from causing gimbal lock.

An SCNTransformConstraint is also added to limit the X-axis Euler angle to -20°~+20°, the left eye's Y-axis Euler angle to 5°~150°, and the right eye's to -150°~-5°.

// Set eye movement constraints
let leftEyeLookAtConstraint = SCNLookAtConstraint(target: focusOfLeftEye)
leftEyeLookAtConstraint.isGimbalLockEnabled = true

let rightEyeLookAtConstraint = SCNLookAtConstraint(target: focusOfRightEye)
rightEyeLookAtConstraint.isGimbalLockEnabled = true

let eyeRotationConstraint = SCNTransformConstraint(inWorldSpace: false) { (node, transform) -> SCNMatrix4 in
    var eulerX = node.presentation.eulerAngles.x
    var eulerY = node.presentation.eulerAngles.y
    if eulerX < self.rad(-20) { eulerX = self.rad(-20) }
    if eulerX > self.rad(20) { eulerX = self.rad(20) }
    if node.name == "Eye_R" {
        if eulerY < self.rad(-150) { eulerY = self.rad(-150) }
        if eulerY > self.rad(-5) { eulerY = self.rad(-5) }
    } else {
        if eulerY > self.rad(150) { eulerY = self.rad(150) }
        if eulerY < self.rad(5) { eulerY = self.rad(5) }
    }
    let tempNode = SCNNode()
    tempNode.transform = node.presentation.transform
    tempNode.eulerAngles = SCNVector3(eulerX, eulerY, 0)
    return tempNode.transform
}

leftEye?.constraints = [leftEyeLookAtConstraint, eyeRotationConstraint]
rightEye?.constraints = [rightEyeLookAtConstraint, eyeRotationConstraint]

setupShader(): using shaders

The shader source is read in as a String and then loaded via shaderModifiers, a dictionary whose key specifies the entry point. SCNShaderModifierEntryPoint has four types: geometry, surface, lightingModel, and fragment. Here the surface entry point is used.

How do we pass parameters to a shader? Directly via KVC. Simple and crude, but it works well…

skin.shaderModifiers = [SCNShaderModifierEntryPoint.surface: shader]

skin.setValue(Double(0), forKey: "blendFactor")
skin.setValue(NSValue(scnVector3: SCNVector3Zero), forKey: "skinColorFromEnvironment")
		
let sparseTexture = SCNMaterialProperty(contents: UIImage(named: "art.scnassets/textures/chameleon_DIFFUSE_BASE.png")!)
skin.setValue(sparseTexture, forKey: "sparseTexture")

Later, updateCamouflage(sceneView: ARSCNView) and activateCamouflage(_ activate: Bool) activate and update the shader via KVC.
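A hedged sketch of what that activation could look like (the sample's actual methods differ in detail; `skin` and the key name follow the earlier snippet): because blendFactor was exposed as a shader argument, KVC inside an SCNTransaction is enough to fade the camouflage in or out.

```swift
import SceneKit

// Sketch: fade the camouflage by animating the KVC-exposed "blendFactor"
// uniform on the material (key name from the snippet above).
func setCamouflage(_ active: Bool, on skin: SCNMaterial) {
    SCNTransaction.begin()
    SCNTransaction.animationDuration = 1.5
    // 0 = original diffuse texture, 1 = the sparseTexture passed in via KVC
    skin.setValue(active ? 1.0 : 0.0, forKey: "blendFactor")
    SCNTransaction.commit()
}
```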

The Metal shader

Finally, let’s take a quick look at the Metal shader itself.

#pragma arguments
// External arguments:
float blendFactor;
texture2d sparseTexture;
float3 skinColorFromEnvironment;

#pragma body
// Sampler declaration
// Coordinates: normalized; addressing mode: clamp_to_zero; filtering mode: linear
constexpr sampler sparseSampler(coord::normalized, address::clamp_to_zero, filter::linear);
// The result of sampling is to get the color sampled in the external texture sparseTexture
float4 texelToMerge = sparseTexture.sample(sparseSampler, _surface.diffuseTexcoord);
// Before camouflage is enabled blendFactor = 0, i.e. the original diffuse is kept; after camouflage is enabled blendFactor = 1, i.e. the incoming sparseTexture is used
_surface.diffuse = mix(_surface.diffuse, texelToMerge, blendFactor);

float alpha = _surface.diffuse.a;
// Change the diffuse layer RGB value of _suface based on the environment color passed in from the outside.
_surface.diffuse.rgb += skinColorFromEnvironment * (1.0 - alpha);
_surface.diffuse.a = 1.0;

The addressing mode clamp_to_zero is similar to clamp-to-border in OpenGL: when sampling outside the boundary, the color is (0.0, 0.0, 0.0, 1.0) if the texture has no alpha component, and (0.0, 0.0, 0.0, 0.0) otherwise. Metal's shading language is based on C++11 (Metal 2 is already C++14), with its own syntax additions and restrictions. For more detailed syntax, see the Metal Shading Language Guide.

Shadow tips

The chameleon demo uses an ambient light map. There is no real light source, so there are no real shadows. Instead, a little trick creates "fake shadows": a plane with a light-grey texture is placed under each of the four feet, so it looks as if the feet cast shadows. This is what is known as "baking" light and shadow into the texture.
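A minimal sketch of such a baked shadow (the texture name here is hypothetical): a plane carrying a pre-rendered grey blob, laid flat just above the ground under a foot.

```swift
import SceneKit
import UIKit

// Sketch: a "fake shadow" plane under one foot (asset name is hypothetical)
let shadowPlane = SCNPlane(width: 0.1, height: 0.1)
let material = SCNMaterial()
material.diffuse.contents = UIImage(named: "shadow_blob.png") // light-grey blob, transparent edges
material.lightingModel = .constant   // ignore scene lighting; the shadow is baked into the texture
shadowPlane.materials = [material]

let shadowNode = SCNNode(geometry: shadowPlane)
shadowNode.eulerAngles.x = -.pi / 2              // lay the plane flat on the ground
shadowNode.position = SCNVector3(0, 0.001, 0)    // slightly above the floor to avoid z-fighting
```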

// The chameleon uses an environment map, so disable the built-in lighting
sceneView.automaticallyUpdatesLighting = false

// Load the lighting environment map
self.lightingEnvironment.contents = UIImage(named: "art.scnassets/environment_blur.exr")!

As Apple explained at WWDC17, there is another way to create real-time, genuine shadows:

  1. Place a plane under the object to receive the shadow.
  2. Select the plane and, in the Material inspector, uncheck "write to color" so the plane is not written to the color buffer; however, the shadow disappears along with it.
  3. To bring the shadow back, the light configuration must change: select the light node and open the Light inspector.
  4. Change the shadow mode to "Deferred", and the shadow reappears.
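The four steps above can also be sketched programmatically (node setup is illustrative; the properties are standard SceneKit API):

```swift
import SceneKit

// 1. A plane under the object to receive the shadow
let floor = SCNNode(geometry: SCNPlane(width: 1, height: 1))
floor.eulerAngles.x = -.pi / 2

// 2. Don't write the plane's color to the color buffer; only its received shadow shows
floor.geometry?.firstMaterial?.colorBufferWriteMask = []

// 3 & 4. A shadow-casting light with deferred shadow rendering,
// which lets the shadow appear on the color-masked plane
let light = SCNLight()
light.type = .directional
light.castsShadow = true
light.shadowMode = .deferred
let lightNode = SCNNode()
lightNode.light = light
```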