Introduction

ARKit article series directory

This is a summary of my reading notes on Ray Wenderlich’s ARKit by Tutorials.

Main features of ARKit:

  • Tracking
  • Scene understanding
  • Light estimation
  • Scene interaction
  • Metric measurement (real-world units in meters)
  • Rendering integration

Major limitations of ARKit:

  • Plane detection takes some time
  • Motion processing lag: don’t move the device too fast
  • Low-light conditions
  • Smooth, untextured surfaces
  • Ghosting: occlusion effects cannot be handled properly, so virtual objects can appear in front of real objects that should block them

Xcode includes an AR project template; you can directly create an AR project that displays a small airplane model.

Project organization and management

Organization of code in a file

  • Properties: the class’s properties.
  • Outlets: elements connected from the XIB/storyboard.
  • Actions: IBAction methods.
  • View Management: view lifecycle and related methods.
  • Initialization: initialization methods.
  • ARSCNViewDelegate Protocol Extension: AR delegate methods.

The session control

  • Pausing: ARSession.pause() pauses the session.
  • Resuming: ARSession.run() resumes a paused session.
  • Updating: ARSession.run(_:) with a new configuration updates the session’s configuration.
  • Resetting: ARSession.run(_:options:) resets the session. Changes in tracking state can be handled in the delegate method:
func session(_ session: ARSession,
  cameraDidChangeTrackingState camera: ARCamera) {
  switch camera.trackingState {
  // 1
  case .notAvailable:
    trackingStatus = "Tracking: Not available!"
  // 2
  case .normal:
    trackingStatus = "Tracking: All good!"
  // 3
  case .limited(let reason):
    switch reason {
    case .excessiveMotion:
      trackingStatus = "Tracking: Limited due to excessive motion!"
    // 3.1
    case .insufficientFeatures:
      trackingStatus = "Tracking: Limited due to insufficient features!"
    // 3.2
    case .initializing:
      trackingStatus = "Tracking: Initializing..."
    // 3.3
    case .relocalizing:
      trackingStatus = "Tracking: Relocalizing..."
    }
  }
}
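
For the Resetting case, here is a minimal sketch of restarting the session with reset options (assuming an ARSCNView outlet named sceneView):

func resetARSession() {
  // Rebuild the configuration and restart, discarding existing tracking data and anchors.
  let config = ARWorldTrackingConfiguration()
  sceneView.session.run(config,
    options: [.resetTracking, .removeExistingAnchors])
}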

Debug options

To help with debugging, enable the AR view’s debug options:

sceneView.debugOptions = [ARSCNDebugOptions.showFeaturePoints,
  ARSCNDebugOptions.showWorldOrigin, .showBoundingBoxes, .showWireframe]

The configuration items are as follows:

  • Feature points: displays the detected feature points.
  • World origin: displays the origin of world coordinates as intersecting red, green, and blue axes.
  • Bounding boxes: displays the bounding boxes of 3D objects.
  • Wireframe: displays the wireframes of 3D objects.

Shaders, materials and textures

The lighting models (shaders) available in SceneKit are:

  • Constant
  • Lambert
  • Blinn
  • Phong
  • Physically based (PBR)

Physically Based Rendering (PBR)

The PBR lighting model is a newly introduced feature that makes your 3D objects look more realistic.

Let’s focus on the features below:

Environment map

An environment map is a cube map, like a skybox.

Environment maps serve two purposes: on the one hand, like reflection maps, they let you see the environment image reflected on highly reflective surfaces; on the other hand, they provide a realistic lighting environment for PBR-enabled 3D objects.
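
As a rough sketch (the image path and values are assumptions), an environment map is assigned to the scene’s lighting environment, where it both lights and reflects off PBR materials:

let scene = SCNScene()
// The same cube (or spherical) image drives reflections and PBR lighting.
scene.lightingEnvironment.contents = "art.scnassets/environment.jpg"
scene.lightingEnvironment.intensity = 1.0
scene.background.contents = "art.scnassets/environment.jpg" // optionally also use it as the skybox

let metal = SCNMaterial()
metal.lightingModel = .physicallyBased
metal.metalness.contents = 1.0
metal.roughness.contents = 0.1 // a smooth metal surface shows the environment clearly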

Diffuse Map

The diffuse map provides the base color, regardless of lighting or other effects.

Normal map

A normal is a vector perpendicular to the surface of the geometry; it is used to calculate effects such as the reflection of light.

A normal map defines surface normals at the pixel level through the image’s RGB channels. These normals are combined with the lighting calculation to simulate bumps and dents, giving the appearance of a detailed surface without adding polygons or vertex data.

Height map

Height maps are not part of the PBR lighting model, but they are worth learning. The height map is a black and white image, with white representing the highest point of the object and black representing the lowest point.

Normal Map Online — available at bit.ly/1ELCePX

Occlusion map

An ambient occlusion (AO) map prevents ambient light from reaching occluded areas, such as cracks in a wall. It is a black-and-white texture: black areas receive no ambient light, white areas are fully lit.

Emission map

Defines areas that emit light to create a glow effect, for example the Earth’s city lights at night (to see the effect, turn off the lights and disable the environment map in PBR).

Self-illumination map

Self-illumination maps are applied after all other effects; they can be used to tint, brighten, or darken the final result.

Displacement map

With a normal map we can create the appearance of varying height at the pixel level on a smooth surface, but it is only an illusion produced by changing how light is reflected.

With a displacement map, the surface geometry actually changes: values from gray toward white raise the surface, and values from gray toward black lower it:

Roughness and Metalness maps

The main strength of PBR is its ability to show fine surface detail, which is achieved with metalness and roughness maps:

  • Metalness: increases gradually from back to front.
  • Roughness: increases gradually from left to right.

Metalness map

Metalness simulates surface properties such as reflection, refraction, and the Fresnel effect. In this grayscale texture, black represents a non-metallic surface and white a metallic surface:

Roughness map

Roughness maps simulate the microscopic detail of real-world surfaces, producing a shiny or matte appearance. In a roughness texture, darker values give a smoother, shinier surface and lighter values a rougher, more matte surface:
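
Each of the maps above corresponds to an SCNMaterial property. A minimal sketch of wiring them up (the texture file names are assumptions):

let material = SCNMaterial()
material.lightingModel = .physicallyBased
material.diffuse.contents = "art.scnassets/earth_diffuse.jpg"      // base color
material.normal.contents = "art.scnassets/earth_normal.jpg"        // per-pixel surface normals
material.metalness.contents = "art.scnassets/earth_metalness.jpg"  // black = non-metal, white = metal
material.roughness.contents = "art.scnassets/earth_roughness.jpg"  // darker = smoother, lighter = rougher
material.ambientOcclusion.contents = "art.scnassets/earth_ao.jpg"  // blocks ambient light in crevices
material.emission.contents = "art.scnassets/earth_night.jpg"       // glow, e.g. city lights at night
material.displacement.contents = "art.scnassets/earth_height.jpg"  // actually deforms the surface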

Plane detection

Anchors

An anchor is a reference point for a 3D object, similar to the anchor point of a UIView; the transform applied to the 3D object is relative to the anchor.
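
As a rough sketch of the idea (assuming an ARSCNView outlet named sceneView), you can add your own anchor, for example one meter in front of the current camera, and later attach content relative to it:

if let frame = sceneView.session.currentFrame {
  // Translate one meter along the camera's forward (negative z) axis.
  var translation = matrix_identity_float4x4
  translation.columns.3.z = -1.0
  let transform = simd_mul(frame.camera.transform, translation)
  sceneView.session.add(anchor: ARAnchor(transform: transform))
}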

Adding new anchors

When plane detection finds a plane, an anchor is added, an SCNNode is created for it, and this delegate method is called:

// 1
func renderer(_ renderer: SCNSceneRenderer,
  didAdd node: SCNNode, for anchor: ARAnchor) {
  // 2
  guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
  // 3
  DispatchQueue.main.async {
    // 4
    let planeNode = self.createARPlaneNode(
      planeAnchor: planeAnchor,
      color: UIColor.yellow.withAlphaComponent(0.5))
    // 5
    node.addChildNode(planeNode)
  }
}
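Note that plane anchors are only delivered if plane detection is enabled on the session configuration. A minimal sketch (assuming an ARSCNView outlet named sceneView):

let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal // vertical plane detection was added later, in ARKit 1.5
sceneView.session.run(configuration)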

Anchor updates

A delegate method is also called when an anchor is updated:

// 1
func renderer(_ renderer: SCNSceneRenderer,
  didUpdate node: SCNNode, for anchor: ARAnchor) {
  // 2
  guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
  // 3
  DispatchQueue.main.async {
    // 4
    self.updateARPlaneNode(planeNode: node.childNodes[0],
      planeAchor: planeAnchor)
  }
}


Physics

Physics body

The first thing to understand is the concept of a physics body. There are three types:

  • Static body: participates in the physics simulation and can interact with other bodies, but is never moved by the simulation itself. For example, a wall.
  • Dynamic body: completely controlled by the physics engine and interacts with other bodies in the simulation. For example, a ball.
  • Kinematic body: not moved by the physics engine, but can be moved by code and still affects other bodies in the simulation. For example, an elevator.

Physics body type

SceneKit also provides built-in physics shapes for its standard geometry types.
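
A minimal sketch (the geometry and values are assumptions) of attaching a dynamic body with an explicit box-shaped physics shape to a node:

let boxGeometry = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0.0)
let boxNode = SCNNode(geometry: boxGeometry)
boxNode.physicsBody = SCNPhysicsBody(
  type: .dynamic, // .static and .kinematic are the other two body types
  shape: SCNPhysicsShape(geometry: boxGeometry, options: nil))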

You can also adjust the physics speed of the entire scene and the physics simulation time step:

scene.physicsWorld.speed = 0.05 // The effect is like slow motion

scene.physicsWorld.timeStep = 1.0 / 60.0 // 60 simulation steps per second; if objects move too fast, increase the rate to improve accuracy at the cost of higher CPU load

Force

A force is represented by the three-dimensional vector type SCNVector3. The applyForce(_:at:asImpulse:) method applies a force at a specified position; a force can affect both linear and angular velocity. An impulse is applied only once, like kicking a ball, while a non-impulse force is applied continuously. The position influences how the force acts: an off-center force also produces rotation.
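
A minimal sketch (the node name and values are assumptions) of applying an off-center impulse so a body both moves and spins:

let force = SCNVector3(0.0, 2.0, 0.0)      // push upward
let position = SCNVector3(0.05, 0.0, 0.05) // offset from the center also adds torque
diceNode.physicsBody?.applyForce(force, at: position, asImpulse: true)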

For more information about physics, see my earlier Physics post.

Light and Shadow

There are two ways to add shadows to objects in AR:

  • Place a plane with a light-grey shadow texture under the object so that it appears to cast a shadow. This is known as “baking” the lighting into the texture.
  • Place a plane under the object and set the plane’s reflectivity to zero, then add another light source and set its shadow mode to deferred to create shadows in real time (see the sketch below).
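
A minimal sketch of the second approach (the values are assumptions): a directional light whose shadow mode is set to deferred, so shadows are rendered in real time:

let shadowLight = SCNLight()
shadowLight.type = .directional
shadowLight.castsShadow = true
shadowLight.shadowMode = .deferred
shadowLight.shadowColor = UIColor(white: 0.0, alpha: 0.5) // semi-transparent shadow

let shadowLightNode = SCNNode()
shadowLightNode.light = shadowLight
shadowLightNode.eulerAngles = SCNVector3(-Float.pi / 2, 0, 0) // shine straight down
// scene.rootNode.addChildNode(shadowLightNode)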

The book also shows how to prevent an object from writing to the color buffer in code:

func hideARPlaneNodes() {
  // 1
  for anchor in (self.sceneView.session.currentFrame?.anchors)! {
    // 2
    if let node = self.sceneView.node(for: anchor) {
      // 3
      for child in node.childNodes {
        // 4
        let material = child.geometry?.materials.first!
        material?.colorBufferWriteMask = []
      }
    }
  }
}

For more on this, see my previous SceneKit posts: Lights, Shadows and the official demo walkthrough, and the walkthrough of Apple’s official AR chameleon demo.

Hit testing

Hit testing can be used to provide interaction with 3D objects:

override func touchesBegan(_ touches: Set<UITouch>,
  with event: UIEvent?) {
  DispatchQueue.main.async {
    // 1
    if let touchLocation = touches.first?.location(in: self.sceneView) {
      // 2
      if let hit = self.sceneView.hitTest(touchLocation,
        options: nil).first {
        // 3
        if hit.node.name == "dice" {
          // 4
          hit.node.removeFromParentNode()
          self.diceCount += 1
        }
      }
    }
  }
}

That wraps up the first part of my reading notes!