Blender → *.gltf → Reality Converter → *.usdz → Reality Composer → *.rcproject → RealityKit/ARKit


About RealityKit

RealityKit is an augmented reality application development framework introduced by Apple at WWDC 2019. It is a 3D engine providing virtual object rendering, animation, physics, audio, and other features. Combined with the scene-understanding capabilities of the LiDAR scanner and ARKit, developers can more easily create high-quality augmented reality experiences. Beyond RealityKit itself for rendering 3D content, Apple also provides supporting 3D content production tools: Reality Converter and Reality Composer, with the USDZ 3D file format serving as the intermediary that connects the content production workflow for augmented reality applications.

Inheritance relationship of an Entity in RealityKit

All virtual objects in RealityKit inherit from Entity and contain Transform and Synchronization information by default:

  • Transform component: position, rotation, and scale (coordinate information)
  • Synchronization component: state synchronization for multiplayer AR experiences

Whether 3D files are imported into a project as USDZ files or through Reality Composer, they are stored in RealityKit as a hierarchy of Entity tree nodes.
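As a minimal sketch of this tree structure (the entities here are created ad hoc for illustration), the default components and parent–child relationships look like this:

```swift
import RealityKit

// Every Entity carries Transform and Synchronization components by default.
let parent = Entity()
let child = Entity()
child.transform.translation = [0, 0.1, 0]  // place the child 10 cm above its parent
parent.addChild(child)

// Imported USDZ / Reality Composer content forms the same kind of tree,
// traversable via `children`, `parent`, and `findEntity(named:)`.
```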

AnchorEntity

An AnchorEntity represents coordinate information in an AR scene. It can be an empty coordinate used to anchor other objects, or a ModelEntity that conforms to the HasAnchoring protocol. The two most common uses, however, are as the carrier of a raycast or plane-detection result:

  1. Generating an anchor from the spatial coordinates returned by a raycast, for example converting a screen tap position into AR world coordinates:
let touchLocation = sender.location(in: arView)

guard let raycastResult = arView.raycast(from: touchLocation, allowing: .estimatedPlane, alignment: .any).first else {
    messageLabel.displayMessage("No surface detected, try getting closer.", duration: 2.0)
    return
}

let anchor = AnchorEntity(raycastResult: raycastResult)
anchor.addChild(entity)
arView.scene.addAnchor(anchor)

2. Detecting planes and filtering for plane coordinates that meet an area requirement, for example choosing the plane on which a game scene is placed:

// Anchor to a detected horizontal plane whose area is at least 20 x 20 cm
let anchor = AnchorEntity(plane: .horizontal, minimumBounds: [0.2, 0.2])
anchor.addChild(model)
arView.scene.addAnchor(anchor)


ModelEntity

The virtual objects actually rendered by RealityKit are stored in a ModelEntity's ModelComponent; a ModelEntity can also define an object's collision shapes, physical properties, and physics-driven motion.

The visual part of a virtual object consists of a Mesh, which defines the object's geometry, and a Material, which defines its surface appearance. RealityKit supports four basic geometric shapes that can be generated at runtime (which also means any more complex shape must come from an external modeling tool):

  • .generateBox

  • .generatePlane

  • .generateSphere

  • .generateText
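These generators live on MeshResource. A minimal sketch of building a 10 cm cube at runtime (the size and color here are arbitrary choices):

```swift
import RealityKit
import UIKit

// Generate a 10 cm cube mesh and pair it with a simple non-metallic material.
let mesh = MeshResource.generateBox(size: 0.1)
let material = SimpleMaterial(color: .systemBlue, isMetallic: false)
let box = ModelEntity(mesh: mesh, materials: [material])
```

The other generators follow the same pattern, e.g. `MeshResource.generateSphere(radius: 0.05)` or `MeshResource.generatePlane(width: 0.2, depth: 0.2)`.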

RealityKit supports four basic materials:

  • SimpleMaterial: a basic material that reflects the environment and virtual lighting according to its properties (metallic, roughness, etc.)

  • UnlitMaterial: a basic material that does not respond to lighting

  • VideoMaterial: uses video playback as the material, e.g. for a virtual video player or small TV

  • OcclusionMaterial: an occlusion material that hides any virtual content behind it
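As a hedged sketch of the video case (the asset name "video.mp4" is a placeholder for a file you bundle yourself):

```swift
import RealityKit
import AVFoundation

// Build a "small TV": a vertical plane whose material is a playing video.
guard let url = Bundle.main.url(forResource: "video", withExtension: "mp4") else {
    fatalError("bundle the video asset first")
}
let player = AVPlayer(url: url)
let videoMaterial = VideoMaterial(avPlayer: player)
let screen = ModelEntity(mesh: .generatePlane(width: 0.4, height: 0.225),
                         materials: [videoMaterial])
player.play()
```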

Import from other 3D modeling tools

Since RealityKit currently supports only a small number of basic geometric shapes, more complex virtual scenes still need to be generated with other professional 3D modeling tools. The next step is to use Blender as an example to illustrate the development process from modeling to achieving AR effects.

Once you’ve finished modeling, texturing, animating, etc. in Blender, you can export your 3D file (glTF 2.0 is the most compatible format) and then convert it to a USDZ file with Reality Converter (you can also use a Blender plugin to export a .usd file directly; the principle is the same). In Reality Converter you can make simple adjustments to the 3D object's properties, such as its materials:

The USDZ file exported from Reality Converter can be imported directly into Reality Composer:

In Reality Composer, the physical properties, spatial coordinates, and animation effects of objects can be edited again. More importantly, this is where the grouping and hierarchy of the different objects should be clarified, because in RealityKit these resources may need to be accessed independently or as groups, and the coordinate relationships in the actual AR scene are redefined on the basis of the resources' current relative positions.

The .rcproject file exported by Reality Composer can be imported directly into Xcode and accessed in Swift through the file name:

let scene = try! RocketLaunch.loadTower()

Mixed development

Sometimes you need to readjust imported resources in code, such as replacing the material of an imported geometry with an occlusion material:

let coverGround = rocketScene!.findEntity(named: "LaunchRocket_Ground")?.findEntity(named: "Ground")
var component = coverGround?.components[ModelComponent.self] as! ModelComponent
component.materials = [OcclusionMaterial()]
coverGround?.components.set(component)

RealityKit’s current shortcomings

RealityKit is only two years old and still has quite a few shortcomings; hopefully they will be addressed in later versions:

  1. Too few basic shapes
  2. Lack of particle system
  3. Insufficient animation support
  4. Immature USDZ ecosystem

The first is the limited support for basic geometry: of the shapes shown in RealityKit's own logo, only the sphere can be generated natively.

The second is the lack of a particle system, which means no support for flashy effects of the kind SceneKit's SCNParticleSystem provides.

Although Reality Composer supports some slideshow-level behavior and animation effects, animation compatibility remains limited in the workflow of exporting from other tools through Reality Converter and into Reality Composer.
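For animations that do survive the export pipeline, they appear on the loaded entity's `availableAnimations`. A minimal sketch, assuming `model` is an entity loaded from a USDZ or .rcproject file:

```swift
// Play the first baked-in animation on loop, if the import preserved any.
if let animation = model.availableAnimations.first {
    model.playAnimation(animation.repeat(), transitionDuration: 0.3)
}
```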

Given Apple's considerable influence, its choice of its own USDZ format is unsurprising, and ecosystem-wide support for USDZ is not a concern. Its inherent limitations, however, are worth understanding in advance so you can decide whether it meets your project's needs; for details, see glTF and USDZ.

Conclusion

Apple’s layout of its AR hardware and software ecosystem has gradually become clear. RealityKit, as the virtual-content rendering engine, coordinates ever more closely with ARKit: the former is responsible for rendering augmented content, while the latter is responsible for understanding the real scene.

Judging from RealityKit's current design, however, its goal is not to cover every AR application scenario (and perhaps it lacks that capability), but rather to cooperate closely with the hardware's operating range (for example, LiDAR scanning works within about 5 m) and provide a robust, rapid production solution for small-scale, living-room AR applications.



WWDC related videos

  1. WWDC19 Session 603 – Introducing RealityKit and Reality Composer

  2. WWDC19 Session 605 – Building Apps with RealityKit

  3. WWDC20 Session 10612 – What’s new in RealityKit