TL;DR

  1. Introducing Apple’s photo-based 3D object modeling solution for developers: the Photogrammetry (Object Capture) capability in RealityKit;
  2. HelloPhotogrammetry, a command-line tool for macOS 12.0, and CaptureSample, a mobile capture app;
  3. A test of the resulting 3D models.

The WWDC21 augmented reality series of sessions begins with 3D object modeling:

Looking back at recent developer conferences, Apple’s AR landscape has grown into an ecosystem of ARKit, RealityKit, and USDZ. ARKit handles understanding of the real-world scene and has been upgraded to ARKit 5; RealityKit handles rendering of virtual content and has been upgraded to RealityKit 2; and USDZ, a Pixar-backed 3D asset format, anchors the content ecosystem in Apple’s AR strategy.

Apple already offers its own 3D content editing tool, Reality Composer, as well as Reality Converter, which converts other 3D formats to USDZ (see the earlier article “RealityKit developing augmented Reality applications”). However, for developers without 3D modeling and art skills, creating convincing 3D models may be the biggest obstacle to building AR applications. So, to open the WWDC21 AR series, Apple has brought out its own photogrammetry technology.

1. About Photogrammetry

Photogrammetry is a technique for generating a 3D model of a real object from photos taken from multiple angles. The industry already has mature solutions, such as the commercial software Capturing Reality and the open-source software Meshroom:

RealityKit 2 adds the Object Capture module (currently in beta), which exposes the PhotogrammetrySession interface on which developers can build their own modeling tools:

import RealityKit

// Folder of source photos and the destination URL for the generated model
let inputFolderUrl = URL(fileURLWithPath: "/tmp/MyInputImages/")
let url = URL(fileURLWithPath: "MyObject.usdz")

// Request a USDZ model file at the highest detail level
let request = PhotogrammetrySession.Request.modelFile(url: url,
                                                      detail: .full)

// The initializer throws, so bind with try? inside the guard
guard let session = try? PhotogrammetrySession(input: inputFolderUrl) else {
    return
}
// Start reconstruction for the request
try? session.process(requests: [request])
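
Once process(requests:) is called, the session delivers progress and results asynchronously through its outputs sequence. Below is a minimal sketch of consuming that stream, assuming the session and request created above (the output cases come from the beta PhotogrammetrySession.Output API; error handling is reduced to prints):

Task {
    do {
        for try await output in session.outputs {
            switch output {
            case .requestProgress(_, let fractionComplete):
                print("Progress: \(fractionComplete)")
            case .requestComplete(_, let result):
                // For a .modelFile request, the result carries the output URL
                if case .modelFile(let url) = result {
                    print("Model written to \(url)")
                }
            case .requestError(_, let error):
                print("Request failed: \(error)")
            case .processingComplete:
                print("All requests finished")
            default:
                break
            }
        }
    } catch {
        print("Session error: \(error)")
    }
}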

Apple also ships an official command-line tool, HelloPhotogrammetry, built on this interface:

2. Testing the modeling results

To test photo-based modeling, you need to photograph the target object from all around, using a phone, a camera, or even a drone. The official advice is as follows:

  1. Keep the object in sharp focus
  2. Cover as many angles as possible
  3. Flip the object over to capture its other sides
  4. Get close to the object
  5. Adjacent shots should overlap by about 70%
  6. Capture 20 to 200 images

To assist with capturing on an iPhone, Apple provides a dedicated sampling app, CaptureSample:

When the phone takes a photo, the app saves the RGB image, tries to record the phone’s current orientation (the gravity vector), and, where the device supports it, records the photo’s depth data. After a capture session finishes, all files from that session can be found in the Files app on the phone:
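
The gravity vector and depth data come from standard system frameworks. Here is a hedged sketch of how a capture app like this might enable them, assuming an already configured AVCaptureSession with a depth-capable camera; the property names are from AVFoundation and Core Motion, but the wiring around them is illustrative:

import AVFoundation
import CoreMotion

let photoOutput = AVCapturePhotoOutput()
// Once the output has been added to a running AVCaptureSession,
// depth delivery can be enabled where the hardware supports it
photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported

let settings = AVCapturePhotoSettings()
settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
// Embed the depth map in the photo file so it travels with the RGB image
settings.embedsDepthDataInPhoto = settings.isDepthDataDeliveryEnabled

// The gravity vector comes from Core Motion's device-motion updates
let motionManager = CMMotionManager()
motionManager.startDeviceMotionUpdates()
if let gravity = motionManager.deviceMotion?.gravity {
    print("Gravity: (\(gravity.x), \(gravity.y), \(gravity.z))")
}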

AirDrop the captured samples to a Mac, then run the command-line tool to build the model:

./HelloPhotogrammetry InputFolder output.usdz

A quick test of the modeling results (21 samples, no depth map):

If you want a more complete and detailed model, you may need to capture both RGB and depth information in a well-lit environment with a deep, uncluttered background. Alternatively, a DSLR, with its greater control over depth of field, might also produce a more detailed result. What I am really looking forward to, though, is modeling larger outdoor subjects (buildings, statues, etc.) using a drone’s orbit mode!
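
On the API side, PhotogrammetrySession also accepts a Configuration for tuning reconstruction quality. A brief sketch, reusing the inputFolderUrl from the earlier snippet (sampleOrdering and featureSensitivity are options in the beta API):

import RealityKit

var configuration = PhotogrammetrySession.Configuration()
// Hint that shots were taken in spatial order, e.g. walking around the object
configuration.sampleOrdering = .sequential
// Search harder for landmarks on low-texture or reflective objects
configuration.featureSensitivity = .high

let session = try PhotogrammetrySession(input: inputFolderUrl,
                                        configuration: configuration)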

3. Related sessions

3D object modeling is just the appetizer for WWDC21’s AR sessions; much more on 3D content generation is still to come:

  • AR Quick Look, meet Object Capture
  • Create 3D workflows with USD