• Using iPhone X With Maya For Quick And Cheap Facial Capture
  • Originally written by IAN HAMILTON
  • The Nuggets Translation Project
  • Permanent link to this article: github.com/xitu/gold-m…
  • Translator: ALVINYEH
  • Proofreader: luochen1992, melon8

Could the iPhone X be a fast, cheap, and easy facial capture system? About a month ago, Corey Strasberg of Kite & Lightning received an iPhone X from Apple. Within a day, he was testing its TrueDepth camera and ARKit to see whether they could be used for the studio's game and film content.

Kite & Lightning was an early innovator on the Oculus VR development kits, using compelling character capture technology to build groundbreaking experiences like Senza Peso. Right now, the studio is building Bebylon Battle Royale, a game that revolves around "Beby" characters with huge attitudes. Strasberg wondered whether iPhone X face capture could give these characters bigger personalities faster and more cheaply, so he spent some weekend time finding out.

“I think one of the big conclusions I’ve come to at this point is that the data captured on the iPhone X is very subtle, stable and not overly smoothed,” Strasberg wrote in an email. “It’s actually able to pick up very subtle movements, even tiny twitches, and it’s clean enough (noise-free) to use straight off the phone, depending on your standards of course.”

He sees it as a relatively inexpensive way to capture facial performance, and because the system is mobile, it is easy to set up and deploy. Apple bought a company called Faceshift, which appears to power much of this feature. While Strasberg points out that Faceshift's full solution had some other cool features, he has been able to extract enough expressiveness from what Apple shipped on the iPhone X that it may still be useful for VR production.

  • YouTube video link: https://youtu.be/w047Dbo-fGQ

Capture process

Here’s Strasberg’s overview of how he captures the iPhone X’s face tracking data and uses it to drive an animated character’s expressions in Maya:

  • Using Apple ARKit and Unity, I imported a Bebylon character under development and hooked up its facial blend shapes to ARKit’s face capture output. This lets me animate the Beby’s face with my own expressions (a sketch of this hookup follows the list).
  • I need to capture this expression data so it can be imported into Maya, so I added a recording function that writes the facial expression data into a text file saved locally on the iPhone. Each captured take is stored in its own text file, from start to finish, and can be named or renamed in the capture app.
  • I copied text files from the iPhone X to my desktop via USB.
  • To import into Maya, the captured data needs to be reformatted, so I wrote a simple desktop application to do this. It takes the selected text files and converts them into Maya .anim files (a converter sketch also follows the list).
  • I imported the .anim files into Maya and, voilà, the character mimics whatever you did on the iPhone during capture.
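
The hookup described in the first two steps maps cleanly onto Apple's public API: ARKit publishes a dictionary of roughly fifty blend-shape coefficients per frame through `ARFaceAnchor.blendShapes`, and those weights can be copied onto a character's morph targets and logged to disk. Below is a minimal Swift sketch of that idea, assuming an ARSCNView-based app with face tracking running; the `FaceCaptureDelegate` class, the morph-target naming, and the one-line-per-frame text format are assumptions for illustration, not Kite & Lightning's actual code.

```swift
import ARKit
import SceneKit

// Hypothetical capture delegate: drives a character's morph targets from
// ARKit's blend shapes and records each frame's weights as a text line.
final class FaceCaptureDelegate: NSObject, ARSCNViewDelegate {
    /// Character whose SCNMorpher targets are named after ARKit's
    /// blend-shape locations ("jawOpen", "browInnerUp", ...). Assumed rig.
    let characterNode: SCNNode
    /// One comma-separated line of weights per captured frame.
    private var recordedFrames: [String] = []
    var isRecording = false

    init(characterNode: SCNNode) {
        self.characterNode = characterNode
        super.init()
    }

    func renderer(_ renderer: SCNSceneRenderer,
                  didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }

        // Drive the character's morph targets directly from ARKit's
        // per-frame blend-shape coefficients (each in 0.0 ... 1.0).
        for (location, weight) in faceAnchor.blendShapes {
            characterNode.morpher?.setWeight(CGFloat(weight.doubleValue),
                                             forTargetNamed: location.rawValue)
        }

        if isRecording {
            // Serialize the coefficients in a fixed key order so the
            // desktop converter can parse every line the same way.
            let keys = faceAnchor.blendShapes.keys.sorted { $0.rawValue < $1.rawValue }
            let values = keys.map {
                String(format: "%.5f", faceAnchor.blendShapes[$0]!.doubleValue)
            }
            recordedFrames.append(values.joined(separator: ","))
        }
    }

    /// Saves the current take as a named text file in Documents,
    /// then clears the buffer for the next take.
    func saveTake(named name: String) throws {
        let url = FileManager.default
            .urls(for: .documentDirectory, in: .userDomainMask)[0]
            .appendingPathComponent("\(name).txt")
        try recordedFrames.joined(separator: "\n")
            .write(to: url, atomically: true, encoding: .utf8)
        recordedFrames.removeAll()
    }
}
```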
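For the conversion step, the desktop tool essentially has to re-emit each per-frame weight list as keyframes in Maya's plain-text .anim format (the format read by Maya's animImportExport plug-in). The sketch below assumes the text layout written by the capture sketch above, a Maya blendShape node named `blendShape1` whose target aliases match ARKit's coefficient names, and 30 fps (`ntsc`) output; the exact .anim header fields may need adjusting for a given Maya version.

```swift
import Foundation

// Hypothetical desktop converter: one capture text file in, one Maya
// .anim file out. Shape order must match the capture app's key order.
let shapeNames = ["browInnerUp", "jawOpen", "mouthSmileLeft"] // trimmed for brevity

func convertTake(at inputURL: URL, to outputURL: URL) throws {
    let lines = try String(contentsOf: inputURL, encoding: .utf8)
        .split(separator: "\n")
    // frames[f][s] = weight of shape s at frame f
    let frames = lines.map { line in
        line.split(separator: ",").compactMap { Double(String($0)) }
    }

    // Header: "ntsc" sets a 30 fps timebase, matching the playback rate
    // Strasberg mentions.
    var anim = """
    animVersion 1.1;
    mayaVersion 2018;
    timeUnit ntsc;
    linearUnit cm;
    angularUnit deg;
    startTime 0;
    endTime \(max(frames.count - 1, 0));

    """

    // One anim/animData block per blend-shape channel, one key per frame.
    for (s, name) in shapeNames.enumerated() {
        anim += "anim blendShape1.\(name) \(name) blendShape1 0 0 \(s);\n"
        anim += "animData {\n"
        anim += "  input time;\n  output unitless;\n  weighted 0;\n"
        anim += "  preInfinity constant;\n  postInfinity constant;\n"
        anim += "  keys {\n"
        for (frame, weights) in frames.enumerated() where s < weights.count {
            anim += "    \(frame) \(String(format: "%.5f", weights[s])) linear linear 1 1 0;\n"
        }
        anim += "  }\n}\n"
    }
    try anim.write(to: outputURL, atomically: true, encoding: .utf8)
}
```

With the blendShape node selected in Maya, a file like this can then be brought in through the standard animImport path, which is presumably the final step Strasberg describes.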

According to Strasberg, he saw several small glitches in the data that he suspects were caused by his own code. Also, although capture runs at 60 frames per second, playback currently happens at 30 frames per second, so some quality is lost; Strasberg says this is most noticeable during fast “horse lips” motion.

“The real beauty of this system is how fast and easy it is to capture (on your phone) and then get into Maya or a game engine,” Strasberg wrote. “Without any real processing involved, the data looks very clean and can be used straight off the phone without modification.”

The next step

Strasberg wants to attach the iPhone X to a helmet and pair it with an Xsens suit so he can capture full-body movement and the face at the same time.

“I am very confident that by tweaking the blend shapes’ parametric sculpting and adding the appropriate wrinkle maps, I can get the skin to deform while the face animates, which will significantly improve the Beby characters,” Strasberg wrote. “Similarly, using the captured data to drive secondary blend shapes will make the expressions feel more alive and vivid.”
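
As a rough illustration of what “secondary” shapes driven by captured data could look like, the sketch below derives corrective-shape weights as simple functions of the primary ARKit coefficients. The shape names and ramp constants here are invented for the example, not taken from Strasberg's rig.

```swift
// Hypothetical mapping from captured primary coefficients to secondary
// (corrective/wrinkle) blend-shape weights. Names and constants are
// illustrative only.
func secondaryWeights(from primary: [String: Double]) -> [String: Double] {
    let browUp = primary["browInnerUp"] ?? 0
    let jawOpen = primary["jawOpen"] ?? 0
    return [
        // Forehead wrinkle map ramps in with the brow raise.
        "foreheadWrinkle": min(1.0, browUp * 1.4),
        // Corrective kicks in only on large jaw opens to fix skin stretching.
        "jawOpenCorrective": max(0.0, jawOpen - 0.5) * 2.0,
    ]
}
```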


The Nuggets Translation Project is a community that translates high-quality internet technical articles from English and shares them on Nuggets (Juejin). The content covers Android, iOS, front end, back end, blockchain, product, design, artificial intelligence, and other fields. For more high-quality translations, please follow the Nuggets Translation Project and its official Weibo and Zhihu column.