A tech demo video by VFX artist Elisha Hung shows how data from the iPhone X’s TrueDepth camera system could be used to animate 3D characters and objects in a CGI movie.
Apple gives developers access to the same face mesh that Animoji uses to animate pigs, rabbits, and piles of poo. The data streaming from the ARKit face-tracking API can be converted into a format that traditional 3D editing software can read, as Hung shows in this demo:
Rather than using expensive motion capture equipment, Hung coded an ARKit app to record his live-updating face mesh as he made various expressions. He then used the depth map to animate a 2D texture of his face.
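The recording side of an app like Hung's can be sketched with ARKit's face-tracking API. The snippet below is a minimal illustration, not Hung's actual code: it captures the face mesh vertices each frame and can export a frame as a Wavefront OBJ file that traditional 3D packages import. Names other than ARKit's own API (`FaceMeshRecorder`, `recordedFrames`, `objString`) are hypothetical.

```swift
import ARKit

// Sketch of a face-mesh recorder, assuming a device with a TrueDepth camera.
final class FaceMeshRecorder: NSObject, ARSessionDelegate {
    let session = ARSession()
    // One array of vertex positions per captured frame.
    private(set) var recordedFrames: [[simd_float3]] = []

    func start() {
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // ARFaceGeometry exposes the vertices and triangle indices
            // behind Animoji; snapshot the vertices for later export.
            recordedFrames.append(faceAnchor.geometry.vertices)
        }
    }

    // Convert one captured frame into OBJ text that 3D editing
    // software can interpret. `indices` comes from
    // ARFaceGeometry.triangleIndices (the topology is constant).
    func objString(forFrame index: Int, indices: [Int16]) -> String {
        var obj = recordedFrames[index]
            .map { "v \($0.x) \($0.y) \($0.z)" }
            .joined(separator: "\n")
        for i in stride(from: 0, to: indices.count, by: 3) {
            // OBJ face indices are 1-based.
            obj += "\nf \(indices[i] + 1) \(indices[i + 1] + 1) \(indices[i + 2] + 1)"
        }
        return obj
    }
}
```

Writing one OBJ per frame gives a mesh sequence that desktop tools can replay; Hung additionally projected a 2D texture of his face onto the animated geometry.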
The end result is surprisingly realistic and mimics the sort of animated face reproduction seen in AAA game titles.
The fidelity of TrueDepth may not be high enough for Hollywood-budget movies, but it could open a new avenue for amateur and prosumer video makers and animators.
It also suggests that a third-party iPhone app could recreate the Animoji experience with human (celebrity?) faces rather than emoji creatures.