MetaHuman Animator Released by Epic Games

First announced at GDC 2023, Epic Games has today released MetaHuman Animator for its lifelike human creation tool MetaHuman Creator. MetaHuman Animator enables developers to easily create facial animations for MetaHumans, and it includes a new suite of facial animation capture tools that can be used with a single iPhone.

Details from the Epic Games press release:

What if you could capture an actor’s performance and turn it into high-fidelity facial animation for your MetaHuman in minutes, using just an iPhone and a PC? With MetaHuman Animator, you can! 

First shown at GDC 2023, the latest version of the fast and easy digital human pipeline brings high-fidelity performance capture to MetaHumans. Jump straight in and explore—or learn the ropes, talk to other MetaHuman users, and read the documentation on our new MetaHuman hub on the Epic Developer Community. 

MetaHuman Animator is a new feature set that enables you to capture an actor’s performance using an iPhone or stereo head-mounted camera system (HMC) and apply it as high-fidelity facial animation on any MetaHuman character, without the need for manual intervention. 

Every subtle expression, look, and emotion is accurately captured and faithfully replicated on your digital human. Even better, it’s simple and straightforward to achieve incredible results—anyone can do it. 

If you’re new to performance capture, MetaHuman Animator is a convenient way to bring facial animation to your MetaHumans based on real-world performances. And if you already do performance capture, this new feature set will significantly improve your existing capture workflow, reduce time and effort, and give you more creative control. Just pair MetaHuman Animator with your existing vertical stereo head-mounted camera to achieve even greater visual fidelity.  

High-fidelity facial animation—the easy way

Previously, it would have taken a team of experts months to faithfully recreate every nuance of the actor’s performance on a digital character. Now, MetaHuman Animator does the hard work for you in a fraction of the time—and with far less effort. 

The new feature set uses a 4D solver to combine video and depth data with a MetaHuman representation of the performer. Processing runs locally on GPU hardware, and the final animation is available in minutes.

That all happens under the hood, though—for you, it’s pretty much a case of pointing the camera at the actor and pressing record. Once captured, MetaHuman Animator accurately reproduces the individuality and nuance of the actor’s performance onto any MetaHuman character. 

What’s more, the animation data is semantically correct, using the appropriate rig controls, and temporally consistent, with smooth control transitions, so it’s easy to make artistic adjustments to the animation.
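To make that concrete, here is a purely illustrative Python sketch of what "semantically correct, temporally consistent" animation data means in practice: named rig-control curves you can edit directly. The control names and the dict-of-lists layout are assumptions for illustration only, not MetaHuman Animator's actual output format.

```python
# Purely illustrative: facial animation expressed as named rig-control curves.
# The control names and dict-of-lists layout are assumptions for this sketch,
# not MetaHuman Animator's actual output format.
from typing import Dict, List

# One value per frame, per facial rig control. "Semantically correct" means
# the solver writes to named controls like these rather than raw vertices.
AnimCurves = Dict[str, List[float]]

take: AnimCurves = {
    "jawOpen":    [0.00, 0.05, 0.12, 0.18, 0.15, 0.10],
    "browRaiseL": [0.00, 0.02, 0.03, 0.03, 0.02, 0.01],
}

def scale_control(curves: AnimCurves, control: str, factor: float) -> None:
    """An 'artistic adjustment': exaggerate or soften one control's curve."""
    curves[control] = [min(1.0, v * factor) for v in curves[control]]

# Because the transitions are smooth, scaling a whole curve keeps motion clean.
scale_control(take, "jawOpen", 1.25)
print(take["jawOpen"])  # the same curve, exaggerated by 25%
```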

Want to see what’s possible with MetaHuman Animator when you use high-end equipment? 

Introducing Blue Dot, a short film created by Epic Games’ 3Lateral team in collaboration with local Serbian artists, including renowned actor Radivoje Bukvić, who delivers a monologue based on a poem by Mika Antic. The performance was filmed at Take One studio’s mocap stage with cinematographer Ivan Šijak acting as director of photography.

These nuanced results demonstrate the level of fidelity that artists and filmmakers can expect when using MetaHuman Animator with a stereo head-mounted camera system and traditional filmmaking techniques.

The team was able to achieve this impressive level of animation quality with minimal intervention on top of the MetaHuman Animator results.

Facial animation for any MetaHuman

The facial animation you capture using MetaHuman Animator can be applied to any MetaHuman character or any character adopting the new MetaHuman facial description standard in just a few clicks. 

That means you can design your character the way you want, safe in the knowledge that the facial animation applied to it will work. 

To get technical for a minute, that is achievable because Mesh to MetaHuman can now create a MetaHuman Identity from just three frames of video, along with depth data captured using your iPhone or reconstructed using data from your vertical stereo head-mounted camera. 

This personalizes the solver to the actor, enabling MetaHuman Animator to produce animation that works on any MetaHuman character. It can even use the audio to produce convincing tongue animation. 

Use an iPhone for capture

We want to take facial performance capture from something only experts with high-end capture systems can achieve, and turn it into something for all creators.

At its simplest, MetaHuman Animator can be used with just an iPhone (12 or above) and a desktop PC. That’s possible because we’ve updated the Live Link Face iOS app to capture raw video and depth data, which is then ingested directly from the device into Unreal Engine for processing.

You can also use MetaHuman Animator with your existing vertical stereo head-mounted camera system to achieve even greater fidelity. 

Whether you’re using an iPhone or stereo HMC, MetaHuman Animator will improve the speed and ease of use of your capture workflow. This gives you the flexibility to choose the hardware best suited to the requirements of your shoot and the level of visual fidelity you are looking to hit. 

The captured animation data supports timecode, so facial performance animation can easily be aligned with body motion capture and audio to deliver a full character performance.
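Since the alignment is just timecode arithmetic, it is easy to see why this works. Here is a small self-contained Python sketch of the idea, assuming SMPTE-style HH:MM:SS:FF stamps and a shared non-drop frame rate across devices; the helper functions are mine for illustration, not part of any Epic API.

```python
# Align a facial capture with body mocap by comparing SMPTE timecode stamps.
# Assumes non-drop-frame timecode and a shared frame rate across devices;
# these helpers are illustrative only, not part of any Epic API.

def timecode_to_frames(tc: str, fps: int) -> int:
    """Convert 'HH:MM:SS:FF' to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def face_offset_frames(face_start: str, body_start: str, fps: int) -> int:
    """How many frames to shift the facial track to line up with the body."""
    return timecode_to_frames(face_start, fps) - timecode_to_frames(body_start, fps)

# The iPhone facial take started 2 seconds and 10 frames after body capture:
print(face_offset_frames("01:00:02:10", "01:00:00:00", fps=30))  # 70
```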

Perfect for making creative choices on set

MetaHuman Animator is perfectly adapted for creative iteration on set because it enables you to process and transfer facial animation onto any MetaHuman character, fast. 

Need an actor to give you more, dig into a different emotion, or simply explore a new direction? Have them do another take. You’ll be able to review the results in about the time it takes to make a cup of coffee. 

With animation data reviewed right there in Unreal Engine while you’re on the shoot, the quality of the capture can be evaluated well in advance of the final character being animated. 

And because reshoots can take place while the actor is still on stage, you can get the best take in the can there and then, instead of having to absorb the cost and time needed to bring everyone back at a later date.

New Mesh to MetaHuman workflow for custom characters

This release isn’t just about MetaHuman Animator, however. We’ve also expanded Mesh to MetaHuman so you can now directly set the template mesh point positions. 

Mesh to MetaHuman performs a fit that enables it to use any topology, but this necessarily approximates the volume of your input mesh. With this release, you can set the template mesh and, provided it strictly adheres to the MetaHuman topology, you will get exactly that mesh rigged—not an approximation. 

In tandem with DNA Calib’s ability to set the neutral pose for mesh and joints, this empowers experts to quickly zero in on custom characters.
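DNA Calib ships with Python bindings in Epic's MetaHuman-DNA-Calibration repository on GitHub. Here is a rough sketch of the neutral-pose workflow described above; the class and command names are recalled from those bindings and may differ between releases, so check the repo's bundled examples for the current signatures.

```python
# Rough sketch of setting a neutral joint pose with DNA Calib's Python
# bindings (Epic's MetaHuman-DNA-Calibration repo). Class and command names
# are recalled from memory and may differ by release; verify against the
# repo's bundled examples before relying on them.
import dna
import dnacalib as dnac

# Load a MetaHuman .dna rig description.
in_stream = dna.FileStream("MyCharacter.dna",
                           dna.FileStream.AccessMode_Read,
                           dna.FileStream.OpenMode_Binary)
reader = dna.BinaryStreamReader(in_stream, dna.DataLayer_All)
reader.read()

# Wrap the rig so calibration commands can modify it in place.
calibrated = dnac.DNACalibDNAReader(reader)

# Fetch the current neutral joint translations, then edit them to taste.
translations = [reader.getNeutralJointTranslation(i)
                for i in range(reader.getJointCount())]
# ... adjust `translations` to your calibrated neutral pose here ...

command = dnac.SetNeutralJointTranslationsCommand(translations)
command.run(calibrated)

# Write the calibrated rig back out to a new .dna file.
out_stream = dna.FileStream("MyCharacter_calibrated.dna",
                            dna.FileStream.AccessMode_Write,
                            dna.FileStream.OpenMode_Binary)
writer = dna.BinaryStreamWriter(out_stream)
writer.setFrom(calibrated)
writer.write()
```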

Want to learn more? You can now dive into the documentation for this and all other aspects of the MetaHuman framework on our new MetaHuman hub. Located on the Epic Developer Community—the one-stop shop to learn about Epic tools and exchange information with others—it also hosts forums where you can showcase your work or ask questions, and a tutorial section that contains both Epic and user-made tutorial content.

Key Links

Announcement Blog

MetaHuman Creator Browser App

MetaHuman Plugin on Unreal Engine Marketplace

Live Link Face App on the Apple App Store

MetaHuman Learning Hub

You can learn more about the MetaHuman Animator release and see me create a monster in MetaHuman Creator in the video below.
