Epic’s new MetaHuman facial animation tool is poised to transform a traditionally time-consuming and labor-intensive animation workflow. Powered by machine learning, the technology can create strikingly realistic facial animation in just a few minutes from a simple video taken on an iPhone.
The new automatic animation technology expands on Epic’s MetaHuman modeling tool, launched in 2021, which allowed users to manually create highly detailed human models in Unreal Engine 5. Over a million users have since created millions of MetaHumans. Until now, the significant challenge has been animating these models, which required a “4D capture” from specialized hardware and extensive processing time.
The MetaHuman Animator is designed to streamline this process, offering a vastly improved workflow. In a demonstration by Ninja Theory performance artist Melina Juergens, a 15-second on-stage performance was captured on an iPhone and processed on a high-end AMD machine in less than a minute, producing a 3D animation practically indistinguishable from the original video. Minute details, from bared teeth to tiny mouth quivers, are all incorporated into the animation, creating a level of realism previously unattainable in such a short time.
Adding to its versatility, the technology can apply these facial animations not just to the performer’s own MetaHuman model but to any model built on the same MetaHuman standard. This means the same motions and words can be expressed through various characters, even highly stylized cartoon ones, minutes after the performance.
Epic’s MetaHuman Animator is not just about creating hyperrealistic animations quickly; it’s also about empowering creativity. As Epic’s press release suggests, if an actor needs to explore a different emotion or direction, they can simply do another take and review the results in about the time it takes to make a cup of coffee.
The technology also builds upon Epic’s Live Link Face iOS app, launched in 2020, which brought iPhone-based performance capture to Unreal Engine. That capability is now combined with the fine detail of MetaHuman models, and Epic promises even greater fidelity when the tool is used with existing vertical stereo head-mounted camera systems.
The impact of this technology is profound. It has the potential to save studios money by making performance capture more efficient. Furthermore, it allows studios to be more experimental and creative. For example, a studio can now animate a MetaHuman character “in just a few clicks,” and the system can even animate a character’s tongue based on the performance’s audio.
The future of animation is here, and it’s powered by Epic’s new MetaHuman Animator. With its ability to create hyperrealistic animations in minutes, this technology is set to redefine what’s possible in animation and beyond. It is not just a tool for creating video games or films but a creative platform that can inspire new approaches to storytelling, performance, and artistic expression. As it reaches more studios, we can expect to see the animation industry, and potentially other sectors, significantly transformed by this technology.
Picture Credit: Freepik