Originally posted on 23 March 2023 for the preview, and updated with details of the final release.

Epic Games has released MetaHuman Animator, its much-anticipated facial animation and performance capture toolset for its MetaHuman framework.

The system streamlines the process of transferring the facial performance of an actor from footage captured on an iPhone or helmet-mounted camera to a real-time MetaHuman character inside Unreal Engine.

Epic claims that it will "produce the quality of facial animation required by AAA game developers and Hollywood filmmakers, while at the same time being accessible to indie studios and even hobbyists".

The toolset was announced during Epic Games' State of Unreal keynote at GDC 2023 earlier this year, and is now available in the latest version of its free MetaHuman plugin for Unreal Engine.

Part of Epic Games' framework for creating next-gen digital humans for games and animation

MetaHuman Animator is the latest part of Epic Games' MetaHuman framework for creating next-gen 3D characters for use in games and real-time applications – and also, increasingly, in offline animation.

The first part, cloud-based character-creation tool MetaHuman Creator, which enables users to design realistic digital humans by customising preset 3D characters, was released in early access in 2021.

Users can generate new characters by blending between presets, then adjusting the proportions of the face by hand, and customising readymade hairstyles and clothing.

The second part, the MetaHuman plugin for Unreal Engine, was released last year, and makes it possible to create MetaHumans matching 3D scans or facial models created in other DCC apps.

Generates a MetaHuman character matching video footage of an actor

MetaHuman characters have facial rigs, so they already supported facial motion capture, but transferring that motion from video footage of an actor with different facial proportions required manual finessing.

MetaHuman Animator is intended to streamline that retargeting process: a workflow that Epic Games calls Footage to MetaHuman.

As with Mesh to MetaHuman, it generates a MetaHuman matching source data: in this case, video footage of an actor, plus supporting depth data – about which, more later.

A 'teeth pose': one of the standard reference frames the MetaHuman Animator toolset uses to generate a MetaHuman character matching an actor's facial proportions from video footage of that actor.

Works from one to four reference frames of an actor's face

The process begins by ingesting the footage into Unreal Engine, and identifying key reference frames from which the MetaHuman plugin can perform a solve.

On footage captured with a professional camera, only a single frame is necessary: a frontal view of the actor with a neutral facial expression.

With iPhone footage, Epic recommends also identifying left and right views of the actor's face to improve the quality of the solve. A further reference frame showing the actor's exposed teeth improves the quality of mouth animations.

The MetaHuman plugin then solves the footage to conform a template mesh – a MetaHuman head – to the data. Users can wipe between the 3D head and the source image to check the solve.

The template mesh is then used to generate an asset that can be used for animation.