High-fidelity facial animation, the easy way

Every subtle expression, look, and emotion is accurately captured and faithfully replicated on your digital human. Even better, it's simple and straightforward to achieve incredible results: anyone can do it.

If you're new to performance capture, MetaHuman Animator is a convenient way to bring facial animation to your MetaHumans based on real-world performances. And if you already do performance capture, this new feature set will significantly improve your existing capture workflow, reduce time and effort, and give you more creative control. Just pair MetaHuman Animator with your existing vertical stereo head-mounted camera to achieve even greater visual fidelity.

Previously, it would have taken a team of experts months to faithfully recreate every nuance of an actor's performance on a digital character. Now, MetaHuman Animator does the hard work for you in a fraction of the time, and with far less effort.

The new feature set uses a 4D solver to combine video and depth data together with a MetaHuman representation of the performer. The animation is produced locally using GPU hardware, with the final animation available in minutes. That all happens under the hood, though; for you, it's pretty much a case of pointing the camera at the actor and pressing record.

Once captured, MetaHuman Animator accurately reproduces the individuality and nuance of the actor's performance onto any MetaHuman character. What's more, the animation data is semantically correct, using the appropriate rig controls, and temporally consistent, with smooth control transitions, so it's easy to make artistic adjustments if you want to tweak the animation.

Want to see what's possible with MetaHuman Animator when you use high-end equipment? Introducing Blue Dot, a short film created by Epic Games' 3Lateral team in collaboration with local Serbian artists, including renowned actor Radivoje Bukvić, who delivers a monologue based on a poem by Mika Antic. The performance was filmed at Take One studio's mocap stage with cinematographer Ivan Šijak acting as director of photography.

These nuanced results demonstrate the level of fidelity that artists and filmmakers can expect when using MetaHuman Animator with a stereo head-mounted camera system and traditional filmmaking techniques. What's more, the team was able to achieve this impressive level of animation quality with minimal intervention on top of MetaHuman Animator results.

New Mesh to MetaHuman workflow for custom characters

This release isn't just about MetaHuman Animator, however. We've also expanded Mesh to MetaHuman so you can now directly set the template mesh point positions.

Mesh to MetaHuman performs a fit that enables it to use any topology, but this necessarily approximates the volume of your input mesh. With this release, you can set the template mesh and, provided it strictly adheres to the MetaHuman topology, you will get exactly that mesh rigged, not an approximation. In tandem with the DNA Calib ability to set the neutral pose for the mesh and joints, this empowers experts to quickly zero in on custom characters.

Want to learn more? You can now dive into the documentation for this and all other aspects of the MetaHuman framework on our new MetaHuman hub, located on the Epic Developer Community, the one-stop shop to learn about Epic tools and exchange information with others. It also hosts forums where you can showcase your work or ask questions, and a tutorial section that contains both Epic and user-made tutorial content.