His original step was to take many pictures of his characters with an iPhone, in a process known as photogrammetry. But Cory was able to enhance that process by purchasing a portable scanner, the Revopoint MINI, which he used to scan his Xsens character as a test before scanning Skeletor.

Next, Cory used his new secret weapon: ActorCore's free AccuRIG tool, which imports the scanned FBX character and automatically rigs it for full-body and finger animation. The process takes about 15 minutes and works well for all kinds of static poses and well-known character rigs. The rigged character can then be exported as FBX, and you can even test for and correct mesh deformations caused by motion stretching.

Cory then imported his newly AccuRigged character into Character Creator 4 to test it and add the custom dance animations he captured with his mocap suit, including individual finger tests, since AccuRIG rigs fingers as well. Inside Character Creator you can even characterize non-standard characters, which lets you drive them with any motion you record through your mocap suit.

Once the character is ready, Cory starts production by recording all the voices and animations for each character. Amazingly, he does this all by himself, which gives him a clear mental picture of each character's gestures, nuances, and performances.

When the custom animations are ready, they are brought into iClone 8 along with the custom-rigged characters. There is a reason iClone is used here instead of going straight into Unreal Engine: offset motions are much easier to correct in iClone than in Unreal. iClone is especially useful when you want that high-quality feel in your performances and need to make small edits, such as adjusting hand gestures in specific timeframes. For example, you can easily adjust any facial expression or lip-sync on Skeletor (if you are not using a face mocap device), and if the Mr. Yaki character needs to look up because his eyeline misses He-Man's face, iClone lets you correct this by editing the specific motion tracks for the face, eyes, hands, and fingers.

Finally, Cory brings everything into Unreal Engine to set up his shots: he creates a level sequence and synchronizes everything with his audio waveforms, including positions and animations, cameras, lighting, and any special effects. Done!
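If you prefer to script the shot setup rather than click through the Sequencer UI, Unreal's editor scripting can make the level-sequence step repeatable. Below is a minimal sketch, assuming the Python Editor Script Plugin is enabled; the asset name, path, frame rate, and shot length are placeholders, not values from Cory's project.

```python
import unreal

# Create a new Level Sequence asset for the shot.
asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
sequence = asset_tools.create_asset(
    asset_name="SEQ_Shot010",          # placeholder name
    package_path="/Game/Cinematics",   # placeholder folder
    asset_class=unreal.LevelSequence,
    factory=unreal.LevelSequenceFactoryNew(),
)

# Match the project's frame rate and give the shot a working range
# (24 fps and a 10-second shot here, purely as an example).
sequence.set_display_rate(unreal.FrameRate(numerator=24, denominator=1))
sequence.set_playback_start(0)
sequence.set_playback_end(240)

# Bind the selected level actor (e.g. the animated character) so its
# animation can be keyed against the same timeline as audio and cameras.
actors = unreal.EditorLevelLibrary.get_selected_level_actors()
if actors:
    binding = sequence.add_possessable(actors[0])
    print(f"Bound {actors[0].get_name()} to {sequence.get_name()}")
```

The same binding step applies to cameras and lights, which is what keeps every element of the shot synchronized on one timeline, as described above.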
Before we jump into motion capture, we should block out our scene, or already have the final scene where we're going to place our animation. It's good to have a plan for what you are going to capture and how the environment will interact with it. We made a simple scene with a few things in the background and our main props: a chair, a table, a few lights, and a laptop.

The next thing on the list is to set up our MetaHuman Blueprint so it can receive real-time animation data. Prior to jumping into iClone, be sure to download and install the necessary plugins: Reallusion provides the LiveLink plugin and Dummy models, which are retargeted MetaHuman bodies. Just copy the Dummy files into your iClone content folders, and copy the LiveLink folders into the Plugins folder inside your Unreal project (see the sketch at the end of this section).

During each motion capture session we record a few takes so we can pick the best one for the job, and we use Rokoko to post-process the data and get a much cleaner export to iClone. Once that's done, we are ready to animate. There is a step-by-step tutorial on the official Reallusion YouTube channel, so be sure to check that out.
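As a companion to the plugin step above, here is a small sketch of the folder copy in Python. The source and project paths are hypothetical placeholders; the one real convention it relies on is that Unreal Engine discovers plugins placed under a project's Plugins folder.

```python
import shutil
from pathlib import Path

# Hypothetical paths: adjust to wherever you extracted the Reallusion
# LiveLink files and wherever your Unreal project lives.
livelink_src = Path(r"C:\Downloads\Reallusion\LiveLink")
project_dir = Path(r"C:\UnrealProjects\MetaHumanShort")

# Unreal scans <project>/Plugins on startup, so dropping the folder
# there is enough for the editor to find the plugin.
dest = project_dir / "Plugins" / livelink_src.name
dest.parent.mkdir(parents=True, exist_ok=True)
shutil.copytree(livelink_src, dest, dirs_exist_ok=True)
print(f"Copied LiveLink plugin to {dest}")
```

After restarting the editor, check Edit > Plugins and enable the plugin if it is not already active.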