This week we looked into combining the MoCap data we collected in week 1 with HumanIK rigs in Maya.
In a new Maya scene I imported the first take of the MoCap data, which brings in the skeleton along with the movements and actions recorded in the studio by the performer wearing the MoCap suit.
In the Time Editor I imported the motion capture animation as a clip by selecting the hips of the skeleton. The Time Editor makes it easy to import different motion capture clips and combine them together.
Before adding the second clip I had to define the skeleton in Maya as a HumanIK character, so that a rig could be attached to it.
I created a character definition, pairing each joint of the skeleton in the viewport with its corresponding slot in the definition.
Once the pairing was done and every joint turned green, I added a rig. To bind the rig to the skeleton, the character we matched and defined earlier has to be set as the source, so that the rig follows the animation.
I then went back to the Time Editor to edit the animation further: you can trim a clip, or make it play faster or slower, just like in a video editor.
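Conceptually, trimming and speed changes are just a remapping of the clip's key times. A minimal sketch of the idea, assuming keys stored as a frame-to-value dictionary (the function names here are illustrative, not Maya's API):

```python
def retime(keys, scale):
    """Scale key times: scale < 1 plays the clip faster, scale > 1 slower."""
    return {round(frame * scale): value for frame, value in keys.items()}

def trim(keys, start, end):
    """Keep only the keys inside the [start, end] frame range."""
    return {frame: value for frame, value in keys.items() if start <= frame <= end}

clip = {0: 0.0, 12: 1.0, 24: 0.0}
half_speed = retime(clip, 2)   # same motion spread over twice the frames
trimmed = trim(clip, 0, 12)    # only the first half of the clip remains
```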
You can even add a second animation clip and merge the two together, as long as both are applied to the same MoCap skeleton we imported.
To match the two animations you can align the clips using the matching options in the Relocator menu. Selecting the initial MoCap skeleton, I matched the location of the right foot at the last frame of the first clip to the location of the right foot at the first frame of the second clip.
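What the matching does, conceptually, is compute the offset between the foot's position at the end of the first clip and its position at the start of the second, then shift the whole second clip by that offset. A rough sketch of that calculation (illustrative positions, not the Time Editor's actual code):

```python
def match_offset(end_of_first, start_of_second):
    """Translation that lines the second clip's first frame up with the
    first clip's last frame."""
    return tuple(a - b for a, b in zip(end_of_first, start_of_second))

def apply_offset(position, offset):
    """Shift a position (e.g. every key of the second clip) by the offset."""
    return tuple(p + o for p, o in zip(position, offset))

# right-foot world positions at the two frames being matched (made-up numbers)
foot_end_clip1 = (4.0, 0.0, 2.5)
foot_start_clip2 = (0.0, 0.0, 0.0)

offset = match_offset(foot_end_clip1, foot_start_clip2)
shifted = apply_offset(foot_start_clip2, offset)  # now equals foot_end_clip1
```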
This is the outcome:
By grabbing one of the clips and dragging it on top of the other, the two clips are merged seamlessly, making the transition smoother.
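Overlapping the clips this way creates a region where the two animations are blended: the smoother transition comes from a crossfade over the overlapping frames, roughly like this (a conceptual sketch, not Maya's implementation):

```python
def crossfade(a, b, frames):
    """Linearly blend from value a to value b over the given number of frames."""
    if frames <= 1:
        return [b]
    return [a + (b - a) * i / (frames - 1) for i in range(frames)]

# blend a pose value at the end of clip A into the matching value of clip B
blend = crossfade(1.0, 0.0, 5)   # [1.0, 0.75, 0.5, 0.25, 0.0]
```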
Afterwards I created a new character definition for the same character, but using the rig without controls, doing the same thing I did with the MoCap skeleton but applying it to the new rig.
The character definition works best when the character is in a T-pose, so I aligned the arms of the character.
I then matched the character's rig just as I did for the MoCap rig. I also created a control rig for the IK, and by adding the Character 1 rig as the source, the MoCap animation is applied to this new rig.
I then baked the motion capture onto the character's rig, so that I was no longer bound to the Character 1 skeleton and could edit the MoCap animation on a separate layer to adjust it. Baking is a term used widelyly in the 3D community and can refer to many different processes; what it generally means is freezing and recording the result of a computed process. It is used in everything from animation, to simulation, to texturing 3D models and much more. Baking a simulation allows you to generate a single animation curve for an object whose actions are being provided by the simulation rather than by keys and animation curves (keysets).
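The idea of baking can be illustrated with a tiny example: a value that is computed procedurally on demand gets sampled once per frame and frozen into explicit keys, after which the original process is no longer needed (a conceptual sketch, not Maya's actual Bake Simulation command):

```python
import math

def simulated_height(frame):
    """Stand-in for a live process that computes a value on demand each frame."""
    return math.sin(frame * 0.1)

def bake(process, start, end):
    """Freeze the process into explicit per-frame keys: once baked, the keys
    can be edited directly and the process never has to run again."""
    return {frame: process(frame) for frame in range(start, end + 1)}

keys = bake(simulated_height, 1, 24)   # 24 frozen keyframes
```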
For a more complicated rig, on the other hand (one with finger animation or facial expressions, for instance), the process of adding the MoCap animation onto it is different, because this rig is not recognized in Maya as HumanIK. As with the previous rig, I first put it in a T-pose by aligning the arms.
I then created a new character definition, selecting the rig and identifying the skeleton.
I then created a custom rig mapping, selecting the controls and afterwards applying the animation onto it as well. Custom rig mapping lets your character source animation streamed from MotionBuilder or from a local HumanIK character within Maya.