CAPTURING & CLEANING UP MOCAP DATA


At the beginning of the second week, John Aberdein (Sr. Tutor, School of Design) and Sean Pickersgill set up 10 more cameras in the studio, bringing the total to 22. More cameras means more accurate tracking of a performance, which gives us better, more realistic human movement and speeds up the clean-up process. In previous scenes we ran into a common problem, marker occlusion, where the cameras lose track of markers and the captured skeletal data becomes distorted. With the additional cameras, hopefully we will not experience marker occlusion as much, except in scenes where it is unavoidable, e.g. scenes where actors must lie or sit down.
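To illustrate what occlusion does to the data: while a marker is hidden, those frames have no position samples, and the solved skeleton distorts around the gap. Part of clean-up is interpolating across those missing frames. Below is a toy Python sketch of that idea with made-up marker data; it is not the software we actually use, just the underlying concept.

```python
# Toy illustration of filling an occlusion gap by linear interpolation.
# The marker array below is made-up example data, not real capture output.
import numpy as np

def fill_gaps(positions):
    """positions: (frames, 3) array with NaN rows where the marker was occluded."""
    filled = positions.copy()
    frames = np.arange(len(filled))
    for axis in range(3):
        channel = filled[:, axis]            # view into the copy, edited in place
        missing = np.isnan(channel)
        # Interpolate each coordinate across the visible frames.
        channel[missing] = np.interp(frames[missing],
                                     frames[~missing],
                                     channel[~missing])
    return filled

# Example: a marker occluded on frames 2-3 while the actor sits down.
marker = np.array([
    [0.0, 1.0, 0.0],
    [0.1, 0.9, 0.0],
    [np.nan, np.nan, np.nan],
    [np.nan, np.nan, np.nan],
    [0.4, 0.6, 0.0],
])
print(fill_gaps(marker))
```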

With the increased accuracy of the camera tracking, we took up the challenge of capturing finger movements as well. We figure the extra movement in the hands can help express more emotion in the performance of each scene.

For most of the second week, Sean and I have been focusing on cleaning up the motion capture data. The clean-up process removes anomalies in the animation, such as gaps from marker occlusion and jittery movements. We import the raw mocap data into Maya and use HumanIK to retarget it onto the character mesh. This gives us a clear picture of how well the mocap animation plays and shows us which animations need to be fixed or readjusted. After the clean-up, the animation is exported out of Maya and imported into Unreal Engine.

Next week our challenge is to import all of the motion capture data into Unreal, retarget it to the character mesh there, and edit all of the animations into a single clip. This is important because it gives us a bigger picture of the animation quality inside Unreal.
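For anyone curious what the Maya side of this pipeline looks like in script form, here is a minimal, hypothetical sketch of the import, bake, and export steps, assuming the stock FBX plug-in. The file paths are placeholders, and the HumanIK retarget itself is something we do interactively through the Character Controls window rather than by script.

```python
# A hedged sketch of the Maya import -> bake -> export steps.
# Paths are hypothetical; the HumanIK retarget happens interactively between them.
import maya.cmds as cmds
import maya.mel as mel

RAW_TAKE = "D:/mocap/raw/scene04_take02.fbx"     # hypothetical path
CLEAN_OUT = "D:/mocap/clean/scene04_take02.fbx"  # hypothetical path

# Make sure the FBX import/export commands are available.
cmds.loadPlugin("fbxmaya", quiet=True)

# Import the raw mocap take.
mel.eval('FBXImport -f "{}"'.format(RAW_TAKE))

# ... retarget onto the character rig with HumanIK (done interactively) ...

# Bake the retargeted animation onto the skeleton so the export is self-contained.
joints = cmds.ls(type="joint")
start = cmds.playbackOptions(query=True, minTime=True)
end = cmds.playbackOptions(query=True, maxTime=True)
cmds.bakeResults(joints, time=(start, end), simulation=True)

# Export the cleaned clip for Unreal (-s exports only the selection).
cmds.select(joints, replace=True)
mel.eval('FBXExport -f "{}" -s'.format(CLEAN_OUT))
```

And a similarly hedged sketch of bringing the exported clip into Unreal through its editor Python API, assuming a hypothetical project skeleton asset and content path:

```python
# A hedged sketch of importing the cleaned FBX as an animation in Unreal.
# The skeleton asset, file path, and destination folder are all hypothetical.
import unreal

options = unreal.FbxImportUI()
options.import_mesh = False
options.import_animations = True
options.mesh_type_to_import = unreal.FBXImportType.FBXIT_ANIMATION
options.skeleton = unreal.load_asset("/Game/Characters/Hero_Skeleton")  # hypothetical

task = unreal.AssetImportTask()
task.filename = "D:/mocap/clean/scene04_take02.fbx"  # hypothetical
task.destination_path = "/Game/Mocap/Scene04"        # hypothetical
task.automated = True
task.options = options

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```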

Kristina Huang