Research on MOCAP

These days I have been researching motion capture systems to record animations for my character. As I have an impressive budget of €0 and little space at my disposal, I have looked for options that are as economical as possible.

My idea was to capture animations to continue developing my character; among others, I need animations for walking, running, crouching, pressing buttons and crafting inventory items. However, I am now in a very early phase of development, the prototype, which means I want to get a playable base to evaluate whether my idea is fun to play, and I have realized that this is not the best time to complicate the development.

So I have decided that, for now, I am going to buy animation assets to create the prototype of the game.

However, I have come across some interesting motion capture solutions along the way.

There are suits such as Rokoko or Shadow that capture motion directly in Unreal using their respective Live Link plugins. These solutions are very precise and seem to work well, but they are also expensive: for example, a Rokoko game development suit with face and finger capture costs around $4,000. I cannot reach that budget, so I have not been able to test these systems.

Rokoko suit, image from newatlas.com

There are also machine-learning-based cloud solutions such as Plask or Deepmotion. You pay per use or a monthly fee; each system has its own rules, but they generally work as follows: you record a video of the movements you want to capture, upload it, and the cloud system extracts the motion from it. I tried Plask.ai and the result was this:

MOCAP PlaskAI

However, Plask has its limitations and does not capture finger movement (Deepmotion does, if you pay extra).

The animation is then delivered as an FBX file, which has to be edited afterwards to make the movement loop cleanly, among other adjustments. These systems did not convince me because the result needs a lot of extra work, and that is not what interests me most at the moment.
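To give an idea of the kind of cleanup involved, here is a minimal sketch (not a workflow I have actually used) of how the exported FBX could be made to loop in Blender's Python API, by forcing the last keyframe of each animation curve to match the first one. The file name is just a placeholder.

```python
# Minimal sketch: load a mocap FBX in Blender and snap the last keyframe of each
# animation curve to the value of the first, so the clip loops without a visible pop.
# Assumes Blender's bundled Python API (bpy); "walk_capture.fbx" is a placeholder name.
import bpy

bpy.ops.import_scene.fbx(filepath="walk_capture.fbx")

# After import, the armature is normally the active object; adjust if your scene differs.
armature = bpy.context.object
action = armature.animation_data.action

for fcurve in action.fcurves:
    keys = fcurve.keyframe_points
    if len(keys) < 2:
        continue
    # Copy the first pose value onto the last keyframe of this curve.
    keys[-1].co.y = keys[0].co.y
    fcurve.update()
```

Even after something like this, the raw capture still needs noise cleanup and retargeting onto the game character, which is exactly the extra work that put me off.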

Another system that I liked and will keep an eye on is Pose AI. It is an iOS app that uses a machine-learning model to detect movement through the camera in real time. It can be connected directly to Unreal Engine with its plugin, and animations can be recorded with the Take Recorder tool.

MOCAP PoseAI

Pose AI is still in development, and for now it is not ideal for recording movements since it lacks precision with some motions, such as walking while crouched. But they keep improving its reliability and accuracy, and it may be interesting in the future.

I have also investigated motion capture using a Kinect. I did not carry this experiment through to the end, since the Kinect is not very precise; for example, it does not capture finger movement in any detail.

MOCAP Kinect
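For reference on why the finger data is missing: the Kinect v2 skeleton only exposes a few hand-related joints per side (hand, hand tip, thumb), with no individual fingers. A minimal sketch of reading those joints with the community pykinect2 library (which I have not used for this project) could look like this:

```python
# Minimal sketch (untested here): reading Kinect v2 body joints with the pykinect2
# library on Windows with the Kinect SDK 2.0 installed.
# The skeleton only exposes HandRight / HandTipRight / ThumbRight on each side,
# not individual fingers, which is why finger animation cannot be captured this way.
import time
from pykinect2 import PyKinectRuntime, PyKinectV2

kinect = PyKinectRuntime.PyKinectRuntime(PyKinectV2.FrameSourceTypes_Body)

while True:
    if kinect.has_new_body_frame():
        frame = kinect.get_last_body_frame()
        if frame is not None:
            for i in range(kinect.max_body_count):
                body = frame.bodies[i]
                if not body.is_tracked:
                    continue
                hand = body.joints[PyKinectV2.JointType_HandRight].Position
                tip = body.joints[PyKinectV2.JointType_HandTipRight].Position
                print("hand:", hand.x, hand.y, hand.z, "tip:", tip.x, tip.y, tip.z)
    time.sleep(0.01)  # simple polling loop
```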

The next step is to research facial animation systems to get an idea of what I will need in the future to develop this game.