LiveLink Facecap to Maya

So, you have LiveLink providing you with data. It's in Unreal. It's good looking. But it's also not quite perfect, so let's touch it up.

Personally, I prefer the animation tools in Maya over Unreal's, so I'd rather clean up the mocap-lite data there. Unfortunately, getting the data out isn't the easiest thing, and it does require a bit of elbow grease.

[Video: MacBeth Test - No Mocap Edit]

So far, I've found that recording to a new animation is the best way to handle these captures. It's done by opening the MetaHuman animation Blueprint in the engine, turning on LiveLink, and hitting record. I've found it's a bit nicer than Take Recorder for exporting as FBX, but we do lose audio. All in all, that's not a deal breaker for me; you can record audio locally on the phone for synchronisation anyway.

Once the animation is recorded, you can set it to play back like any other animation in Sequencer. That's how the video above was created.

Tip for LiveLink: Make sure you are well-lit. Pointing lights at my face definitely helped the stability of the recording.

As with any asset in Unreal, you can export it back out of the engine. When you do this with one of these assets, you get the ARKit blend controllers as animation channels under the root node of the skeleton.
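If you want to sanity-check what actually came through, a quick script can list those channels and their key counts after importing the FBX into Maya. This is a minimal sketch in Maya Python; the root joint name "root" is an assumption, so swap in whatever your skeleton actually uses.

```python
# Minimal sketch: list the ARKit blend-shape channels that the FBX export
# leaves as custom attributes on the skeleton root. "root" is an assumed name.
import maya.cmds as cmds

ROOT = "root"  # assumed name of the root joint carrying the ARKit curves

# User-defined keyable attributes on the root are the ARKit channels
# (eyeBlinkLeft, jawOpen, and so on).
channels = cmds.listAttr(ROOT, userDefined=True, keyable=True) or []

for ch in channels:
    plug = "{}.{}".format(ROOT, ch)
    key_count = cmds.keyframe(plug, query=True, keyframeCount=True) or 0
    print("{:<28} {} keys".format(ch, key_count))
```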

To be clear, this is far from ideal: the channels don't automatically hook up to the Bridge-to-Maya files, and there's no script for that yet. That said, you can take each channel and paste its keys onto the control rig in Maya. It's a very manual process, and one or two of the controls don't match up perfectly, but it does work.
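If the copy-paste gets tedious, the same transfer can be scripted with copyKey/pasteKey. The sketch below is hedged: the channel-to-control mapping is entirely hypothetical and needs to be filled in (and value ranges adjusted) for your particular rig.

```python
# Rough sketch: transfer keys from the exported ARKit channels onto rig controls.
# The mapping and all names here are hypothetical placeholders.
import maya.cmds as cmds

ROOT = "root"  # assumed root joint carrying the exported ARKit curves

# Hypothetical mapping: ARKit channel -> (control, attribute) on the face rig.
CHANNEL_TO_CONTROL = {
    "jawOpen":      ("CTRL_C_jaw", "translateY"),
    "eyeBlinkLeft": ("CTRL_L_eye_blink", "translateY"),
    # ... fill in the remaining ARKit channels by hand ...
}

for channel, (control, attr) in CHANNEL_TO_CONTROL.items():
    src = "{}.{}".format(ROOT, channel)
    dst = "{}.{}".format(control, attr)
    if not cmds.objExists(src) or not cmds.objExists(dst):
        cmds.warning("Skipping {} -> {}".format(src, dst))
        continue
    # Copy every key on the source channel and paste onto the control.
    # Note: some controls may need their values remapped, not just copied.
    if cmds.copyKey(src):
        cmds.pasteKey(dst, option="replaceCompletely")
```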

My next step was to create a new override animation layer, set its weight to 0, and start keying the controls. This transfers the pose from the dense base animation layer onto a cleaner layer. Remember, LiveLink data is recorded at 30fps by default, so there are 30 keys on every control for every second of capture. That's needlessly large and unwieldy, so we can non-destructively rekey to something more manageable. This technique is discussed in The Best Animation Tricks of the Trade (for 2016).
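Here's roughly what that looks like scripted in Maya Python. It's a sketch under assumptions: the control naming pattern and the key spacing are placeholders, and how far apart you key depends on how much detail you want to keep.

```python
# Sketch of the rekeying step: an override layer at weight 0, keyed at a sparse
# interval so each key captures the evaluated pose from the dense base animation.
import maya.cmds as cmds

CONTROLS = cmds.ls("CTRL_*", type="transform")  # assumed face-control naming
STEP = 4  # keep a key every 4 frames instead of every frame

# Create the override layer at zero weight so keying it samples the base result.
layer = cmds.animLayer("faceCleanup", override=True)
cmds.animLayer(layer, edit=True, weight=0)

cmds.select(CONTROLS, replace=True)
cmds.animLayer(layer, edit=True, addSelectedObjects=True)

start = int(cmds.playbackOptions(query=True, minTime=True))
end = int(cmds.playbackOptions(query=True, maxTime=True))

for frame in range(start, end + 1, STEP):
    cmds.setKeyframe(CONTROLS, animLayer=layer, time=frame)

# With the sparse keys down, bring the layer weight back up so the cleaner
# layer overrides the dense base curves.
cmds.animLayer(layer, edit=True, weight=1)
```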

Because we're transferring the keys onto the controls, we can go back and manually animate the finishing touches later. We can also attach the animation to a different MetaHuman, as seen below.

[Video: the same animation applied to a different MetaHuman]