Man V Metahuman

This was a project I started late in 2022. I'd recently completed a film studies module, which made me more familiar with set lighting and operations, and I'd bought myself a Rokoko Smartsuit, so I thought I'd set up a greenscreen in my lounge room and shoot something. I wanted to step up from my previous fight with myself, so I wrote a bit of a lead-in with some help from my colleague Nat Texler, and I designed the entire fight around things I wanted to try to do.

Every beat of the fight was a question of whether I could make it work or look good: can I hit a CG-replaced box? Can I land my mocap character on that CG box? Can I get my Metahuman character to throw me? Some of these scenarios work, some don't, but it was an enlightening experience. The mocap was recorded before I'd picked up iClone, so all the cleanup was done in Unreal on the Control Rig. It's messy, and eventually I honestly gave up on getting it any cleaner because it was so arduous.

It was also my first time working with sugar glass. I've been doing stunt acting training for quite a while, but it was something I just hadn't run into yet. So I worked from a recipe I found online and recruited my friend Sam to stand in as a real-world extension of my Metahuman character and break a glass over my head. Actually making the glasses proved to be quite tricky: I needed to make a silicone mold of a shot glass and pour the molten sugar mixture into it to get something that was passable on camera.

Since all of the shooting was done on a tripod, the focus is just terrible in a number of places. The number one thing I needed was a DP and someone to help manage the shooting itself, because so many of the issues stem from problems with filming the live action portion. I did what I could in the edit, but there's only so much post can fix. I had originally planned to use my Ronin and iPhone tracking to get a little movement, which might also have fixed my focus problem, but I was having trouble getting my live comp capture into Unreal working (hooray for $20 USB capture cards), so I thought it safer not to attempt it if I couldn't see the output.

Trying to match everything up in Unreal also proved to be a frustrating nightmare. Feeding a composite straight into Sequencer would crash the engine constantly, so I used a composite plane instead, which meant lining it up on the camera by hand, and I ran into a lot of weird bugs. As a result, much of the composite doesn't line up very well and the perspective is wrong.

In order to layer the Metahuman character both behind and in front of the live action one, I had to render him out on his own layer. I had trouble doing this with stencil layers in Unreal (the alpha was exporting distorted), so I opted instead to fake a greenscreen in-engine. This was easy to do, but it has the side effect of making him look out of place and comped-in in a number of shots.
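Faking a greenscreen in-engine amounts to a basic chroma key: render the character on a solid green background, then treat any green-dominant pixel as transparent. A minimal sketch of that keying idea in plain Python (toy pixel data, my own function names, not the actual Unreal compositing pipeline):

```python
# Toy chroma-key: pixels where green clearly dominates red and blue
# are keyed out (alpha 0); everything else stays fully opaque.
# This illustrates the fake-greenscreen idea, not Unreal's compositor.

def chroma_key(pixels, threshold=60):
    """pixels: list of (r, g, b) tuples; returns list of (r, g, b, a)."""
    keyed = []
    for r, g, b in pixels:
        if g - max(r, b) > threshold:
            keyed.append((r, g, b, 0))    # background: transparent
        else:
            keyed.append((r, g, b, 255))  # character: opaque
    return keyed

frame = [(0, 255, 0), (200, 60, 50), (10, 250, 20), (90, 80, 200)]
print(chroma_key(frame))
```

The hard on/off alpha here is part of why a keyed character can read as "comped in": a proper keyer feathers the matte edge and suppresses green spill rather than cutting a binary mask.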

The background characters are a combination of Metahuman presets wearing clothing I either made myself (mostly in Substance Painter) or bought on the Marketplace, plus some other characters bought on the Marketplace.

The advantage of the Metahumans is that I was able to give them facial animation, but I had to work out a really weird workflow to do it, as recording face animation is really poorly documented. There's a lot on capturing it, but very little on recording it. I ended up doing a capture in another scene, then taking that subsequence and saving it out as an animation I could apply to face rigs in Sequencer (thanks to @Feeding_Wolves on YouTube, the only person who seemed to have any resources on doing that when I looked).

One thing I am pretty proud of is the sound, given I'd never done that before either. I really mismanaged my recordings, so I ended up having to use the in-camera sound for the live action character and re-record the Metahuman character on my Blue Yeti, but once I got it all into Reaper and placed markers at key timecodes, putting in the effects and music proved pretty easy. I've since noticed that not all the sound lines up 100%, so I know I need to be more diligent about that in future. The music was initially all written in Reaper, but I ended up recording the guitar part live, as it sounded much better than a MIDI guitar.
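Lining clips up against markers at key timecodes boils down to converting each marker's timecode into a common unit (frames or seconds) and shifting by the difference. A small sketch of that arithmetic, assuming a fixed frame rate and `HH:MM:SS:FF` timecodes (the helper names are mine, not Reaper's API):

```python
# Timecode alignment sketch: convert 'HH:MM:SS:FF' markers to frames,
# then compute how far to slide one clip so its marker lands on the
# other's. Assumes a fixed, non-drop frame rate.

def timecode_to_frames(tc, fps=25):
    """Convert an 'HH:MM:SS:FF' timecode to a total frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def offset_seconds(tc_a, tc_b, fps=25):
    """Seconds to shift clip B so its marker aligns with clip A's."""
    return (timecode_to_frames(tc_a, fps) - timecode_to_frames(tc_b, fps)) / fps

# Hypothetical example: the same hit lands at 00:01:10:12 in the edit
# but at 00:00:02:12 in a separately recorded dialogue take.
print(offset_seconds("00:01:10:12", "00:00:02:12"))  # 68.0
```

Working in whole frames until the final division avoids the rounding slippage that, done sloppily, leaves sound a frame or two out of sync.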

Overall it turned out better than I'd expected, but the problems didn't all crop up where I'd originally anticipated. It was full of surprises, and there's a lot I would do differently next time.
