
Green Screen and Mocap Experiment WIP - Man V MetaHuman

To enable some virtual production experiments at home, I set up a green screen space. It's basic, but good enough for simple tests. Having completed the intro to cinematography unit at work, I can now borrow and operate film gear; I opted to shoot on my personal a6400 to keep things simple, but borrowed a few lights and some sound recording equipment.

I've decided to revisit a little idea I had during lockdown, which I posted to my Instagram: filming a fight against myself, but this time with mocap and an Unreal scene. I have a Rokoko motion capture suit, so I could record that at home too, but it was a real challenge to choreograph and shoot, and I'm having to make a lot of adjustments to the animation to fix timing and angles.


I designed the choreography around technical tests I wanted to do; every beat is something I wanted to try, or a reversal of that theme, including making and working with sugar glass, which I hadn't done before. That took a few attempts (though at least the failures left me with plenty of hard candy and caramel for a while). In the end I had to make a silicone mould to get anything usable at all, and it still came out very brown; there are better ingredients for avoiding that, but they were time and cost prohibitive.

I got some help with the setup dialogue for the scene from my friend and colleague Nat Texler, and got my neighbour Sam to break the glass over my head (which they enjoyed doing a little too much), but otherwise ran the whole thing solo. It was always going to be extremely rough as a result, but the goal is to learn and maybe end up with something interesting for a couple of seconds of my stunt reel.


I had a really terrible time shooting like that, and an even worse time compositing it into Unreal.

My first attempt at a composite using the Composure system in Unreal was a complete failure. My lighting on the green screen had been less than ideal, so pulling a clean key was next to impossible, and for some reason working on it in Sequencer would constantly crash Unreal. I shifted tactics: I did the composite first in DaVinci Resolve, then brought the EXR sequence into Unreal and put it onto a media plate in the scene. The colour space got messed up along the way, but it only needs to serve as a guide for getting the UE scene and animation right.


This proved much more reliable, but I hadn't tracked the camera when I was shooting. I only have access to an iPhone for tracking at home, and given I wasn't going to be moving the camera (and my capture card was playing up, so I couldn't live comp anyway), I didn't see much benefit. I was very wrong.


My current hurdle is lining up my in-engine camera with the real camera's position in space so it matches the footage I shot (thankfully I did at least record my focal lengths and match my camera settings in UE). Once that's done I can get back to refining the background environment and characters, and then clean up the animation to export for a final comp in DaVinci.
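For anyone curious what the camera-matching step looks like when scripted rather than set by hand in the Details panel, here's a minimal sketch using the editor's Python console. The filmback numbers are the a6400's APS-C sensor dimensions; the spawn location, rotation and the 30mm focal length are just placeholders standing in for the values I noted on set.

```python
import unreal

# Spawn a CineCameraActor to stand in for the real camera
# (placeholder location/rotation; the real values come from lining it up against the plate).
cam = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor,
    unreal.Vector(0.0, 0.0, 160.0),
    unreal.Rotator(0.0, 0.0, 0.0),
)

cine = cam.get_cine_camera_component()

# Match the filmback to the a6400's APS-C sensor so the field of view
# works out the same as the real lens/sensor combination.
filmback = cine.get_editor_property("filmback")
filmback.sensor_width = 23.5   # mm
filmback.sensor_height = 15.6  # mm
cine.set_editor_property("filmback", filmback)

# Placeholder focal length; swap in whatever was recorded for each shot.
cine.set_editor_property("current_focal_length", 30.0)
```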

