Live green screen composite development 

In terms of technical setup, the project is being developed using Unreal Engine alongside the Blackmagic Ultimatte 12 keyer and DeckLink card to enable live compositing.

My initial experiments have focused on creating MVPs (minimum viable products) to test the pipeline and establish functionality, using the Hooden Horses and simple virtual environment backgrounds as source material. Early challenges came from Unreal Engine versions prior to 5.6 not stably supporting DeckLink live capture, but with this release I now have a much more stable platform to build upon.

Roadcase with the Ultimatte 12 Mini and Blackmagic Video Assists for capture – note the porting and routing options at the bottom for ease of operation.

I have tested a variety of combinations, including the use of media plates and letting the external hardware keyer handle the chroma keying rather than relying on Unreal’s in-engine solutions. I’ve also adapted my green screen rig – housed within a flight case – to capture the key, the fill, and the main video feed simultaneously. This allows for recompositing in post-production to produce higher quality media plates rather than relying solely on live output.
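The recompositing step above comes down to two operations: deriving a matte (the key) from the green screen footage, and then combining the fill with a background using that matte. As a rough illustration of the maths involved – not the Ultimatte's actual algorithm, and with an entirely illustrative green-dominance matte and arbitrary steepness value – a single-pixel sketch in Python might look like this:

```python
# Illustrative sketch of keying and compositing on single RGB pixels
# (channel values 0.0-1.0). The matte function and its steepness factor
# are assumptions for demonstration, not the hardware keyer's method.

def green_matte(pixel):
    """Rough chroma matte: 1.0 = fully foreground, 0.0 = green screen,
    based on how strongly green dominates red and blue."""
    r, g, b = pixel
    spill = g - max(r, b)                           # green dominance
    return 1.0 - max(0.0, min(1.0, spill * 4.0))    # 4.0 is arbitrary

def composite(fill, alpha, background):
    """Straight-alpha 'over' operation: fill over background."""
    return tuple(f * alpha + bg * (1.0 - alpha)
                 for f, bg in zip(fill, background))

# A saturated green-screen pixel keys out and the background shows through;
# a subject pixel keeps its fill colour.
screen_px = (0.1, 0.9, 0.1)
subject_px = (0.8, 0.6, 0.5)
bg_px = (0.2, 0.2, 0.6)

print(composite(screen_px, green_matte(screen_px), bg_px))    # background
print(composite(subject_px, green_matte(subject_px), bg_px))  # subject
```

Capturing key, fill, and main feed as separate recordings means this combination can be redone in post with better mattes, rather than being locked to whatever the live composite produced.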

At this stage I’ve been testing functionality using a small green screen in a limited space, but the full studio facilities at Screen South will give far superior results and greater flexibility for capture and compositing workflows. This setup also opens up the option of extracting motion capture data from performances to drive Unreal’s MetaHuman characters at a later stage, adding another layer of experimentation for future development.