ICVFX Stage and Rig Build-out

📅 Timeline: July - October 2024
🏢 Company: Steady Now Productions

Project Overview

I was tasked with researching and implementing a solution to transform our existing green screen stage into a state-of-the-art virtual production facility. Working with a limited budget, I needed to identify cost-effective technologies that could deliver professional results for live broadcast productions.

This project involved extensive research into camera tracking systems, virtual set software, and hardware integration to create a seamless In-Camera Visual Effects (ICVFX) pipeline that would allow for real-time compositing of talent with virtual environments.

Tags: Camera Tracking · Unreal Engine · Aximmetry · RETracker · Virtual Production · ICVFX

The Challenge

Converting a traditional green screen stage into a virtual production environment presented several challenges: the system had to fit a limited budget, hold up to the reliability demands of live broadcast, and integrate camera tracking, lens data, and real-time rendering into a single pipeline.

Research and Implementation

Initial Research Phase

After evaluating numerous options within our budget constraints, I initially selected the floor-based version of Antilatency as our camera tracking system and the Unreal Engine-based version of Aximmetry as our virtual set software. These choices balanced cost-effectiveness against feature set. I mounted the Antilatency tracker on our Sony FX9 camera, laid out the tracking floor, calibrated the tracker and lens, and created lens profiles.

Picture of a green screen stage with an Antilatency tracking floor
Picture of the Antilatency tracking device on the front of the camera

Hardware Implementation

I researched specifications and purchased two high-performance rendering PCs to run Aximmetry on independent computers. This setup provided backup capabilities for live productions, ensuring we could quickly switch to a backup system in case of any hardware issues during broadcast.
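
The failover idea behind the two-machine setup can be sketched as a simple reachability probe: check the primary render PC first and fall back to the backup if it does not respond. This is an illustrative sketch, not the actual switching mechanism used on the stage; the hostnames and control port are placeholders.

```python
import socket

RENDER_PCS = ["render-a.local", "render-b.local"]  # placeholder hostnames
CONTROL_PORT = 5000                                # placeholder port

def is_alive(host: str, port: int = CONTROL_PORT, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to the render PC succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def select_active(hosts, probe=is_alive):
    """Pick the first reachable renderer; the probe is injectable for testing."""
    for host in hosts:
        if probe(host):
            return host
    raise RuntimeError("no render PC reachable")
```

In a live setting the probe would run continuously rather than once, but the priority-ordered fallback logic is the same.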

Unreal Engine Integration

I became deeply familiar with Unreal Engine, learning to build and optimize virtual environments specifically for broadcast use. This involved working with freelance technical artists to create and implement sets with proper scale and perspective, optimizing for real-time rendering, and ensuring the visual quality met broadcast standards. I also set up a Perforce server for remote collaboration and change tracking.

Challenges with Initial System

During testing, we encountered stability issues with the Antilatency tracking system. While it performed well in controlled tests, it proved less reliable during extended production scenarios. This led me to conduct additional research to find a more robust solution.

RETracker System Implementation

After thorough evaluation, I selected RETracker Bliss as our new camera tracking solution, paired with RETracker Fizz integrated with a Tilta Nucleus-M system for Focus, Iris, and Zoom encoding. This combination offered superior reliability and precision. Additionally, I mounted an ArUco tag to the ceiling to give the tracker a repeatable origin point.

Close-up picture of the RETracker Bliss camera tracker and Tilta Nucleus-M encoders
Picture of a large ArUco board that is zip-tied to the lighting grid above the stage
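
The value of a fixed ceiling tag is that the tracker's pose can be captured once when the tag is seen, and every later pose re-expressed relative to that captured origin, so the virtual set lines up the same way every session. The sketch below is a simplified illustration of that re-basing (translation plus yaw only; a full solution would use 4x4 transforms), not RETracker's actual implementation.

```python
import math

def rebase(pose, origin):
    """Express `pose` (x, y, yaw_deg) in the frame defined by `origin`.

    `origin` is the pose captured when the tracker sights the ArUco tag;
    after re-basing, a pose identical to the origin maps to (0, 0, 0).
    """
    ox, oy, oyaw = origin
    px, py, pyaw = pose
    dx, dy = px - ox, py - oy
    # Rotate the translation offset into the origin's frame.
    c, s = math.cos(math.radians(-oyaw)), math.sin(math.radians(-oyaw))
    return (dx * c - dy * s, dx * s + dy * c, (pyaw - oyaw) % 360.0)
```

For example, a pose equal to the captured origin re-bases to the zero pose, which is exactly the repeatability the ceiling tag provides across power cycles.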

Production Ready System

The upgraded system was implemented and thoroughly tested. RETracker proved to be rock-solid through multiple live events, maintaining accurate tracking and providing the stability needed for broadcast production.

Technical Details

Camera Tracking System

The RETracker Bliss system uses two fisheye cameras to calculate the position and rotation of the sensor using Visual Simultaneous Localization and Mapping (vSLAM). This data is exposed as FreeD data and streamed to the Aximmetry rendering computers over the network with minimal latency.

Picture of what the RETracker Bliss fisheye cameras are seeing and the resulting tracking points
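
To make the FreeD link concrete, here is a sketch of decoding a FreeD "Type D1" camera pose packet, the message type trackers like Bliss commonly emit. The field layout and scale factors follow the published FreeD convention (29-byte packet, signed 24-bit fixed-point fields, 0x40-based checksum); this is an assumption to verify against the tracker's documentation, not code from the actual stage pipeline.

```python
FREED_D1 = 0xD1
PACKET_LEN = 29

def s24(b: bytes) -> int:
    """Decode a signed 24-bit big-endian integer."""
    v = int.from_bytes(b, "big", signed=False)
    return v - 0x1000000 if v & 0x800000 else v

def parse_freed_d1(pkt: bytes) -> dict:
    if len(pkt) != PACKET_LEN or pkt[0] != FREED_D1:
        raise ValueError("not a FreeD D1 packet")
    # Checksum: 0x40 minus the sum of all preceding bytes, modulo 256.
    if (0x40 - sum(pkt[:28])) % 256 != pkt[28]:
        raise ValueError("checksum mismatch")
    return {
        "camera_id": pkt[1],
        # Rotations in degrees: 15 fractional bits (divide by 32768).
        "pan":  s24(pkt[2:5])  / 32768.0,
        "tilt": s24(pkt[5:8])  / 32768.0,
        "roll": s24(pkt[8:11]) / 32768.0,
        # Positions in millimetres: 6 fractional bits (divide by 64).
        "x": s24(pkt[11:14]) / 64.0,
        "y": s24(pkt[14:17]) / 64.0,
        "z": s24(pkt[17:20]) / 64.0,
        "zoom":  s24(pkt[20:23]),
        "focus": s24(pkt[23:26]),
    }
```

In practice these packets arrive over UDP many times per second, and the renderer applies each decoded pose to the virtual camera for that frame.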

Lens Data Integration

The RETracker Fizz system paired with Tilta Nucleus-M provides accurate Focus, Iris, and Zoom (FIZ) data. This integration allows for proper depth of field and focal length matching between the physical camera and the virtual environment.
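
Conceptually, the encoders report raw counts, and a calibrated lens profile maps those counts to physical lens values. A minimal sketch of that mapping as piecewise-linear interpolation is below; the focus table is illustrative sample data, not measurements from a real lens, and the real Fizz/Aximmetry pipeline handles this internally.

```python
from bisect import bisect_left

# (encoder count, focus distance in metres) pairs recorded during
# calibration — hypothetical sample values for illustration only.
FOCUS_PROFILE = [(0, 0.45), (1024, 0.8), (2048, 1.5), (3072, 4.0), (4095, 12.0)]

def encoder_to_focus(count: int, profile=FOCUS_PROFILE) -> float:
    """Piecewise-linear interpolation of a calibrated lens profile table."""
    counts = [c for c, _ in profile]
    i = bisect_left(counts, count)
    if i == 0:
        return profile[0][1]       # clamp below the first sample
    if i == len(profile):
        return profile[-1][1]      # clamp above the last sample
    (c0, f0), (c1, f1) = profile[i - 1], profile[i]
    t = (count - c0) / (c1 - c0)
    return f0 + t * (f1 - f0)
```

The same table-driven approach applies to iris and zoom, which is why per-lens calibration profiles matter: the count-to-value curve differs for every lens.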

Virtual Set Software

Aximmetry serves as the bridge between our tracking systems and Unreal Engine, managing the real-time compositing of talent with virtual environments. It outputs the final composite over SDI and into our Tricaster video switcher.

Enhanced Virtual Elements

As the system matured, I added more advanced elements to our virtual environments, including dynamic screens showing real-time data, interactive graphs, and other visual elements that enhanced the storytelling capabilities of our productions.

Results and Impact

The completed ICVFX stage has become a cornerstone of Steady Now's production capabilities, allowing the company to create sophisticated virtual environments for broadcasts that would have been prohibitively expensive with traditional methods.

The system has successfully supported multiple live events, including major productions for IBM, demonstrating exceptional reliability and visual quality. The ability to make real-time adjustments to virtual environments has given Steady Now's productions unprecedented flexibility and creative freedom.
