Platform Integration and Final Evaluation

The prototypes and toolsets in Dreamspace are integrated into a connected system and evaluated in creative productions to assess their impact on collaboration, efficiency, and the immersive experience. The Foundry is developing a LiveView demonstrator that combines data capture and processing, controlled by the on-set tools, with real-time rendering that merges live-action content with a virtual set. Stargate defines the user requirements and evaluates performance in a series of field trials.

The LiveView System incorporates the technical innovations in the project into a complete on-set system that connects to a conventional film pipeline.
LiveView (The Foundry) – performs live compositing, merging the render of the virtual scene with the principal camera footage.
Camera Server (ncam) – calculates the live camera track and depth data for the principal camera. An 8 Gb/s fiber network connection to the LiveView system enables the streaming of the Depth Map, Lens Distortion Map, Primary Camera Image, Camera Tracking and Optical Parameters.
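Streaming per-frame data of this kind typically prefixes each payload with a small fixed-size header so the receiver can associate depth, lens, and tracking data with the right frame. The sketch below shows one way such a header could be packed; the field names and layout are illustrative assumptions, not the actual ncam wire protocol.

```python
import struct

# Hypothetical per-frame header preceding each streamed payload.
# Layout (little-endian, no padding): frame id, timestamp, width, height.
FRAME_HEADER = struct.Struct("<IdHH")

def pack_frame_header(frame_id: int, timestamp: float,
                      width: int, height: int) -> bytes:
    """Serialize the header that would precede a depth-map or image payload."""
    return FRAME_HEADER.pack(frame_id, timestamp, width, height)

def unpack_frame_header(data: bytes) -> tuple:
    """Recover (frame_id, timestamp, width, height) from a received buffer."""
    return FRAME_HEADER.unpack(data[:FRAME_HEADER.size])
```

A fixed binary header keeps the per-frame overhead negligible relative to the image and depth payloads carried over the fiber link.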
Light Processor (UdS) – calculates the physical light parameters captured by a light probe and provides control over the physical lights on-set using a DMX controller. Light Attribute changes are sent to the LiveView system as XML data over a standard network connection.
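A light-attribute change sent as XML over a standard network connection might look like the following sketch. The element and attribute names (`LightUpdate`, `Intensity`, `Color`) are assumptions for illustration, not the actual Dreamspace schema.

```python
import xml.etree.ElementTree as ET

def light_update_xml(light_id: str, intensity: float,
                     color: tuple) -> bytes:
    """Build a hypothetical light-attribute update message as XML bytes.

    The message would be sent from the Light Processor to the LiveView
    system whenever a physical light parameter changes on set.
    """
    root = ET.Element("LightUpdate", id=light_id)
    ET.SubElement(root, "Intensity").text = str(intensity)
    ET.SubElement(root, "Color").text = " ".join(str(c) for c in color)
    return ET.tostring(root)
```

Self-describing XML keeps the channel easy to extend with further attributes (e.g. direction or cone angle) without breaking existing receivers.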
VPET (Filmakademie) – provides a collaborative on-set interface to the LiveView system with synchronization of scene updates for set, light and animation parameters. The scene is distributed from LiveView in a binary format over Wi-Fi as devices connect. Scene updates are sent to a synchronization server over Wi-Fi as XML data, and updates are broadcast to all connected devices.
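The synchronization server follows a simple broadcast pattern: every update received from one device is forwarded to all connected devices, so each tablet converges on the same scene state. A minimal in-process sketch of that pattern, with illustrative names rather than the actual VPET API:

```python
class SyncServer:
    """Toy model of a broadcast synchronization hub."""

    def __init__(self):
        # Callables standing in for network connections to on-set devices.
        self.clients = []

    def connect(self, client):
        """Register a device; in the real system this follows scene download."""
        self.clients.append(client)

    def submit(self, update: str):
        """Broadcast an XML scene update to every connected device.

        The sender receives the update too, so all devices apply
        changes in the same order.
        """
        for client in self.clients:
            client(update)
```

Broadcasting to the sender as well avoids divergence when two devices edit the same parameter concurrently: everyone applies the server's ordering.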
High Performance Renderer (UdS) – performs scalable global illumination rendering, connected to LiveView to provide final-quality rendering as part of the live composite. The scene is serialized and sent over a network connection to a master node, which broadcasts the scene data to the render slaves, then collects the results, combines them, and sends the composite frame back to the LiveView system for compositing.
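The master node's scatter/gather loop can be sketched as follows: the scene goes to every render slave, each slave renders a horizontal tile, and the master stitches the tiles into the frame returned to LiveView. Function and parameter names here are illustrative assumptions, not the renderer's actual interface.

```python
def render_distributed(scene, slaves, height):
    """Scatter a scene to render slaves by row range and gather the tiles.

    Each element of `slaves` stands in for a remote render node and is
    called as slave(scene, y_start, y_end), returning a list of rows.
    """
    rows_per_slave = height // len(slaves)
    tiles = []
    for i, slave in enumerate(slaves):
        y0 = i * rows_per_slave
        # The last slave absorbs any remainder rows.
        y1 = height if i == len(slaves) - 1 else y0 + rows_per_slave
        tiles.append(slave(scene, y0, y1))
    # Combine the tiles, in order, into the final frame.
    return [row for tile in tiles for row in tile]
```

Row-range partitioning is one simple load-splitting scheme; a production renderer would also balance tiles by estimated cost and overlap transfer with rendering.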

Joint Productions: Battleground
Battleground was a joint production between the user partners, combining their efforts and methodologies in an integrated evaluation process. It was designed to fully test the final Dreamspace Demonstrator in both Virtual Production and Immersive Experiences.
BATTLEGROUND is the ultimate TV game show competition and audience experience. In each one-hour episode, contestants compete to see which team has the best robot, agility, weapons, defences and team strategy to defeat the other competitors.
This production created a first demo of the show, involving the most recent technology and gameplay for the players and a unique experience for the audience, who could witness it not only on the big screen but also in VR. The production used not only the recent Dreamspace tool set for Virtual Production but also the tools for holographic performance. It offered a compelling outlook on the future of television entertainment, involving teams of contestants, the live audience on stage, and audiences connected online and via TV at home.