Virtual Production Data Processing

Dreamspace addresses the data processing required to produce high-quality visualisation on set, in a production or performance space, and in post-production. UdS is researching new techniques to accelerate rendering towards a high-performance global illumination rendering system. The Foundry is researching new methods to accelerate compositing to real-time performance and to deliver final-quality composites directly from set. iMinds is researching new techniques to fuse omni-directional video with depth to support free-viewpoint visualisation of a photo-real virtual set.

High-Performance Rendering
Live visualisation on set
The University of Saarland have developed a high-performance global illumination renderer that allows final-quality rendering to be visualised on set. This removes the need to bake in fixed lighting with offline simulation, so virtual lights can be adjusted and creative lighting decisions made at the time of shooting. A new compiler framework with a domain-specific library for ray tracing improves rendering performance, with the potential to reduce offline render times, which can run to many hours per frame with production renderers.

Live and Final Compositing
Live visualization and final delivery
The Foundry have developed a prototype live compositor that allows a composite to be authored in NUKE and exported to run standalone for live visualisation on set, with the same results available through NUKE in post-production. This provides a configurable compositing system for live visualisation on set and connects the on-set system to post-production.
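The idea of authoring a composite once and evaluating it in two contexts can be sketched as a node graph held in plain data, which a lightweight standalone interpreter replays on set while the same description remains openable in post. The graph layout, node names, and `over` operator below are illustrative assumptions, not The Foundry's actual export format.

```python
def over(fg, bg):
    """Premultiplied 'over': composite foreground RGBA onto background RGBA."""
    fr, fgr, fb, fa = fg
    br, bgr, bb, ba = bg
    inv = 1.0 - fa  # how much of the background shows through
    return (fr + br * inv, fgr + bgr * inv, fb + bb * inv, fa + ba * inv)

# Hypothetical exported composite: the node graph is plain data, so the
# same description can drive a live on-set viewer or a post-production tool.
GRAPH = {
    "nodes": {
        "cg":    {"op": "input", "stream": "render"},
        "plate": {"op": "input", "stream": "camera"},
        "comp":  {"op": "over", "fg": "cg", "bg": "plate"},
    },
    "output": "comp",
}

def evaluate(graph, streams, name=None):
    """Pull-based evaluation of one pixel through the node graph."""
    node = graph["nodes"][name or graph["output"]]
    if node["op"] == "input":
        return streams[node["stream"]]
    if node["op"] == "over":
        return over(evaluate(graph, streams, node["fg"]),
                    evaluate(graph, streams, node["bg"]))
    raise ValueError("unknown op: " + node["op"])
```

Because the evaluator only reads the graph description, the live system and the post-production system can share one source of truth for the look of the composite.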
The Foundry have also developed a prototype offline image-processing pipeline in NUKE that combines live-action footage and rendered virtual elements using depth captured on set. Working from the delivered RGBZ data reduces the need for manual rotoscoping and so accelerates the delivery of shots in post-production.
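The core of depth-based compositing can be sketched as a per-pixel Z comparison: with a depth value delivered alongside each pixel, occlusion between live action and CG falls out of the comparison instead of hand-drawn rotoscope mattes. This is a simplified illustration under assumed conventions (smaller Z means closer to camera); a production pipeline must also handle soft edges, depth noise, and holes in the captured data.

```python
def depth_merge(plate_rgbz, cg_rgbz):
    """Composite two RGBZ pixel lists by keeping the nearer sample per pixel.

    Simplified sketch of depth-based compositing from delivered RGBZ data;
    real pipelines also filter depth noise and treat anti-aliased edges.
    """
    out = []
    for (pr, pg, pb, pz), (cr, cg, cb, cz) in zip(plate_rgbz, cg_rgbz):
        # Smaller Z = closer to camera; the nearer sample wins the pixel.
        out.append((pr, pg, pb, pz) if pz <= cz else (cr, cg, cb, cz))
    return out
```

For example, a plate pixel at depth 2.0 occludes a CG pixel at depth 3.0, while a CG element at depth 4.0 appears in front of a plate pixel at depth 5.0, all without any manually drawn matte.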