
Virtual Production Data Capture
Dreamspace addresses the data capture required to integrate real and virtual elements live on-set. Ncam is researching new techniques for camera tracking and depth capture that would allow live-action content to be mixed with a virtual environment in real time, without the need for an expensive dedicated studio. iMinds is researching new techniques for location capture using omni-directional video with depth, which would create photo-real virtual environments without the need to build expensive digital virtual sets.

Real-time Depth and Tracking
Live visualization on-set and final delivery
Ncam have developed a live depth capture system that estimates scene depth for a live-action plate alongside real-time camera tracking data, providing a live preview that integrates real and virtual elements in depth. Foreground actors are placed correctly within the virtual scene rather than simply overlaid on it, giving the crew a clearer understanding of shot composition and helping to deliver the correct shot more quickly and easily.
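As a rough illustration of this kind of depth-based compositing (a minimal sketch, not Ncam's actual pipeline), the snippet below performs a per-pixel depth test between a live-action plate and a rendered virtual element, keeping whichever is closer to the camera. It assumes both images are already aligned to the tracked camera view and that the two depth maps are in the same metric units; the array shapes and example values are purely illustrative.

```python
import numpy as np

def depth_composite(live_rgb, live_depth, cg_rgb, cg_depth):
    """Per-pixel depth test: keep whichever element is closer to the camera.

    live_rgb, cg_rgb     : (H, W, 3) colour images aligned to the tracked camera
    live_depth, cg_depth : (H, W) depth maps in the same metric units
    """
    live_in_front = live_depth < cg_depth            # True where the plate occludes CG
    return np.where(live_in_front[..., None], live_rgb, cg_rgb)

# Toy example: an actor ~2 m from camera occludes a virtual wall at ~5 m,
# while a virtual column at ~1 m occludes the actor.
H, W = 4, 4
live_rgb, live_depth = np.full((H, W, 3), 0.8), np.full((H, W), 2.0)
cg_rgb, cg_depth = np.full((H, W, 3), 0.2), np.full((H, W), 5.0)
cg_depth[:, :2] = 1.0                                # column in the left half
composite = depth_composite(live_rgb, live_depth, cg_rgb, cg_depth)
```

Unlike a simple overlay, the depth test resolves occlusion in both directions, which is what allows actors to move behind virtual set elements in the live preview.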
Ncam have also developed a prototype offline tracking system, called Final Track, that automates the delivery of final-quality tracks from the live camera tracking data and the principal footage recorded on-set. This reduces the need for manual match-moving in post-production, cutting the time and cost of delivering shots.
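The step from a live track to a final-quality track is typically a refinement problem. The sketch below shows one generic way to pose it (not necessarily how Final Track works): starting from the on-set pose estimate, minimise the reprojection error of tracked scene points against their observed image positions. The pinhole model, point data, and noise levels are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(points_3d, rvec, tvec, focal):
    """Pinhole projection of world points into a camera with pose (rvec, tvec)."""
    R = Rotation.from_rotvec(rvec).as_matrix()
    cam = points_3d @ R.T + tvec              # world -> camera coordinates
    return focal * cam[:, :2] / cam[:, 2:3]   # perspective divide

def refine_pose(points_3d, observed_2d, rvec0, tvec0, focal=1000.0):
    """Refine an initial (live) camera pose by minimising reprojection error."""
    def residuals(params):
        rvec, tvec = params[:3], params[3:]
        return (project(points_3d, rvec, tvec, focal) - observed_2d).ravel()

    result = least_squares(residuals, np.concatenate([rvec0, tvec0]))
    return result.x[:3], result.x[3:]

# Synthetic example: perturb a ground-truth pose to mimic a drifted live track,
# then recover it from tracked feature observations.
rng = np.random.default_rng(0)
points = rng.uniform(-1, 1, (50, 3)) + [0, 0, 5]      # scene points in front of camera
true_rvec, true_tvec = np.array([0.05, -0.02, 0.01]), np.array([0.1, 0.0, 0.0])
obs = project(points, true_rvec, true_tvec, 1000.0)
rvec, tvec = refine_pose(points, obs, true_rvec + 0.02, true_tvec + 0.05)
```

In practice the offline pass can also re-detect features in the full-resolution principal footage and solve over whole shots rather than single frames, which is what lifts a live preview track to final quality.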

Photo-real Virtual Sets
Pre-production
iMinds have developed a modular camera system to capture physical environments using different camera configurations. Depth-aware image stitching has been introduced to generate seamless omni-directional video, and new techniques for view synthesis in sparse lightfield camera arrays have been developed to generate novel views and support parallax within the captured video. Together these make it possible to capture a physical environment as a virtual set, reducing the need for the expensive and time-consuming process of building photo-real digital sets for use in production.
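To make the parallax point concrete, the sketch below forward-warps a single RGB-D view into a nearby virtual camera, which is the basic operation behind depth-aware stitching and depth-based view synthesis (a generic illustration, not iMinds' method). It assumes one source image with a dense depth map, known intrinsics K, and a known relative pose (R, t); holes left by disocclusions would still need to be filled from other cameras in the array.

```python
import numpy as np

def warp_to_novel_view(rgb, depth, K, R, t):
    """Forward-warp an RGB-D view into a nearby novel viewpoint.

    rgb   : (H, W, 3) source image
    depth : (H, W) per-pixel depth in the source camera (metres)
    K     : (3, 3) shared camera intrinsics
    R, t  : rotation (3, 3) and translation (3,) from source to novel camera
    """
    H, W = depth.shape
    v, u = np.mgrid[0:H, 0:W]
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).astype(float)

    # Unproject source pixels to 3D using the depth map.
    rays = pix @ np.linalg.inv(K).T
    points = rays * depth.reshape(-1, 1)

    # Transform into the novel camera and project back to pixel coordinates.
    points_new = points @ R.T + t
    proj = points_new @ K.T
    uv = proj[:, :2] / proj[:, 2:3]

    # Splat colours into the novel view (nearest pixel, z-buffered).
    out = np.zeros_like(rgb)
    zbuf = np.full((H, W), np.inf)
    ui, vi = np.round(uv[:, 0]).astype(int), np.round(uv[:, 1]).astype(int)
    valid = (ui >= 0) & (ui < W) & (vi >= 0) & (vi < H) & (points_new[:, 2] > 0)
    src = rgb.reshape(-1, 3)
    for i in np.flatnonzero(valid):
        if points_new[i, 2] < zbuf[vi[i], ui[i]]:
            zbuf[vi[i], ui[i]] = points_new[i, 2]
            out[vi[i], ui[i]] = src[i]
    return out
```

With several captured views warped into the same target camera, the per-pixel z-buffer also gives a natural way to blend overlapping cameras, which is essentially what a depth-aware stitch does.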