Dreamspace Downloads

The Dreamspace project has yielded many technical prototypes in the fields of Virtual Production and Immersive Spaces, some of which have been made available to the community for further development.

KATANA Benchmark scene

A benchmark virtual set has been prepared in Dreamspace using the ‘San Miguel’ scene, kindly provided to The Foundry by Guillermo Leal. This benchmark scene has sufficient complexity to represent film-quality assets in the virtual production pipeline. The scene has been assembled in The Foundry’s post-production look development and lighting tool KATANA, both for on-set visualisation in a real-time renderer and for final-quality rendering in PRMan, Arnold and V-Ray.

Please contact us for access to KATANA and the San Miguel benchmark scene. The asset is also available as part of the example scenes accompanying the book Physically Based Rendering.

KATANA Live-previz

The Dreamspace project developed a proof-of-concept virtual studio system called LiveView that integrates live camera tracking with real-time depth capture, high-performance global illumination rendering and flexible real-time compositing. It connects to collaborative toolsets for controlling the virtual environment, and uses light capture to harmonise real and virtual lighting.

Whilst LiveView is not available for distribution, KATANA licenses may be granted for research projects, and a KATANA listener prototype has been prepared as a live-previz system. The KATANA listener provides a scene server for a virtual set, with live communication to the Dreamspace renderer and the on-set collaborative tools. To support real-time rendering in live-previz, Gonzo, the Dreamspace real-time renderer developed by The Foundry, has also been made available.

Please contact us for access to the KATANA listener.

Virtual Production Editing Tools

In Dreamspace, Animationsinstitut of Filmakademie Baden-Württemberg has developed a tablet-based tool that gives non-technical users control over set layout, lighting and animation. This provides collaborative control of the highly complex film pipeline through a simple real-time interface. The virtual production editing tools (VPET) have been made open source and are available in three parts:

- The VPET scene-editing tool (Unity project)
- A KATANA plugin for distributing the scene over the network to VPET
- A synchronisation server enabling collaborative editing across multiple devices
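To illustrate the kind of rebroadcast logic such a synchronisation server performs, here is a minimal in-process Python sketch. The class and message format are hypothetical assumptions for illustration only, not the actual VPET protocol; the real server works over the network, whereas clients here are plain callbacks:

```python
import json

class SyncRelay:
    """Toy sketch of a collaborative-edit relay (not the VPET server).

    Each connected editor registers a callback; an edit published by one
    editor is serialised once and forwarded to all the others, so every
    device converges on the same scene state.
    """

    def __init__(self):
        self.clients = {}  # client id -> callback receiving raw JSON

    def connect(self, client_id, callback):
        self.clients[client_id] = callback

    def publish(self, sender_id, edit):
        # Forward to every client except the sender.
        payload = json.dumps(edit)
        for cid, callback in self.clients.items():
            if cid != sender_id:
                callback(payload)

# Hypothetical usage: tablet A moves an object, tablet B receives the edit.
relay = SyncRelay()
inbox_b = []
relay.connect("tablet-A", lambda msg: None)
relay.connect("tablet-B", inbox_b.append)
relay.publish("tablet-A", {"op": "translate", "object": "chair", "pos": [1, 2, 0]})
```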

Virtual Production Light Editing Tools

Harmonising, or matching, the set lighting between virtual and real scenes is a requirement for producing plausible visual effects (VFX) that integrate computer-generated imagery (CGI) into real camera images.

The Intel Visual Computing Institute at Saarland University developed tools to capture and edit light on-set. We make three tools available:

- dilsea: Discrete Light Source Estimation from light probe images (Python library, application and sample dataset)
- vpetdmxserver: DMX light server to control lights from VPET (Python library)
- LightCapture: An Android app to connect an external USB camera to a Project Tango-enabled tablet or phone for capturing light probe images

Please also visit the project page for more info, including technical papers!

High Performance Rendering

The Universität des Saarlandes has developed a scalable, high-performance global illumination renderer in Dreamspace that allows final-quality rendering to be visualised as part of a live session on-set. This removes the need to bake fixed lighting into a virtual scene, allowing live adjustment of virtual lights and creative lighting decisions at the time of shooting. The renderer has been made available for use with the KATANA live-previz and the VPET on-set tools.

- Render Node Linux Image
- Render Plugin for KATANA Integration
- Setup Documentation

Immersive Visualisation

Dreamspace has explored the creative impact of new types of media experiences built using virtual production technologies. iMinds has developed a prototype visualisation environment that allows filmed real-world imagery to be projected onto a physical space, creating an immersive view without the need for a head-mounted display. The toolchain, built on Unity, has been released as open source.