The 68th Frankfurter Buchmesse is the world's most renowned book fair and takes place 19th–23rd October 2016 in Frankfurt.
Collateral Rooms is an interactive live-art installation, a virtual pavilion for the main exhibition hall. Inspired by the floor plan of Mies van der Rohe's 1929 Barcelona Pavilion, the futuristic set-up consists of simple walls channelling a real-time 3D architectural space that plays with virtual transparency, mirrors, axes, sight lines and real-time composited material textures. The illusory indoor and outdoor views contain sea vistas, interior columns, and materials of glass, steel and extravagant marble, all adapted to the perspective of the viewer.
C.a.p.e. Drop_Dog is a virtuality performance inspired by two texts by the Dutch writer Tonnus Oosterhoff.
Using state-of-the-art technology, C.a.p.e. Drop_Dog teleports the visitor to the stage and into the core of two short stories. No longer sitting and reading in a comfy seat, the visitor is 'immersed' in a 360° story that develops whilst moving. This continuous process of looking, reading, listening and moving produces a complex reading experience and a pleasantly uncomfortable feeling of being in two different, but parallel, worlds at once.
Collateral Rooms and C.a.p.e. Drop_Dog are at Frankfurter Buchmesse 2016, 19th–23rd October 2016, Frankfurt.
NEWS UPDATE: The Dreamspace IBC paper has been chosen as one of IBC 2016's top eight papers. Congratulations! This will be announced at a drinks reception on Friday in the Future Zone!
The Dreamspace booth #8.F05 in the IBC Future Zone (Hall 8) showcased key technology components from the project, as well as results from the experimental productions and performances conducted during the project.
The Dreamspace project was also presented by Oliver Grau (Intel Germany, IVCI director of operations) in the paper session 'Novel Ideas and Cutting Edge Technologies'.
Project Dreamspace worked on its final experimental production from 1st to 5th August 2016 at Filmakademie Studio 1 in Ludwigsburg. A wide range of creative professionals worked with the integrated system and technical prototypes developed during the three years of EU-funded research, with the support of all Dreamspace partners: Stargate Studios, Filmakademie, CREW, The Foundry, iMinds, University of Saarland/IVCI and ncam.
The technology tested was:
• Live depth capture and final track; live tracking and final track
• 360 capture for low cost virtual sets
• Light capture and light control
• Set, animation and light control
• Live compositing pipeline
• Offline depth compositing pipeline
• Final quality rendering and light control
• Continuity in data to post-production
• Immersive visualisation
Longing for Wilderness is a 360° VR experience created by Filmakademie in close collaboration with former student Marc Zimmermann within the scope of the VR-Now initiative. The version shown at SIGGRAPH makes use of binaural sound, a Subpac, a mini wind machine and the Dreamspace VR player created by Vincent Jacobs in WP5. The experience takes the user from the noisy city through a slowly transforming forest towards a calm and airy landscape. It seeks to express our innate longing to experience nature in its rawest forms. To achieve this, the artists have made use of the latest technology: phones/tablets, virtual reality head-mounted displays (HMDs), 360° imagery, interactive binaural sound, and a seatback tactile bass system that transmits low frequencies to the user's body, creating a truly immersive experience that addresses all senses. Longing for Wilderness constitutes one of the first use cases of a dedicated 360° player capable of handling high frame rates (up to 50 fps) and resolutions (up to 4K), developed within the EU-funded project Dreamspace.
Watch Longing for Wilderness
For more info and Publications
LfW_VRVillage_0189.pdf (Adobe PDF - 2.87Mb)
Days one and two of the workshop were an amazing journey through different stories about Virtual Reality. Eric Joris (CREW), at the avant-garde of immersive content creation for over 15 years, gave active guidance on how to create immersive content; Vincent Jacobs (iMinds) showed how to work with custom 360° camera rigs, stitching software and tools for immersive content creation; Oliver Grau, leading Intel's effort in the Intel Visual Computing Institute, gave insights into Intel's RealSense technology and reported on on-set tools developed as part of the Dreamspace project and how they can be used for immersive projects; Volker Helzle and Simon Spielmann (Filmakademie R&D) introduced VPET, a tool for virtual (reality) production, including a short history of early approaches using VR & AR HMDs and future endeavours. Further contributions came from Andrew Daffy (Daffy), Marc Zimmermann (Epicscapes) and Benjamin Rudolf (Nauhau).
The following 24 hours were a VR hackathon, in which the student teams realised their Virtual Reality projects supported by the lecturers.
Testing the Coaltrack is an interactive installation in which one user enters an immersive dome. Presented by Vincent Jacobs and Steven Maesen at the International Biënnale for Architecture in Rotterdam, it consists of three video projectors beaming onto a U-shaped space made of wood and painted white.
A 360° video, forming the base of the installation, was captured with the latest iMinds rig in an area of Limburg (a Belgian province) called the 'Coaltrack'. 3D content is composited onto the video in real time, independently of it.
The old rail track connected different coal mines and provided a means of transporting goods and people. Since the closure of the mines in the late 1980s, the tracks have fallen out of use and are now overgrown with flora. Some parts have been transformed into cycling tracks through the region. The installation aims to enable architects and project developers to see and feel the potential of the region by providing a 360° immersive experience with overlaid 3D content.
When entering the space, the user finds a stand with a joystick attached to it. Moving the joystick accelerates or decelerates the playback speed of the 4K, 25 fps 360° video file that textures the whole projection environment. In total around 15 km of track was filmed, so it makes sense to give the user a means to fast-forward or rewind. At predefined intervals, 3D content reveals itself, providing context about what can be seen in the video at that moment; this can be video or image content. The user can click the joystick to open a pop-up with more info, or let the info pass.
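The joystick-to-playback mapping described above can be sketched roughly as follows. This is a minimal illustration only: the actual installation runs in Unity via the WP5T2 plugins, and all names and numbers below (dead zone, maximum speed factor) are assumptions.

```python
# Hypothetical sketch of the joystick-driven playback control described above.
# Pushing the stick forward fast-forwards, pulling it back rewinds; the
# centred position holds the current frame. Frame indices wrap at clip ends.

CLIP_FPS = 25           # the installation's 360° video runs at 25 fps
MAX_SPEED = 8.0         # maximum fast-forward/rewind factor (assumption)

def playback_speed(stick_y: float) -> float:
    """Map a joystick axis value in [-1, 1] to a signed speed factor."""
    dead_zone = 0.1
    if abs(stick_y) < dead_zone:
        return 0.0                      # stick centred: pause playback
    return stick_y * MAX_SPEED          # forward = FFWD, backward = rewind

def advance_frame(frame: float, stick_y: float, dt: float,
                  total_frames: int) -> float:
    """Advance the (fractional) frame position by dt seconds of wall time."""
    frame += playback_speed(stick_y) * CLIP_FPS * dt
    return frame % total_frames         # wrap around at either end

# One second of full forward at 8x traverses 8 * 25 = 200 frames:
frame = advance_frame(0.0, 1.0, 1.0, 15000)   # -> 200.0
```

Wrapping with the modulo keeps the 15 km loop continuous in both directions, which matches the "let the user fast-forward or rewind freely" interaction above.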
Video playback and the joystick are connected through WP5T2 plugins. All content is captured into one 8K equirectangular render texture on the fly. Warping and blending (forming one seamless image) are done entirely with the WP5T2 tools, using a Ricoh Theta S for calibration (a 30–45 minute procedure). The calibration information is fed into Unity, which looks up the correct pixel in the 8K render texture and applies blending where necessary. Only one 'sweet spot' is used: the viewer has the correct perspective from a single point. From that point all content appears rectified; in other words, the curvature of the physical U-form is completely compensated for by the warping distortion.
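The per-pixel warp and blend described above can be sketched as follows, under the assumption that calibration yields, for each projector pixel, a texture coordinate into the 8K equirectangular render texture plus a blend weight that feathers projector overlaps. Every function and constant here is illustrative, not the WP5T2 implementation.

```python
# Simplified sketch of the warp/blend lookup described above: each projector
# pixel samples the calibrated texel of the shared equirectangular render
# texture and is attenuated by a blend weight so overlapping projector
# regions sum to a seamless image.

TEX_W, TEX_H = 7680, 3840   # "8K" equirectangular render texture (assumption)

def warp_pixel(calib_uv, texture, blend_weight):
    """Resolve one projector pixel: sample the render texture at the
    calibrated (u, v) coordinate and scale by the blend weight."""
    u, v = calib_uv                      # normalised coords in [0, 1)
    x = int(u * TEX_W) % TEX_W
    y = int(v * TEX_H) % TEX_H
    r, g, b = texture(x, y)              # texture() stands in for a texel fetch
    return (r * blend_weight, g * blend_weight, b * blend_weight)

# A flat grey "texture" makes the blend attenuation easy to see:
flat = lambda x, y: (0.5, 0.5, 0.5)
print(warp_pixel((0.25, 0.5), flat, 0.8))   # -> (0.4, 0.4, 0.4)
```

In the real installation this lookup happens on the GPU for every projector pixel each frame, which is why the calibration map only needs to be computed once per physical setup.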
Project 'Skywriters' is a Filmakademie documentary film about a family business for sky advertisement. In addition to the real filmed footage, the team approached the Institute of Animation at Filmakademie for additional computer-generated shots. This collaboration allowed evaluation of the current status of the Dreamspace Live View system integration in combination with ncam real-time camera tracking and the VPET tools. The ncam system was mounted on a virtual camera, displaying its views on a 9'' screen and simultaneously on a large projector. The director had no previous experience with virtual production technology; after a short introduction, he was able to direct digital assets and animation using a VPET tablet, designing the shots together with the director of photography. Time-consuming scene preparation and manual alignment of animation data were still limitations at this point of system integration in December 2015. The setup was evaluated as very intuitive, with high potential to increase creativity.
Dreamspace showcased the latest integrated prototypes at FMX2016 in Stuttgart. The team presented a live demo on the Crytek Virtual Production Stage located at Steinbeis-Saal on 26th April 1:30-5:00pm and on 29th April 2:30-4:00pm.
In addition, the Filmakademie project 'Longing for Wilderness' at FMX 2016 Marketplace booth 1.1 featured Dreamspace immersive player technology in a unique Virtual Reality experience.
VPET (Virtual Production Editing Tool) is a tablet-based on-set editing application for working within a virtual production environment. The development is open source and pays particular attention to established (offline) film pipelines. It is designed to run on mobile and head-mounted devices, giving operators easy access without dedicated training. It provides functionality to edit assets during a virtual shoot and to synchronise changes with the film pipeline. VPET communicates through a network interface with main applications and other clients, allowing live editing of object, light and animation parameters.
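A parameter-synchronisation exchange of the kind described might look like the following. This is a purely illustrative sketch: the field names, JSON encoding and helper functions are assumptions for illustration, not VPET's actual wire protocol.

```python
import json

# Illustrative sketch of a live parameter-update message of the kind VPET
# exchanges between clients and the main application. All names here are
# hypothetical; VPET's real protocol is not reproduced.

def make_update(object_id: str, parameter: str, value) -> bytes:
    """Serialise one scene-object edit for broadcast to other clients."""
    return json.dumps({
        "object": object_id,       # which scene object was edited
        "param": parameter,        # e.g. "translation", "lightIntensity"
        "value": value,
    }).encode("utf-8")

def apply_update(scene: dict, message: bytes) -> None:
    """Apply a received edit to the local copy of the scene."""
    msg = json.loads(message)
    scene.setdefault(msg["object"], {})[msg["param"]] = msg["value"]

scene = {}
apply_update(scene, make_update("lamp01", "lightIntensity", 2.5))
print(scene)   # -> {'lamp01': {'lightIntensity': 2.5}}
```

The key design point the passage describes is that every client keeps its own scene copy and exchanges only small parameter deltas, which is what makes live editing from a tablet practical over a set network.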
Alan Purvis spoke on Heterogeneous Compute for Real-Time Image Processing Applications at GTC on 5th April 2016. His work on scheduling and real-time compositing pipelines is an invaluable contribution to the Dreamspace project.
He discussed work carried out at The Foundry on a heterogeneous image-processing framework that utilises all available CPU and GPU compute devices within a system. Complex graphs of processing effects can be authored in BLINK, a domain-specific language created in-house. By harnessing data parallelism, knowledge of transfer speeds, and device compute capabilities, the team has developed a scheduling system for efficiently deploying workloads across all devices. The talk gave a brief overview of BLINK, how graphs of effects are authored, and the innovative use of the scheduling framework within a hybrid 3D rendering system for virtual production.
The consortium showcased the results of its first year of work, which focused on the technical components necessary to enable virtual production. The group presented throughout FMX 2015.
See what fxguide had to say.
In a tech preview on The Foundry’s booth, CTO Jon Wadelton unveiled some of The Foundry’s research in the fields of virtual and augmented reality. As well as showing off a new raytracing renderer, the session showcased some cutting-edge work on what The Foundry describes as the five key “pain points” when creating content for virtual reality applications.
Filmakademie had a short paper accepted for presentation at CVMP2014.