Bringing Down the Fifth Wall: A System for Delivering VR Performances to Large Audiences
Performing Arts Technology
Virtual events provided a vital connective thread through the pandemic, and "Bringing Down the Fifth Wall" will develop methodologies to address the challenges of delivering virtual reality performances to large audiences.
In this project, Professor Anıl Çamcı (SMTD, Department of Performing Arts Technology) and an interdisciplinary team of student researchers have been developing a virtual cinematography system driven by real-time audio analysis and user interactions in VR.
The team has now implemented most of the cinematography system in the Unity game engine. The engine controls a range of virtual cameras, including zoom, dolly, follow, trucking, and panning cameras, using real-time audio analysis to showcase a VR performance procedurally without requiring extensive curation or user input.
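The article does not describe the camera-selection logic itself. As an illustration only, the sketch below shows one way real-time audio analysis could drive a procedural camera choice: buffers of audio samples are reduced to an RMS level, and simple rules map level ranges to camera types. All names, rules, and thresholds here are hypothetical, not taken from the project.

```python
from dataclasses import dataclass

@dataclass
class ShotRule:
    """Maps a range of audio levels to a virtual camera type (hypothetical)."""
    camera: str        # e.g. "dolly", "zoom", "pan"
    min_level: float   # inclusive lower bound on RMS amplitude
    max_level: float   # exclusive upper bound

def rms(samples):
    """Root-mean-square amplitude of an audio buffer (values in [-1, 1])."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

def select_camera(samples, rules, fallback="follow"):
    """Return the first camera whose rule matches the buffer's RMS level."""
    level = rms(samples)
    for rule in rules:
        if rule.min_level <= level < rule.max_level:
            return rule.camera
    return fallback

# Invented mapping: quiet passages get a slow dolly, loud passages a pan.
RULES = [
    ShotRule("dolly", 0.0, 0.2),
    ShotRule("zoom", 0.2, 0.6),
    ShotRule("pan", 0.6, 1.1),
]

quiet = [0.05, -0.05, 0.05, -0.05]
loud = [0.9, -0.9, 0.9, -0.9]
print(select_camera(quiet, RULES))  # -> dolly
print(select_camera(loud, RULES))   # -> pan
```

A real Unity implementation would instead run per-frame in C#, reading audio buffers from the engine and blending between virtual cameras rather than switching them discretely, but the rule-based mapping idea is the same.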
This system will facilitate the delivery of VR music performances in traditional concert spaces as well as online settings. The team is currently working on the development of four VR performance pieces to evaluate and showcase the system:
- Professor Çamcı will perform with a hardware modular synthesizer in VR.
- Çamcı will perform a networked VR piece with his former student Matias Vilaplana, who will join the performance remotely from the University of Virginia in Charlottesville while Çamcı is in Ann Arbor.
- Professor Erik Santos (SMTD, Chair of the Composition Department) will perform one of his pieces for vocals, guitar, and harmonica in VR.
- Professor Amy Porter (SMTD, Winds and Percussion Department) will perform a piece for flute and electronics composed by Professor Santos. The virtual environment for this piece is being modeled after Whitefish Point in Paradise, Michigan.
A public showcase of the system is being planned, for which Professor Çamcı will curate student works from his PAT 443/543 Immersive Media class offered in the Fall.
The work-in-progress system has also been used in a performance project that will be presented at the New Interfaces for Musical Expression (NIME) conference later this year. A preliminary showcase of the networked VR performance will take place on May 2nd at the University of Virginia. The project has so far brought together students and faculty from the School of Music, Theatre & Dance, the College of Engineering, and the School of Information.