Can technologies like an ultra-low-latency network, virtual reality, motion capture, and dynamic data visualizations help artists collaborate on live performances over long distances?
Platform: Projection Mapping, Virtual Reality, Augmented Reality, Motion Capture
Methods: Quantitative Research
Project Type: Academic Research, Entertainment
Responsibilities: Research Design, Data Analysis & Visualization, Project Management, Reporting
Background: The Music and Audio Research Lab had a fascinating use case for NYU’s ultra-low-latency network and data exchange engine: helping musicians and dancers, who are exquisitely sensitive to the precise timing of sound delivery, collaborate when they are oceans apart. The Concert on the Holodeck series of performances was designed to experiment with different technologies and methods of collaboration between artists in New York, USA, and Trondheim, Norway.
Action: I developed a survey for audience members and performers, then collected and analyzed the resulting data to assess how technical choices affected audiences’ and performers’ perceptions. The survey items and analysis paid special attention to expert audience members’ sensitivity to specific audio features.
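For illustration, here is a minimal sketch of the kind of quantitative comparison this involved, contrasting expert and non-expert audience ratings of an audio feature. The column names (`is_expert`, `spatial_audio_rating`) and the sample values are hypothetical, not the actual survey instrument, which is described in the linked publication.

```python
# Hypothetical sketch: compare expert vs. non-expert audience ratings
# of a 1-5 Likert-scale survey item. Column names and data are
# illustrative only.
import pandas as pd
from scipy.stats import mannwhitneyu

responses = pd.DataFrame({
    "is_expert": [True, True, False, False, False, True, False, False],
    "spatial_audio_rating": [3, 2, 4, 5, 4, 3, 5, 4],  # 1-5 Likert scale
})

experts = responses.loc[responses["is_expert"], "spatial_audio_rating"]
general = responses.loc[~responses["is_expert"], "spatial_audio_rating"]

# Likert responses are ordinal, so a nonparametric test is a safer
# default than a t-test for comparing the two groups.
stat, p = mannwhitneyu(experts, general, alternative="two-sided")
print(f"Expert median: {experts.median()}, general median: {general.median()}")
print(f"Mann-Whitney U = {stat:.1f}, p = {p:.3f}")
```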
Result: The audience was generally satisfied with the cohesiveness of the performance; feedback centered on the visual elements and the spatial audio mix, particularly from experts in the audience. The remote performers reported that latency had little impact on their performance. This feedback informed stylistic refinements to the music and visual displays, as well as technical changes in later performances.
Links:
Holodeck: A Research Framework for Distributed Multimedia Concert Performances (Publication)