IBC paper and blog post from BBC present results from MAX-R
The BBC team presented a paper at IBC 2024 on Live Music in Virtual Immersive Spaces, exploring methods for representing live music performances in virtual environments. The paper reports results from a user study of "2.5D" techniques, in which a performance is captured as multiple video streams from different viewpoints, allowing users to watch it from a range of perspectives. The study examines how varying the number of video streams affects the experience, and compares approaches for switching between streams as the user's avatar moves through the virtual space.
Alongside the paper, the team published a blog post on the BBC R&D website offering further insight into how the music performances used in the study were captured. The post takes an in-depth look at the technical aspects and challenges of producing the video and audio for this research.
Additionally, some of the audio and video content captured during the project has been made publicly available. More details can be found in the Data section of the MAX-R website and in the recently published Deliverable D5.3. These resources provide valuable material for anyone interested in the technical details of the study or in exploring the captured content further.
Recording a live music performance using several cameras around a stage, incorporating stage lighting and an LED back wall.
A music performance in a virtual nightclub, with the stage represented as a video stream from a camera roughly matching the user's viewpoint.