MAX-R: Mixed Augmented and Extended Reality Media Pipeline

Extended reality, or XR, has become an integral part of almost all forms of entertainment, and the MAX-R international collaboration is at the forefront. MAX-R has developed pioneering tools for making, processing and delivering maximum-quality XR content, impacting media pipelines across the board.

The MAX-R 30-month Innovation Action, co-funded by the European Union and Innovate UK, united 11 of Europe’s leading creative and immersive media organisations to deliver innovations and new efficiencies for Mixed, Augmented and Extended Reality Media pipelines. 

The practical applicability and transformative potential of the MAX-R innovations in real-world scenarios, shown here, centre on High-end Virtual Production, Integrated Remote Virtual Production, Open-Source Tools, Open Access to Data, Interoperability of Metadata, Site-specific XR Experiences and Massively Interactive Live Events.

Universitat Pompeu Fabra - Interactive Technologies group

UPF-GTI

Open-Source XR Creation Tools

Developed "Rooms," an open-source platform for creating 3D content on XR hardware, aiming to democratize 3D modeling and animation. They also developed wgpuEngine, an open-source game engine for desktop, XR, and 3D web applications, supporting modern graphics techniques. This engine was integrated with other MAX-R partners' projects.
 

Universitat Pompeu Fabra - Wireless Networking group

UPF-WN

Interactive VR Wi-Fi Streaming

Focused on advancing interactive VR streaming over Wi-Fi. They optimized VR streaming parameters and Wi-Fi settings for large-scale, multi-user XR experiences using Wi-Fi 6. They also developed an Adaptive Video Bitrate algorithm to handle Wi-Fi fluctuations and improve the VR streaming experience.
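The public summary does not detail the algorithm itself, but a minimal sketch of such an adaptive-bitrate loop could look like the following; the bitrate ladder, thresholds and function names are illustrative assumptions, not the UPF-WN implementation.

```python
# Hypothetical adaptive-bitrate controller for VR streaming over Wi-Fi.
# Ladder, thresholds and step sizes are illustrative assumptions only.

BITRATE_LADDER_MBPS = [20, 35, 50, 70, 90]   # assumed encoder presets

def next_bitrate_index(current_index: int,
                       throughput_mbps: float,
                       loss_ratio: float) -> int:
    """Pick the next ladder entry from recent Wi-Fi measurements."""
    current = BITRATE_LADDER_MBPS[current_index]
    # Step down quickly when the link degrades (interference, contention).
    if loss_ratio > 0.02 or throughput_mbps < current * 1.2:
        return max(current_index - 1, 0)
    # Step up cautiously only when there is clear headroom.
    if throughput_mbps > current * 1.8 and loss_ratio < 0.005:
        return min(current_index + 1, len(BITRATE_LADDER_MBPS) - 1)
    return current_index

# Example: streaming at 50 Mb/s, measuring 60 Mb/s throughput and 3% loss,
# the controller steps down to 35 Mb/s.
print(BITRATE_LADDER_MBPS[next_bitrate_index(2, 60.0, 0.03)])  # -> 35
```

Stepping down aggressively and up conservatively is a common way to keep latency-sensitive VR streams stable when the wireless link fluctuates.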
 

Disguise

Streamlined XR Production Tools

Developed tools to streamline XR production, including RSConnect for data/metadata transmission, OS Manager for updating rendering servers, and Depth Reprojection for enhanced camera use in hybrid productions. They also implemented OCIO algorithms for color control and integrated an HTTP SockPuppet API with their Porta tool for better control in broadcast XR. 
 

FOUNDRY Visionmongers

Foundry

Real-Time VFX & Metadata

Focused on virtual production, developing a real-time compositing engine (Fission) for LED volumes. They augmented this with machine learning models and "The Vault" metadata capture tool to streamline post-production. Foundry also developed OpenAssetIO, an open standard to improve interoperability between content creation tools and asset management systems.
 

 FilmLight GmbH

FilmLight

Real-Time Colour Grading

Created Quasar, a lightweight colour grading system for real-time applications like Unreal Engine. Quasar features an intuitive interface, Truelight Colour Spaces for colour management, BaseGrade for linear light space grading, and XGrade for localized edits, simplifying the matching of virtual and live-action elements.
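To illustrate what grading in linear light space means (a toy example, not Quasar's actual operators): in scene-linear values an exposure change is a simple multiplication, so it behaves like physically adding or removing light.

```python
import numpy as np

def expose(linear_rgb: np.ndarray, stops: float) -> np.ndarray:
    """Exposure offset in linear light: each stop doubles or halves the
    light energy. Toy illustration only; not FilmLight's BaseGrade maths."""
    return linear_rgb * (2.0 ** stops)

# Raising scene-linear mid-grey by one stop doubles its value.
mid_grey = np.array([0.18, 0.18, 0.18])
print(expose(mid_grey, 1.0))  # -> [0.36 0.36 0.36]
```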
 

BRAINSTORM MULTIMEDIA SL

Brainstorm

Integrated Virtual Production

Integrated MAX-R XR tools into its InfinitySet virtual production engine, including a Color Correction Plugin linked to FilmLight's Quasar, VPET integration for real-time scene adjustments via tablet, and an enhanced virtual teleportation system with a cost-effective synchronized tracking solution. They also created a library of virtual assets for testing.
 

ARRI

Production Metadata Integration

Focused on production efficiency through reliable metadata and integrated workflows. They advanced all-IP production and developed IP control and metadata protocols (CAP and MCP) to collect extensive on-set metadata with temporal consistency, validated within MAX-R XR workflows. This metadata helps reconstruct camera movements in virtual scenes, reducing costs and speeding up operations.
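Temporally consistent, timestamped metadata is what lets a virtual scene re-create the physical camera's motion frame by frame. A minimal sketch of the idea, assuming a simple per-sample record; the field names are illustrative and not the CAP/MCP wire format.

```python
from bisect import bisect_left
from dataclasses import dataclass

@dataclass
class CameraSample:
    """Hypothetical timestamped on-set metadata record (illustrative fields)."""
    timestamp: float          # seconds on a shared production clock
    focal_length_mm: float
    position: tuple           # (x, y, z) in metres

def sample_at(samples: list, t: float) -> CameraSample:
    """Interpolate camera metadata to an arbitrary render time, which is
    what temporally consistent timestamps make possible."""
    times = [s.timestamp for s in samples]
    i = bisect_left(times, t)
    if i == 0:
        return samples[0]
    if i >= len(samples):
        return samples[-1]
    a, b = samples[i - 1], samples[i]
    w = (t - a.timestamp) / (b.timestamp - a.timestamp)
    lerp = lambda p, q: p + (q - p) * w
    return CameraSample(
        timestamp=t,
        focal_length_mm=lerp(a.focal_length_mm, b.focal_length_mm),
        position=tuple(lerp(p, q) for p, q in zip(a.position, b.position)),
    )
```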
 

 Improbable Worlds Limited - United Kingdom

Improbable

Massive Interactive Events

Developed technology allowing over 10,000 live participants to interact in the same digital space. Their networking and rendering advancements enable large-scale interactions and smaller spatial audio groups. They released a self-serve platform for Unreal Engine developers to create massive interactive live events (MILEs), used by partners like the BBC.
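The announcement does not describe how participants are grouped, but one common technique for forming small spatial audio groups at this scale is grid bucketing, sketched below; the cell size and names are assumptions, not a description of Improbable's networking stack.

```python
from collections import defaultdict

def spatial_audio_groups(positions: dict, cell_size: float = 10.0) -> dict:
    """Bucket participants into grid cells so voice only needs to be mixed
    among nearby avatars. Illustrative sketch of a common technique."""
    groups = defaultdict(list)
    for participant_id, (x, y) in positions.items():
        cell = (int(x // cell_size), int(y // cell_size))
        groups[cell].append(participant_id)
    return dict(groups)

# Example: two nearby avatars share a group; a distant one does not.
print(spatial_audio_groups({"a": (1.0, 2.0), "b": (4.0, 2.0), "c": (55.0, 60.0)}))
```

Mixing voice only within a cell (and optionally its neighbours) keeps per-client audio cost roughly constant even with tens of thousands of participants.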

British Broadcasting Corporation - United Kingdom

BBC

Immersive Live Event Streaming

Developed methods for capturing live performances for interactive exploration in immersive virtual spaces using conventional video cameras and streaming. Their system dynamically switches camera views based on avatar positions and synchronizes virtual lighting with real-world lighting. They successfully streamed a live concert into an interactive virtual world with Improbable. 
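A simple stand-in for the view-switching idea, showing each avatar the feed from the camera nearest to its virtual position, could look like this; the camera layout and names are made up for illustration and are not the BBC's system.

```python
import math

# Hypothetical camera positions in the virtual venue (metres); illustrative only.
CAMERAS = {
    "stage_front": (0.0, 5.0),
    "stage_left": (-12.0, 8.0),
    "stage_right": (12.0, 8.0),
    "wide": (0.0, 30.0),
}

def pick_view(avatar_xy: tuple) -> str:
    """Select the feed from the camera nearest the avatar's position."""
    return min(CAMERAS, key=lambda name: math.dist(avatar_xy, CAMERAS[name]))

print(pick_view((-10.0, 6.0)))  # -> "stage_left"
```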

Filmakademie Baden-Württemberg

Open-Source XR Toolkit

Released open-source tools for XR media production based on their TRACER FOUNDATION framework: VPET (Virtual Production Editing Tools) for real-time scene editing, DataHub for communication infrastructure and interoperability, and AnimHost for connecting animation generators. They enhanced VPET for Brainstorm and Disguise systems and used it in their DIGITAL LOCATIONS tool. AnimHost explores ML-driven XR animation workflows. These tools are available on GitHub. 
 

Universiteit Hasselt - Belgium

Hasselt University

Advanced XR Tracking & Streaming

Developed advanced XR tracking tools for seamless virtual-real synchronization in large spaces, supporting various XR headsets. A mobile version turns real-world objects into VR windows. This technology is applicable beyond entertainment, such as visualizing 3D BIM metadata in AR for industries like construction. They also created a framework for progressively streaming XR graphical content using glTF over HTTP/3, allowing users to experience scenes faster.
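The general shape of progressive glTF delivery is to fetch the small JSON scene description first and stream the heavy buffers and textures afterwards, so a coarse scene can render before the transfer completes. A minimal sketch under that assumption follows; the prioritisation order and the `fetch` placeholder (which would be backed by an HTTP/3 client) are illustrative and not UHasselt's actual framework.

```python
import json
from typing import Callable, Iterator, Tuple

def progressive_gltf_load(fetch: Callable[[str], bytes],
                          gltf_url: str) -> Iterator[Tuple[str, bytes]]:
    """Yield glTF resources in a coarse-to-fine order so the client can
    start rendering before the whole scene has arrived. `fetch` stands in
    for an HTTP/3 GET; URIs are assumed absolute for brevity."""
    scene = json.loads(fetch(gltf_url))
    yield ("scene", json.dumps(scene).encode())
    # Geometry buffers first, so a coarse untextured scene appears quickly...
    for buf in scene.get("buffers", []):
        if "uri" in buf:
            yield (buf["uri"], fetch(buf["uri"]))
    # ...then textures, which progressively refine the scene's appearance.
    for img in scene.get("images", []):
        if "uri" in img:
            yield (img["uri"], fetch(img["uri"]))
```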

CREW

Live Large-Area XR Performance

Developed innovative large-area, participative live XR performances like "Anxious Arrivals" (site-specific XR) and "Alert" (industry evacuation simulation). These tested new technologies and narrative potential with real audiences. They collaborated with UHasselt on a vendor-agnostic large area tracking solution and used wireless streaming for high-quality VR, testing and optimizing with UPF. 

 


This project has received funding from the European Union's Horizon Europe research and innovation programme under Grant Agreement No. 101070072.