Cyberpunk: A Mixed Reality Experience



About

Magnopus collaborated with industry leaders – Metastage, Alcon Entertainment, and Epic Games – to create Cyberpunk: A Virtual Production Experience, offering visitors a window into how emerging technologies can transform sci-fi film production. 

Visitors don MR headsets and enter a virtual soundstage inspired by the iconic aesthetic of Blade Runner (1982). They are guided through the experience by a volumetric capture of actor Rosa Salazar, star of Alita: Battle Angel (2019), as they see and hear how virtual production tools are shaping the future of cinema. Director Ridley Scott even makes a guest appearance, offering guidance on shot composition within the virtual world.

We set out to create an engaging, immersive virtual production experience that would captivate and inspire museum visitors. The primary challenge was delivering high-quality visuals within the constraints of mobile performance: we had to resolve volumetric capture compatibility issues and optimize heavy visual content for smooth playback. Ultimately, we struck the right balance, delivering an experience that both looks impressive and performs seamlessly on the Meta Quest 3 headset.

My Role & Contributions

My involvement in this project was multifaceted, drawing upon my core strengths in full-stack XR development, Unreal Engine expertise, technical art, and virtual production workflows. I contributed across various stages, from foundational setup to final polish and client handoff.

Engineering & Foundational Setup

From the project's inception, I focused on establishing a robust technical foundation to ensure smooth development and optimal performance on the Quest 3:

  • Project Initialization: Stood up the initial Unreal Engine project, configured version control using Perforce (P4V), and established build pipelines with TeamCity.

  • Target Platform Optimization: Identified early on the need to leverage Meta's custom Unreal Engine branch. This strategic decision allowed us to harness Quest 3-specific rendering features crucial for maximizing performance and visual fidelity on the mobile chipset.

  • Workflow Enhancement: Successfully implemented Unreal Game Sync (UGS) connected to our source control. This streamlined the engine syncing and building process for the entire team, particularly important when working with a source build of the engine.

  • Legacy Asset Integration: Undertook the complex task of migrating the original, nearly 7-year-old Blade Runner UE 4.17 project assets and systems into the modern UE 5.3 environment.

  • Core Systems Development: Built the initial VR interaction systems and configured essential rendering prerequisites tailored for the experience and target hardware (a simplified pawn sketch follows this list).

  • Documentation: Authored initial onboarding documentation to facilitate team ramp-up.
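To make that interaction groundwork concrete, here is a minimal sketch of the kind of headset-tracked pawn that anchors a Quest VR stack in Unreal C++: a playspace origin, an HMD-locked camera, and two motion controllers, with floor-level tracking for a standing museum experience. The class and component names are illustrative, not the project's actual code, and it assumes a dependency on the HeadMountedDisplay module in the project's Build.cs:

```cpp
// MinimalVRPawn.h (illustrative sketch, not the shipped implementation)
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "MinimalVRPawn.generated.h"

UCLASS()
class AMinimalVRPawn : public APawn
{
    GENERATED_BODY()

public:
    AMinimalVRPawn();

protected:
    virtual void BeginPlay() override;

    // Playspace origin; the HMD and controllers track relative to this.
    UPROPERTY(VisibleAnywhere) TObjectPtr<USceneComponent> VROrigin;
    UPROPERTY(VisibleAnywhere) TObjectPtr<class UCameraComponent> HMDCamera;
    UPROPERTY(VisibleAnywhere) TObjectPtr<class UMotionControllerComponent> LeftController;
    UPROPERTY(VisibleAnywhere) TObjectPtr<class UMotionControllerComponent> RightController;
};

// MinimalVRPawn.cpp
#include "MinimalVRPawn.h"
#include "Camera/CameraComponent.h"
#include "MotionControllerComponent.h"
#include "HeadMountedDisplayFunctionLibrary.h"

AMinimalVRPawn::AMinimalVRPawn()
{
    VROrigin = CreateDefaultSubobject<USceneComponent>(TEXT("VROrigin"));
    SetRootComponent(VROrigin);

    HMDCamera = CreateDefaultSubobject<UCameraComponent>(TEXT("HMDCamera"));
    HMDCamera->SetupAttachment(VROrigin);
    HMDCamera->bLockToHmd = true; // camera follows headset tracking

    LeftController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftController"));
    LeftController->SetupAttachment(VROrigin);
    LeftController->SetTrackingMotionSource(TEXT("Left"));

    RightController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("RightController"));
    RightController->SetupAttachment(VROrigin);
    RightController->SetTrackingMotionSource(TEXT("Right"));
}

void AMinimalVRPawn::BeginPlay()
{
    Super::BeginPlay();
    // Floor-level tracking so the in-headset camera height matches the visitor's real height.
    UHeadMountedDisplayFunctionLibrary::SetTrackingOrigin(EHMDTrackingOrigin::Floor);
}
```

The shipped systems were considerably more involved; this shows only the skeleton that interaction features attach to.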

Technical Design & Cinematics

Bridging the gap between engineering and creative vision, my technical design contributions focused on shaping the user experience and narrative flow:

  • VR Experience Design: Developed the foundational VR systems that underpin the user's journey through the virtual soundstage.

  • Cinematic Sequencing: Meticulously crafted the entire narrative flow and cinematic sequence using Unreal Engine's Sequencer, timing events, volumetric capture (volcap) playback, and environmental changes to create an engaging experience (see the playback sketch after this list).

  • High-Fidelity Asset Rendering: Created a dedicated workflow and utilized ray tracing within a separate UE5 project to design and render the final-pixel cinematic for the iconic spinner vehicle animation, ensuring the highest possible quality for this key moment.
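For flavor, the sketch below shows roughly how an experience like this can launch its master cinematic from C++ using Unreal's ULevelSequencePlayer API. The function and variable names are hypothetical and it assumes the LevelSequence module dependency; the actual event timing, volcap playback, and environmental changes are authored as tracks inside the Sequencer asset itself:

```cpp
#include "LevelSequence.h"
#include "LevelSequencePlayer.h"
#include "LevelSequenceActor.h"

// Hypothetical helper: spawn a player for the master cinematic and begin playback.
void StartMasterCinematic(UWorld* World, ULevelSequence* MasterSequence)
{
    FMovieSceneSequencePlaybackSettings Settings;
    Settings.bPauseAtEnd = true; // hold the final frame instead of despawning everything

    ALevelSequenceActor* SequenceActor = nullptr;
    ULevelSequencePlayer* Player = ULevelSequencePlayer::CreateLevelSequencePlayer(
        World, MasterSequence, Settings, SequenceActor);

    if (Player)
    {
        Player->Play();
    }
}
```

Driving the show from a single master sequence keeps narration, volcap playback, lighting, and set changes on one shared timeline, so timing adjustments become a matter of sliding tracks rather than rewiring logic.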

Technical Art & Optimization

Achieving the desired visual standard on mobile hardware required significant technical art input:

  • Performance Optimization: Rigorously optimized shaders, textures, and meshes throughout the environment to maintain high visual fidelity while meeting the strict performance budgets of the Quest 3 (an illustrative tuning sketch follows this list).

  • Volumetric Capture Integration: Troubleshot and refined the integration of Rosa Salazar's volumetric capture using the SVFPlugin, ensuring stable and high-quality playback within the engine.

  • Workflow Refinement: Identified and resolved foundational workflow challenges inherited from the original legacy project, improving stability and efficiency.
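As an illustration of what that tuning can look like in practice, mobile budgets are often expressed as console variables applied at startup and verified with Unreal Insights captures. The helper function and values below are placeholders, not the shipped configuration:

```cpp
#include "HAL/IConsoleManager.h"
#include "ProfilingDebugging/CpuProfilerTrace.h"

// Hypothetical helper: set a console variable by name if it exists.
static void SetCVarFloat(const TCHAR* Name, float Value)
{
    if (IConsoleVariable* CVar = IConsoleManager::Get().FindConsoleVariable(Name))
    {
        CVar->Set(Value, ECVF_SetByGameSetting);
    }
}

void ApplyQuestRenderingBudget()
{
    // This scope shows up as a named event when tracing with Unreal Insights.
    TRACE_CPUPROFILER_EVENT_SCOPE(ApplyQuestRenderingBudget);

    // Placeholder values; actual settings would come from profiling.
    SetCVarFloat(TEXT("vr.PixelDensity"), 1.0f);            // eye-buffer resolution scale
    SetCVarFloat(TEXT("r.MobileContentScaleFactor"), 1.0f); // Android backbuffer scale
}
```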

Client Relations & On-Site Support

Beyond core development, I actively participated in client-facing aspects and deployment preparation:

  • Client Collaboration: Participated directly in client meetings and demonstrations with the Academy Museum team.

  • Deployment Planning: Contributed technical insights to help plan the necessary hardware, physical space layout, and staffing requirements for the museum installation.

  • Training & Handover: Conducted initial onboarding and technical training sessions for Academy Museum staff to ensure they could operate and maintain the experience.

Working on Cyberpunk: A Virtual Production Experience was incredibly rewarding. It successfully translated a complex, high-fidelity concept into an accessible and performant mobile MR experience, offering museum visitors a unique and inspiring glimpse into the future of filmmaking technology.


year

2025

timeframe

4 months

tools

Unreal Engine, C++, Meta Quest 3, Unreal Game Sync, Unreal Insights, P4V, 3ds Max, After Effects, Premiere

category

Magnopus

