What is it about?
This paper presents a new system for running Mixed Reality (MR) applications that are processed remotely but still feel fast and responsive to users. MR apps are becoming more advanced, with real-time interaction, sharp visuals, and support for multiple users in different locations. To work smoothly, they rely on powerful networks like 5G/6G and edge cloud systems that handle the heavy processing tasks. The system we propose is designed to support these needs. It includes remote rendering, 3D simulation, and real-time environment detection. We built a prototype to test the idea. We ran experiments on a Beyond 5G testbed and also tested the app with users during a student competition. The results show that with the right combination of smart network controls and in-app techniques, it's possible to keep delays low and deliver a high-quality user experience.
Why is it important?
Handling fast interactions in a Mixed Reality environment is already challenging on its own, and it becomes even harder when remote rendering is involved: current solutions often struggle to keep collisions and object interactions responsive. In this paper, we introduce a new approach to address these issues, Object-Centric Rendering, which allows the system to react more quickly to user movements. In addition, there is skepticism about whether new 5G/6G networks can support multi-user Mixed Reality systems. To address this, we conducted measurements that clearly show these networks can meet the latency requirements, even in demanding multi-user scenarios.
Read the Original
This page is a summary of: Enablers of low-latency immersive interaction in future remote-rendered Mixed Reality applications, March 2025, ACM (Association for Computing Machinery). DOI: 10.1145/3712676.3714448.