Cross-Reality Re-Rendering: Manipulating between Digital and Physical Realities

15 Nov 2022  ·  Siddhartha Datta ·

Personalized reality has arrived. Rapid development in AR/MR/VR enables users to augment or diminish their perception of the physical world, and robust tooling for digital interface modification enables users to change how their software operates. As digital realities become an increasingly impactful aspect of human lives, we investigate the design of a system that enables users to manipulate the perception of both their physical and digital realities. Users can inspect their view history from either reality and generate interventions that render interoperably across realities in real time. Personalized interventions can be generated with mask, text, and model hooks, and collaboration between users scales the availability of interventions. We verify our implementation against our design requirements with cognitive walkthroughs, personas, and scalability tests.
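The abstract's hook-based interventions can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the `Renderer`, hook names, and the toy dictionary "frame" are all invented here, standing in for the paper's mask, text, and model hooks that rewrite a user's view regardless of which reality it came from.

```python
# Hypothetical sketch of hook-based interventions (all names invented here).
# A "frame" is a toy stand-in for a rendered view: region name -> content.
# The same hook pipeline applies whether the frame originated in a physical
# (AR/MR) view or a digital (software UI) view, which is the interoperability
# idea the abstract describes.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

Frame = Dict[str, str]


@dataclass
class Renderer:
    """Applies registered intervention hooks to each incoming frame."""
    hooks: List[Callable[[Frame], Frame]] = field(default_factory=list)

    def register(self, hook: Callable[[Frame], Frame]) -> None:
        self.hooks.append(hook)

    def render(self, frame: Frame) -> Frame:
        # Interventions compose in registration order.
        for hook in self.hooks:
            frame = hook(frame)
        return frame


# Mask-style hook: diminish a region by blanking it out.
def mask_ads(frame: Frame) -> Frame:
    return {k: ("" if k == "ad_banner" else v) for k, v in frame.items()}


# Text-style hook: rewrite textual content in place.
def hide_spoilers(frame: Frame) -> Frame:
    return {k: v.replace("spoiler", "[hidden]") for k, v in frame.items()}


renderer = Renderer()
renderer.register(mask_ads)
renderer.register(hide_spoilers)
out = renderer.render({"ad_banner": "BUY NOW", "caption": "big spoiler here"})
```

A model hook would fit the same interface: any callable from frame to frame, including a learned transform, can be registered without changing the renderer.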
