
[Defense] Towards Real-Time Collaborative Immersion with Large-Scale Virtual Environments and Digital Twins via Multiuser, Multimodal Holographic AR/XR Interface

In Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy

Khang Tran

will defend his doctoral proposal

Towards Real-Time Collaborative Immersion with Large-Scale Virtual Environments and Digital Twins via Multiuser, Multimodal Holographic AR/XR Interface

Abstract

Digital twins are increasingly used throughout the lifecycle of physical entities, providing rich data about their physical counterparts for remote, real-time observation and management. Augmented reality (AR) enables immersion into multi-dimensional data. High-quality, dynamic digital twins on AR platforms can positively impact many fields, including education, training, research, clinical practice, manufacturing, and the automotive industry. Creating such digital twins requires collaboration between multiple software platforms. While many file formats can store 3D digital twin models, sharing 3D data across applications so that the strengths of each can be harnessed to create high-quality, multidimensional digital twins has long been challenging; it requires a common file format that enables real-time, efficient processing along the pipeline. Universal Scene Description (USD) is an open-source file format for robust and scalable interchange and augmentation of 3D scenes from various sources; it was recently adopted by NVIDIA in its Omniverse platform and standardized by the Alliance for OpenUSD (AOUSD). High-fidelity USD can be transformative for interactively immersing researchers and end users in 3D/4D data. AR immersion into USD digital twin data requires communication between servers or cloud facilities and users' head-mounted display (HMD) or hand-held display (HHD) devices. In this PhD work, I present a software application that bridges the gap between multimodal AR immersion and USD data, offering real-time, multi-user AR immersion into USD data to expand the potential use of USD in digital twins and to enable AR immersion for time-critical tasks and workflows. Building on this software, an astronaut-training application is being developed to experiment with, test, and evaluate it.

Thursday, April 24, 2025
1:00 PM - 2:00 PM

PGH 501B

Dr. Nikolaos Tsekos, research advisor

Faculty, students, and the general public are invited.