A better virtual reality experience
A University of Texas at Arlington computer scientist hopes to understand the quality of users’ experiences in virtual reality (VR) by using artificial intelligence (AI) to quantify physiological reactions to immersive videos.
Ming Li, associate professor of computer science and engineering, earned a $600,000 grant from the National Science Foundation for her research. Yingying Zhu, assistant professor in the department, is co-principal investigator. Wei Li at Georgia State University (GSU) is also collaborating on the project and will receive $250,000 from the grant to investigate how to protect user data privacy without compromising quantification accuracy.
Li and her team hope to build a novel system to assess users’ quality of experience when engaging with 360-degree VR immersive videos. The team will use eye gaze trackers, sensors and internal cameras on VR devices to capture behavioral and physiological data and create a model to assess users’ engagement with the content they are viewing.
For instance, according to Li’s preliminary studies, users who are more engaged in a VR scene tend to look in one direction or focus on one object. The rate at which they blink, their pupil size, how fast they move their head and their facial expressions are also measured. The team will process that sensor data and build an AI model that provides a quantitative analysis of the experience, which could be delivered in real time to service providers like Meta and YouTube.
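As a rough sketch of what such a pipeline could look like (the feature definitions, the 10-second window and the choice of a random-forest regressor below are illustrative assumptions, not the team’s actual model), the signals described above might be summarized and scored like this:

```python
# Hypothetical sketch: turning the behavioral signals named in the article
# (gaze concentration, blink rate, pupil size, head-motion speed) into a
# single engagement score. Feature choices and the simple regressor are
# illustrative assumptions, not the research team's published design.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def extract_features(gaze_xy, blink_times, pupil_diam, head_speed, window_s=10.0):
    """Summarize one window of raw sensor streams as a fixed-length feature vector."""
    gaze_spread = np.std(gaze_xy, axis=0).mean()   # low spread ~ focused on one spot
    blink_rate = len(blink_times) / window_s       # blinks per second
    pupil_mean = np.mean(pupil_diam)               # average pupil diameter
    head_motion = np.mean(np.abs(head_speed))      # average head-motion speed
    return np.array([gaze_spread, blink_rate, pupil_mean, head_motion])

# Train on labeled sessions (feature vectors -> engagement scores), then
# predict a score for each new window of sensor data in real time.
rng = np.random.default_rng(0)
X_train = rng.random((200, 4))      # placeholder feature vectors
y_train = rng.random(200)           # placeholder engagement labels in [0, 1]
model = RandomForestRegressor(n_estimators=50).fit(X_train, y_train)

window = extract_features(
    gaze_xy=rng.random((300, 2)),            # gaze samples in the window
    blink_times=[1.2, 4.8, 7.5],             # seconds at which blinks occurred
    pupil_diam=rng.normal(3.5, 0.2, 300),    # pupil diameter samples
    head_speed=rng.normal(0.0, 0.1, 300),    # head angular-speed samples
)
score = model.predict(window.reshape(1, -1))[0]
print(f"estimated engagement score: {score:.2f}")
```

In practice the model would be trained on labeled viewing sessions and run continuously on short windows of incoming sensor data, which is what would make real-time feedback to providers possible.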
“This information could allow video service providers to enhance the end-user experience because they will be able to better engage with their users and render their videos better,” Li said. “If we can assess quality of experience in real time, providers can adjust how much network resource they assign to a particular user and make the quality better per an individual user’s need. It’s not possible in the current framework, but we hope this can be a first step toward that kind of user interaction.”
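The article describes this only at a high level, but a toy policy suggests how a real-time quality-of-experience score might feed a provider’s resource decisions; the bitrate tiers, thresholds and scores below are invented purely for illustration:

```python
# Illustrative sketch of per-user resource adjustment driven by a real-time
# quality-of-experience (QoE) score. The tiers and thresholds are made up;
# the article only describes the general idea.
def pick_bitrate_kbps(qoe_score, available_bandwidth_kbps):
    """Give more bandwidth to users whose measured experience is suffering."""
    tiers = [2000, 8000, 20000, 40000]    # ascending quality levels for 360-degree video
    if qoe_score < 0.4:                   # experience degraded: try the highest tier
        target = tiers[-1]
    elif qoe_score < 0.7:                 # middling experience: step up one tier
        target = tiers[-2]
    else:                                 # user is satisfied: free headroom for others
        target = tiers[1]
    return min(target, available_bandwidth_kbps)

for user, (score, bandwidth) in {"A": (0.35, 25000), "B": (0.85, 25000)}.items():
    print(user, pick_bitrate_kbps(score, bandwidth), "kbps")
```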
The GSU group will determine how to keep any shared data secure without sacrificing modeling accuracy.
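The article does not say which privacy technique the GSU group will pursue. One common approach to sharing behavioral data without exposing raw measurements is local differential privacy, sketched here only as an example of the trade-off involved:

```python
# Hypothetical example: each headset adds calibrated Laplace noise to its
# feature vector before sharing it, so the provider never sees the raw
# physiological measurements. This is one possible approach, not necessarily
# the one the GSU group will adopt.
import numpy as np

def privatize(features, sensitivity=1.0, epsilon=0.5):
    """Smaller epsilon means more noise, hence stronger privacy."""
    rng = np.random.default_rng()
    noise = rng.laplace(0.0, sensitivity / epsilon, size=features.shape)
    return features + noise

raw = np.array([0.12, 0.30, 3.4, 0.05])  # e.g. gaze spread, blink rate, pupil size, head motion
print(privatize(raw))                     # noisy vector that is safer to share off-device
```

Stronger noise protects individual readings but degrades any model trained on the shared data, which is the accuracy trade-off the article refers to.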
Li said her work is part of a “bigger vision” that could impact online gaming, video conferencing and other networking services.
“The classic network techniques are designed without thinking of the end user’s perspective or input, but what if user-centered networking is possible?” Li asked. “With this kind of measurement, we can give content providers insight into real-time user experiences. They can adjust their service-provisioning strategies accordingly and the user will play a greater role in future decisions.”
- Written by Jeremy Agor, College of Engineering