
Extending Reality



Department of Industrial and Systems Engineering

December 20, 2023

By Arina Bokas


Augmented reality (AR) presents an exciting opportunity to fundamentally redesign how human beings interact with information, making it a highly desirable technology in many industrial sectors. Despite all the excitement, however, it is still not clear how the perception of computer-generated information overlaid atop physical reality affects people’s experiences and performance. Utilizing OU’s Human-Centered Engineering Laboratory and Augmented Reality Center, Hyungil Kim, Ph.D., assistant professor of industrial and systems engineering, is exploring innovative approaches to understanding human interactions with emerging technologies and transforming this knowledge into the design of human-machine-environment systems.

With the goal of facilitating human-technology partnerships in future workplaces, Dr. Kim investigates alternative methods for the requirements analysis, design, prototyping, and evaluation of extended reality applications.

“Yesterday’s approaches to interface design are insufficient to support the new way of interaction. To realize the full potential of this novel technology, I strongly believe that AR must be more usable and useful than it currently is, and in this area, human factors research can have unique and significant contributions,” says Dr. Kim, whose recent work has focused on examining human depth perception in AR.

“For example, focal depth mismatch between AR graphics and their real-world referents can cause perceptual problems and require users to switch focus between digital and physical objects. Or AR graphics may cause visual and cognitive distraction by narrowing users’ attention at the time when they must interact with both information on the display and changes in the environment,” he adds.

AR has been actively explored as a way to reduce driver distraction, most often by projecting computer-generated graphics onto the driver’s forward field of view. AR head-up displays (HUDs) allow drivers to perceive information without taking attention away from the road. However, the salience, frequent changes, and visual clutter of AR graphics can also create distraction.

One of Dr. Kim’s recent studies focused on developing a novel approach to quantifying the distraction potential of AR applications by measuring driver eye glance distribution and awareness of environmental elements.

“This research aimed at developing up-to-date methods specifically for AR HUDs with the goal of informing drivers while minimizing distraction. To accomplish this, we developed a new method for quantifying the visual and cognitive distraction potential of AR HUDs by measuring driver visual attention, situation awareness, confidence, and workload,” Dr. Kim says.
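
To give a concrete sense of the glance-based measures such an evaluation builds on, the sketch below computes a simple eye glance distribution and a long-glance rate from hypothetical gaze records. It is an illustrative example only; the data, gaze targets, and 2-second threshold are assumptions, not the study’s actual instrumentation or analysis.

    from collections import defaultdict

    # Hypothetical glance records: (gaze target, glance duration in seconds).
    # Real studies derive these from eye-tracker output; values here are made up.
    glances = [
        ("road", 2.4), ("hud", 0.8), ("road", 1.9),
        ("mirror", 0.6), ("hud", 1.1), ("road", 3.2),
    ]

    def glance_distribution(records):
        """Return each gaze target's share of total glance time."""
        totals = defaultdict(float)
        for target, duration in records:
            totals[target] += duration
        total_time = sum(totals.values())
        return {target: t / total_time for target, t in totals.items()}

    def long_glance_rate(records, target="hud", threshold=2.0):
        """Fraction of glances at `target` longer than `threshold` seconds,
        a common proxy for visually demanding displays."""
        durations = [d for t, d in records if t == target]
        return sum(d > threshold for d in durations) / len(durations) if durations else 0.0

    print(glance_distribution(glances))   # e.g. {'road': 0.75, 'hud': 0.19, 'mirror': 0.06}
    print(long_glance_rate(glances))      # 0.0 for this toy data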

Another aspect of Dr. Kim’s research centers on human-machine teaming in intelligent transportation systems (ITS). Many vehicles today feature advanced technologies such as driver monitoring systems, advanced driver assistance systems (ADAS), and driving automation systems (DAS). These developments have the potential to improve transportation safety and access to mobility for transportation-challenged populations. As a result, driving increasingly requires collaboration between the human and the machine.

“The problem that often arises is that people don’t use such systems as intended, mostly due to misunderstanding of the systems’ capabilities and limitations. Moreover, the real-world use and impact of this novel technology on transportation safety are not fully understood yet, which is why I evaluate human-machine interactions in ITSs through driving simulator studies, test-track experimentations, and naturalistic driving studies,” explains Dr. Kim, whose transportation research takes a human-centered approach to the development and evaluation of ITSs.

Supported by industry partners such as Honda, GM, and Google, as well as government agencies, Dr. Kim conducted a study to investigate driver interactions with driving automation. The study examined existing naturalistic driving data collected from 50 participants over 12 months. All participants drove personally owned SAE Level 2 vehicles, which provided partial automation through ADAS features that assisted with, and could take control of, acceleration, braking, and steering.

The results showed that, during 235 safety-critical events, DAS was employed 47 times and people misused it in 57% of the cases. For example, when a DAS feature was activated, drivers were sometimes compelled to take over control manually in response to other vehicles. DASs also fell short of driver expectations in typical driving situations, such as approaching stopped vehicles and negotiating curves.

“This study contributes to a better understanding of the capabilities and limitations of early production SAE L2 vehicles, the prevalence of the unintended use of DASs, and drivers’ perceptions of these new technologies. We hope that the findings may inform the development of human-machine interfaces and training programs to reduce the unintended use of DASs and the associated potential safety consequences,” says Dr. Kim.

Most recently, Dr. Kim and his team, which now includes two doctoral students, Shruthi Venkatesha Murthy and Ahmad Albawa'neh, and an AI master’s student, Zaid Abdelfattah, developed an in-house virtual reality driving simulator based on CARLA in the Human-Centered Engineering Lab. CARLA provides open digital assets (urban layouts, buildings, vehicles) created to support the development, training, and validation of autonomous driving systems. The team plans to use this simulator, together with the Unreal Engine game development platform, to prototype various multi-modal human-machine interfaces and study human-machine teaming in future transportation systems.
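
For readers unfamiliar with CARLA, the short sketch below shows the general shape of its Python client API: connecting to a running simulator, loading one of its open urban maps, and spawning a vehicle under autopilot. It is a minimal illustration of the platform the team builds on, not the lab’s own simulator code; the map and vehicle choices are assumptions for the example.

    import carla  # CARLA's Python client library

    # Connect to a CARLA server running locally on the default port.
    client = carla.Client("localhost", 2000)
    client.set_timeout(10.0)

    # Load one of CARLA's open urban maps (map name assumed for illustration).
    world = client.load_world("Town03")

    # Spawn a vehicle from the blueprint library at a predefined spawn point.
    blueprint = world.get_blueprint_library().filter("vehicle.*")[0]
    spawn_point = world.get_map().get_spawn_points()[0]
    vehicle = world.spawn_actor(blueprint, spawn_point)

    # Hand control to CARLA's built-in autopilot so scripted traffic can run
    # while human-machine interface prototypes are tested on top.
    vehicle.set_autopilot(True)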