Revolutionizing Urban Transport: Exploring Human-Autonomy Interaction with Digital Twin Technology

Researchers from Clemson University developed an immersive digital twin framework to facilitate safe interaction between autonomous and human-driven vehicles, leveraging mixed-reality interfaces for enhanced testing and coexistence. This innovative system bridges real and virtual worlds, promoting safer and more efficient urban transportation.


CoE-EDP, VisionRI | Updated: 26-06-2024 15:24 IST | Created: 26-06-2024 15:24 IST

The deployment of autonomous vehicles (AVs) alongside human-driven vehicles demands harmonious coexistence on the roads, built on mutual understanding and coordination. Researchers from Clemson University have developed a novel immersive digital twin framework to address this challenge, enabling experimentation with the interaction dynamics between autonomous and non-autonomous traffic participants. The framework employs a mixed-reality human-machine interface that allows human drivers and autonomous agents to safely observe and interact with each other in edge-case scenarios. Known as the AutoDRIVE Ecosystem, the framework bridges the gap between the real and virtual worlds, integrating physical and digital assets in real time to enable safe and reliable exploration of human-autonomy coexistence. The benefit is twofold: autonomous vehicles can be tested and improved in human-in-the-loop scenarios, while human drivers become more familiar with autonomous technology.

Seamless Integration of Physical and Digital Worlds

The versatility of the framework is validated through user experience experiments employing four levels of immersion and distinct user interfaces. A case study on uncontrolled intersection traversal demonstrates the framework's efficacy in validating interactions between human-driven, autonomous, and connected autonomous vehicles. The framework, which is openly available, aims to guide future research on human-autonomy coexistence. It consists of several components: physical and digital twins of traffic participants, observation modalities, and interaction modalities. By bridging the real2sim and sim2real gaps, it supports research on both vehicle autonomy and smart-city infrastructure. Key features include digital twinning, mixed reality, human-in-the-loop interactions, vehicle-to-everything (V2X) communication, platform-agnostic tools, a modular architecture, and an open-source approach.
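To make the twinning-plus-V2X idea concrete, here is a minimal sketch (not the AutoDRIVE implementation; all class and field names are hypothetical) of the core loop such a framework relies on: a digital twin ingests state measurements from its physical counterpart and rebroadcasts them over a simulated V2X channel so other participants, virtual or real, can react.

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    """Minimal state shared between a physical vehicle and its digital twin."""
    x: float = 0.0      # position (m)
    y: float = 0.0      # position (m)
    speed: float = 0.0  # m/s

class DigitalTwin:
    """Mirrors a physical vehicle's state and rebroadcasts it over a
    simulated V2X channel (here, plain callbacks) so other traffic
    participants can observe and react to it."""

    def __init__(self, vehicle_id: str):
        self.vehicle_id = vehicle_id
        self.state = VehicleState()
        self.subscribers = []  # V2X listeners: other twins, HMIs, loggers

    def sync(self, measured: VehicleState) -> None:
        """real2sim step: ingest the latest physical measurement and notify peers."""
        self.state = measured
        for callback in self.subscribers:
            callback(self.vehicle_id, self.state)  # V2X-style broadcast

# Usage: a virtual peer vehicle subscribes to the twin's V2X feed.
received = []
twin = DigitalTwin("opencav-1")
twin.subscribers.append(lambda vid, s: received.append((vid, s.speed)))
twin.sync(VehicleState(x=12.5, y=0.0, speed=8.3))
```

In the actual framework this synchronization runs continuously in real time and in both directions (sim2real as well), but the publish-subscribe pattern above captures why a connected vehicle can "see" a peer before its onboard sensors do.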

Real-World Testing with OpenCAV

The research employed the Open Connected and Automated Vehicle (OpenCAV) from Clemson University for testing. This vehicle is equipped with a comprehensive sensor suite and a robust computing platform, enabling real-time data collection and interaction with the digital twin. The digital twin of OpenCAV was calibrated against its physical counterpart, enabling accurate simulations and co-simulations with virtual environments. The user study involved 16 participants with varied demographics and driving, gaming, and virtual reality (VR) experience levels. The study used a mixed factorial design to test four observation and four interaction interfaces. Results indicated a preference for dynamic head-mounted display (HMD) combined with a driving rig, which provided the highest user involvement, sensory fidelity, and user adaptation/immersion. Static HMDs caused confusion due to non-responsive head movements.

Future Directions and Research Opportunities

In a dedicated case study, an edge-case urban driving scenario was devised to evaluate the serviceability of the framework. The scenario involved a semi-autonomous vehicle cutting across the ego vehicle at an intersection, forcing a panic-braking maneuver. The study compared the performance of different ego-vehicle configurations: autonomous, connected autonomous, and human-driven. The connected autonomous vehicles (CAVs) performed better thanks to early detection of peer vehicles through V2V communication, while human drivers using the dynamic HMD reacted more effectively owing to better observation. The comparative analysis used key performance indicators such as position, speed, throttle, and brake commands, which together indicate reaction time and effort.
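The kind of analysis described above can be illustrated with a small sketch (a hypothetical log format, not the study's actual tooling): given timestamped speed and brake-command traces, estimate when the driver or controller reacted after the peer vehicle became observable, and how far the ego vehicle traveled before coming to rest.

```python
def reaction_metrics(times, speeds, brake_cmds, event_time, threshold=0.1):
    """Estimate reaction time and stopping distance from one logged run.

    times       -- timestamps (s), ascending
    speeds      -- ego vehicle speed at each timestamp (m/s)
    brake_cmds  -- normalized brake command in [0, 1]
    event_time  -- when the peer vehicle became observable (s)
    threshold   -- brake level counted as an intentional reaction
    """
    # Reaction time: first brake application above threshold after the event.
    reaction_time = None
    for t, b in zip(times, brake_cmds):
        if t >= event_time and b >= threshold:
            reaction_time = t - event_time
            break

    # Stopping distance: trapezoidal integration of speed from the event
    # until the vehicle first comes to rest.
    distance = 0.0
    for i in range(1, len(times)):
        if times[i] <= event_time:
            continue
        dt = times[i] - times[i - 1]
        distance += 0.5 * (speeds[i] + speeds[i - 1]) * dt
        if speeds[i] == 0.0:
            break
    return reaction_time, distance

# Example run: the peer appears at t=1 s; braking begins at t=2 s.
rt, dist = reaction_metrics(
    times=[0, 1, 2, 3, 4],
    speeds=[10, 10, 5, 0, 0],
    brake_cmds=[0.0, 0.0, 0.8, 1.0, 1.0],
    event_time=1.0,
)
# rt = 1.0 s, dist = 10.0 m
```

Comparing such metrics across configurations is what reveals the pattern the study reports: earlier detection (as with V2V) shortens the effective reaction time and therefore the distance traveled before a safe stop.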

The overall findings showed that all configurations safely avoided collisions with the virtual peer vehicle, though their performance varied. The purely autonomous vehicle executed the most aggressive acceleration and braking yet traveled the farthest before coming to a safe stop. The CAV, by contrast, stopped much earlier and more smoothly owing to its ability to detect the peer vehicle ahead of time through V2V communication. Human drivers' performance depended on the observation interface, with dynamic HMD users achieving the shortest stopping distances because they could track the approaching peer vehicle by turning their heads.

This immersive digital twin framework facilitates the safe exploration of interactions between autonomous and non-autonomous traffic participants. It addresses the limitations of both real-world and simulation-based methodologies by providing a robust platform for safe and reliable testing. The findings from the user and case studies underscore the effectiveness and adaptability of the framework, laying a solid foundation for future research in the field of autonomy-oriented digital twins and the coexistence of human drivers and autonomous systems. Potential avenues for further research include developing innovative immersion techniques along the reality-virtuality continuum, incorporating haptic and auditory feedback to expand the framework, involving more participants in user studies to improve statistical significance, and validating more complex driving scenarios with different scales and configurations of vehicles. This framework's open-source nature and modular architecture are expected to guide future developments in the realm of digital twins and the broader research on human-autonomy coexistence.

  • FIRST PUBLISHED IN:
  • Devdiscourse