The growing demand for immersive, intuitive, and physically meaningful interaction in Virtual Reality (VR) and Human–Robot Interaction (HRI) underscores the limitations of traditional visual–auditory interfaces. By integrating tactile and kinesthetic haptic feedback with synchronized audio‑visual cues, this work enhances realism and interactivity in both virtual and remote environments. We present a unified framework for haptics‑enhanced multi‑modal, multi‑sensory interfaces designed to improve digital twin interaction and motor skill training. Elevating touch to a primary modality addresses a critical gap in current VR and teleoperation systems, where visual feedback alone cannot convey essential physical properties such as texture, stiffness, or weight.
Our initial pilot study focuses on vibro‑tactile texture perception, demonstrating how enriched tactile cues improve user engagement and perceptual accuracy. The proposed system integrates bilateral force feedback, high‑fidelity cutaneous actuation, vibro‑tactile rendering, and a novel sensory fusion strategy that prioritizes haptic information during contact‑rich events, mitigating issues such as visual occlusion and video latency. This approach enhances embodiment and realism when interacting with digital twins used for simulation, training, and remote operations.
Comparisons with vision‑dominant baselines show measurable gains in task completion time, force regulation accuracy, and cognitive load reduction. Looking forward, integrating advanced AI techniques will enable adaptive synchronization of multi‑modal feedback, supporting next‑generation enactive interfaces with broad implications for surgical robotics, hazardous material handling, remote exploration, and high‑fidelity digital twin applications.