Jose James
Haptics-centered Multi-modal Human-Robot Interfaces

March 6, 2026, 1:00 pm – 2:00 pm
Location: E101

Abstract

The growing demand for immersive, intuitive, and physically meaningful interaction in Virtual Reality (VR) and Human–Robot Interaction (HRI) underscores the limitations of traditional visual–auditory interfaces. By integrating tactile and kinesthetic haptic feedback with synchronized audio‑visual cues, this work enhances realism and interactivity in both virtual and remote environments. We present a unified framework for haptics‑enhanced multi‑modal, multi‑sensory interfaces designed to improve digital twin interaction and motor skill training. Elevating touch to a primary modality addresses a critical gap in current VR and teleoperation systems, where visual feedback alone cannot convey essential physical properties such as texture, stiffness, or weight.

Our initial pilot study focuses on vibro‑tactile texture perception, demonstrating how enriched tactile cues improve user engagement and perceptual accuracy. The proposed system integrates bilateral force feedback, high‑fidelity cutaneous actuation, vibro‑tactile rendering, and a novel sensory fusion strategy that prioritizes haptic information during contact‑rich events, mitigating issues such as visual occlusion and video latency. This approach enhances embodiment and realism when interacting with digital twins used for simulation, training, and remote operations.
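The contact-prioritized fusion strategy described above could be sketched, purely for illustration, as a force-gated weighting rule: below a contact threshold the interface trusts vision, and during contact-rich events the haptic channel's weight ramps up. All function names, thresholds, and constants here are hypothetical assumptions, not the system's actual implementation.

```python
def fusion_weights(contact_force_n: float, threshold_n: float = 0.5) -> tuple[float, float]:
    """Return (haptic_weight, visual_weight), summing to 1.0.

    Illustrative sketch only: below the contact-force threshold, vision
    dominates; above it, the haptic weight ramps up and saturates, so
    haptic cues take priority during contact-rich events.
    """
    if contact_force_n <= threshold_n:
        return 0.2, 0.8  # free motion: rely mostly on the visual channel
    # Ramp the haptic weight from 0.2 toward 0.9 as force exceeds the threshold
    ramp = min(1.0, (contact_force_n - threshold_n) / 2.0)
    haptic = 0.2 + 0.7 * ramp
    return haptic, 1.0 - haptic
```

A real system would likely smooth these weights over time to avoid abrupt feedback switching, but the gating idea is the same: haptics overrides vision precisely when occlusion and video latency make visual feedback least reliable.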

Comparisons with vision‑dominant baselines show measurable gains in task completion time, force regulation accuracy, and cognitive load reduction. Looking forward, integrating advanced AI techniques will enable adaptive synchronization of multi‑modal feedback, supporting next‑generation enactive interfaces with broad implications for surgical robotics, hazardous material handling, remote exploration, and high‑fidelity digital twin applications.


Use Your Cell Phone as a Document Camera in Zoom

What you will need to have and do

  • Download the mobile Zoom app (from the App Store or Google Play)
  • Have your phone plugged in
  • Set up a video stand or phone holder

From Computer

Log in and start your Zoom session with participants

From Phone

  • Start the Zoom session on your phone app (suggest setting your phone to “Do not disturb” since your phone screen will be seen in Zoom)
  • Type in the Meeting ID and Join
  • Do not use the phone audio option, to avoid audio feedback with your computer
  • Select “share content” and “screen” to share your cell phone’s screen in your Zoom session
  • Select “start broadcast” in the Zoom app. The home screen of your cell phone is now being shared with your participants.

To use your cell phone as a makeshift document camera

  • Open the camera app on your phone (swipe to switch apps)
  • Start in photo mode and aim the camera at the materials you would like to share
  • Position the materials to get the best view; you can check how the shared view looks in the main Zoom session