Robotics Research for Safer Roads, Better Health

“Using Generative AI (Artificial Intelligence), I’m generating datasets to train my cars to see and hear better. I want my car to recognize sirens, for example.”

– DR. CHAN-JIN (CJ) CHUNG
COLLEGE OF ARTS AND SCIENCES

“Currently, my car doesn’t have ears!” said Dr. Chan-Jin (CJ) Chung, professor of computer science and director of the Computer Science Robotics Labs. He was referring to LTU’s autonomous electric vehicle, the subject of much of his and his students’ research. “Using Generative AI (artificial intelligence), I’m generating datasets to train my cars to see and hear better. I want my car to recognize sirens, for example.”

Dr. Eric Martinson, associate professor and interim chair of the Mathematics and Computer Science Department, has an extensive background in robot perception, including robot listening. He and Chung plan to work together to develop a sound recognition system that Chung will incorporate into his research to enable his car to “hear.” Chung explained, “I need to train my car to recognize or ‘see’ rare or unusual objects on the road, like police cars, a cart with a donkey, or animals, as well as train sound sensors to recognize when an ambulance or fire truck is approaching and to control the car properly. Right now, we rely on our own ears to detect these warning sounds.”
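
The article does not describe how such a sound recognition system would be built. As a rough illustration only, here is a minimal sketch of one common approach, a small convolutional network in PyTorch that classifies log-mel spectrograms as siren or no siren; every name and parameter below is hypothetical, not taken from Chung’s or Martinson’s work.

    import torch
    import torch.nn as nn
    import torchaudio

    # Turn a raw waveform into a log-mel spectrogram, a 2-D "image" of the sound.
    mel = torchaudio.transforms.MelSpectrogram(sample_rate=16000, n_fft=1024, n_mels=64)

    class SirenClassifier(nn.Module):
        """Tiny CNN that labels an audio clip as siren / no siren (hypothetical)."""
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                nn.Linear(32, 2),              # logits for [no_siren, siren]
            )

        def forward(self, waveform):           # waveform: (batch, samples)
            spec = mel(waveform)               # (batch, n_mels, frames)
            spec = torch.log(spec + 1e-6)      # log scale stabilizes training
            return self.net(spec.unsqueeze(1))

    clip = torch.randn(1, 16000)               # one second of (random) audio
    logits = SirenClassifier()(clip)           # train with nn.CrossEntropyLoss

Generative AI would enter on the data side: synthesizing siren clips at varied distances, Doppler shifts, and traffic-noise levels could augment scarce real recordings, which is presumably what Chung means by generating datasets to train his car to hear.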

This research focus will be part of an NSF REU (National Science Foundation Research Experiences for Undergraduates) grant that Chung was awarded in 2022. He is working to renew that grant and add two new directions:

  • The development of deep learning-based algorithms for self-driving at night when it is raining. “With the combination of slippery roads and night driving or impaired vision, most drivers, including me, have difficulty driving at night when it is raining,” admitted Chung.
  • The creation of intelligent functions/interfaces for elderly and handicapped drivers by integrating with customized LLMs (large language models).

In another research endeavor, Chung is “customizing (fine-tuning) LLMs for robotic-assisted surgery,” he said. “For example, LLMs for prostate cancer surgeries can be used to extract data and medical information about the surgery and the patient in real time via the doctor’s natural language voice commands. Surgeons can use the information during the surgery to make necessary decisions in real time. My ultimate goal is to get a da Vinci robotic surgical system. The LLMs can be connected to the system.”
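
Chung’s description implies a pipeline of speech-to-text followed by a fine-tuned LLM grounded in the patient’s record. The sketch below shows that data flow only; transcribe and complete are placeholders for whatever speech and LLM backends would actually be used, and the record fields are invented.

    # Hypothetical data flow for voice-driven retrieval during surgery.
    # transcribe() and complete() stand in for real speech-to-text and
    # fine-tuned LLM backends; none of these names come from the article.

    PATIENT_RECORD = {                      # invented example fields
        "procedure": "robot-assisted radical prostatectomy",
        "psa_history": [4.1, 5.8, 7.2],
        "allergies": ["penicillin"],
    }

    def answer_voice_query(audio_clip: bytes, transcribe, complete) -> str:
        question = transcribe(audio_clip)   # e.g. "What is the latest PSA value?"
        prompt = (
            "You assist during robotic-assisted prostate surgery.\n"
            f"Patient record: {PATIENT_RECORD}\n"
            f"Surgeon asks: {question}\n"
            "Answer concisely, using only the record above."
        )
        return complete(prompt)             # LLM answers in real time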

“How can they [domestic robots] support the elderly and people with disabilities so they can continue to live in their homes safely with the use of a mobile manipulator?”

– DR. ERIC MARTINSON
COLLEGE OF ARTS AND SCIENCES

Martinson is also interested in solving some of the current problems with domestic robots, particularly in health care: how they can support the elderly and people with disabilities so they can continue to live safely in their homes with the help of a mobile manipulator. This means moving beyond vacuuming to other household activities. His students are currently investigating how to program LTU’s Stretch 2 robot to pick up laundry from the floor and to retrieve recyclable bottles and return them to the recycling bin. “One of the big issues with trying to get these robots into the home,” said Martinson, “is deploying machine learning models to these platforms, because we have to rely on publicly available data to make it happen.” For instance, the thousands of t-shirt images found in public datasets show shirts laid out open and flat, which is not what a crumpled blue t-shirt looks like in a pile of other objects.

Dr. Eric Martinson (far left) poses with the Stretch 2 robot and students (l to r) Devson Butani, Jacob Hallett, Annalia Schoenherr, and Kaushik Mehta.

More generally, Martinson explained that the images available online are often not very good for training robots to recognize and interact with the generic objects found around a house. But new methods for simulating environments can help. “Neural Radiance Fields (NeRFs),” he said, “[are] a system that allows you to capture a video of your home and tries to reconstruct images, not only in color but in depth.” By coupling NeRFs with 3D models or other generative methods, he proposes to create training data customized to an individual’s home. In the future, Martinson envisions domestic robots, like medical assistants, that can augment their existing machine learning solutions with new models generated automatically during unboxing.
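
At its core, a NeRF is a small neural network that maps a 3-D point to a color and a density; integrating those values along camera rays renders the scene in both color and depth, as Martinson describes. The generic sketch below (not his implementation) shows that rendering step in PyTorch:

    import torch
    import torch.nn as nn

    def positional_encoding(x, n_freqs=6):
        """Sin/cos features let the MLP represent fine spatial detail."""
        feats = [x]
        for k in range(n_freqs):
            feats += [torch.sin(2**k * torch.pi * x), torch.cos(2**k * torch.pi * x)]
        return torch.cat(feats, dim=-1)

    class TinyNeRF(nn.Module):
        """Maps a 3-D point to an RGB color and a volume density."""
        def __init__(self, n_freqs=6):
            super().__init__()
            in_dim = 3 + 3 * 2 * n_freqs
            self.mlp = nn.Sequential(
                nn.Linear(in_dim, 128), nn.ReLU(),
                nn.Linear(128, 128), nn.ReLU(),
                nn.Linear(128, 4),                   # 3 color channels + 1 density
            )

        def forward(self, pts):
            out = self.mlp(positional_encoding(pts))
            return torch.sigmoid(out[..., :3]), torch.relu(out[..., 3])

    def render_ray(model, origin, direction, n_samples=64, near=0.5, far=4.0):
        """Volume rendering: accumulate color and depth along one camera ray."""
        t = torch.linspace(near, far, n_samples)
        pts = origin + t[:, None] * direction        # sample points on the ray
        rgb, sigma = model(pts)
        alpha = 1.0 - torch.exp(-sigma * (t[1] - t[0]))   # per-segment opacity
        trans = torch.cumprod(
            torch.cat([torch.ones(1), 1.0 - alpha[:-1] + 1e-10]), dim=0)
        weights = trans * alpha                      # contribution of each sample
        color = (weights[:, None] * rgb).sum(dim=0)  # rendered RGB
        depth = (weights * t).sum()                  # expected depth along the ray
        return color, depth

    color, depth = render_ray(TinyNeRF(), torch.zeros(3), torch.tensor([0.0, 0.0, 1.0]))

Once a home has been reconstructed this way, rendering it from new viewpoints, optionally with 3D object models inserted, would yield the kind of customized training data Martinson proposes.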

In a joint effort with the College of Engineering, the College of Arts and Sciences has purchased robot dogs, the “Big Dogs,” quadrupeds that are programmed to move around in human environments. “They can climb stairs and go over rocks,” Martinson said. Dr. Chung and his students have been studying and programming a cadre of small robot dogs as research tools for some time. Videos of synchronized dancing robot dogs and soccer-playing robot dogs demonstrate the capabilities of these smaller robots.

“Advances in artificial intelligence coupled with advances in mechatronics (the combination of machines, control systems, and software),” said Martinson, “that are in play today will enable us to bring the types of safety and convenience we are talking about into our homes.”

Other Stories

Use Your Cell Phone as a Document Camera in Zoom

What you will need to have and do

  • Download the mobile Zoom app (from the App Store or Google Play)
  • Keep your phone plugged in
  • Set up a video stand or phone holder

From Computer

Log in and start your Zoom session with participants

From Phone

  • Start the Zoom session in your phone app (consider setting your phone to “Do Not Disturb,” since your phone screen will be seen in Zoom)
  • Type in the Meeting ID and join
  • Do not use the phone audio option, to avoid feedback
  • Select “share content” and then “screen” to share your cell phone’s screen in your Zoom session
  • Select “start broadcast” in the Zoom app. The home screen of your cell phone is now being shared with your participants.

To use your cell phone as a makeshift document camera

  • Swipe to switch apps and open the camera app on your phone
  • Start in photo mode and aim the camera at whatever materials you would like to share
  • Position your materials to get the best view; you can check how the shot looks in the main Zoom session