College of Arts and Sciences

Robotics Research for Safer Roads, Better Health

“Using Generative AI (Artificial Intelligence), I’m generating datasets to train my cars to see and hear better. I want my car to recognize sirens, for example.”

– DR. CHAN-JIN (CJ) CHUNG
   COLLEGE OF ARTS AND SCIENCES

“Currently, my car doesn’t have ears!” said Dr. Chan-Jin (CJ) Chung, professor of computer science and director of the Computer Science Robotics Labs. He is referring to LTU’s autonomous electric vehicle, which is the subject of much of Chung and his students’ research. “Using Generative AI (artificial intelligence), I’m generating datasets to train my cars to see and hear better. I want my car to recognize sirens, for example.”

Dr. Eric Martinson, associate professor and interim chair of the Mathematics and Computer Science Department, has an extensive background in robot perception, including robot listening. They plan to work together to develop a sound recognition system that Chung will incorporate in his research to enable his car to “hear.” Chung explained, “I need to train my car to recognize or ‘see’ rare or unusual objects on the road, like police cars, a cart with a donkey, animals, as well as train sound sensors to recognize when an ambulance or fire truck is approaching and to control the car properly. Right now, we rely on our own ears to detect these warning sounds.”
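Neither researcher has published the details of this sound recognition system, but the general approach is well established in audio machine learning: convert short clips of road audio into spectrograms and train a classifier to flag emergency-vehicle sirens. The sketch below is purely illustrative; the label set, network architecture, and choice of PyTorch/torchaudio are assumptions, not a description of the LTU system.

# Illustrative sketch only: a tiny spectrogram classifier for emergency-vehicle
# sounds (siren vs. background traffic). Labels, architecture, and data pipeline
# are assumptions for illustration, not LTU's actual system.
import torch
import torch.nn as nn
import torchaudio

SAMPLE_RATE = 16_000
LABELS = ["background", "siren"]  # hypothetical label set

# Convert a one-second waveform into a log-mel spectrogram "image".
mel = torchaudio.transforms.MelSpectrogram(
    sample_rate=SAMPLE_RATE, n_fft=1024, hop_length=256, n_mels=64
)
to_db = torchaudio.transforms.AmplitudeToDB()

class SirenClassifier(nn.Module):
    """Small CNN over log-mel spectrograms."""
    def __init__(self, n_classes: int = len(LABELS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, n_classes)

    def forward(self, waveform: torch.Tensor) -> torch.Tensor:
        # waveform: (batch, samples) -> spectrogram: (batch, 1, n_mels, frames)
        spec = to_db(mel(waveform)).unsqueeze(1)
        return self.head(self.features(spec).flatten(1))

if __name__ == "__main__":
    model = SirenClassifier()
    dummy = torch.randn(2, SAMPLE_RATE)   # two one-second clips of random noise
    print(model(dummy).shape)             # torch.Size([2, 2]) -> one score per label

In a deployed vehicle, the classifier's output would be one more input to the planner, alongside cameras and lidar, rather than a standalone decision maker.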

This research focus will be part of an NSF REU (National Science Foundation Research Experiences for Undergraduates) grant that Chung was awarded in 2022. He is working on renewing that grant to add:

  • The development of deep learning-based algorithms for self-driving at night when it is raining. “With the combination of slippery roads and night driving or impaired vision, most drivers, including me, have difficulty driving at night when it is raining,” admitted Chung.
  • The creation of intelligent functions and interfaces for elderly drivers and drivers with disabilities by integrating them with customized LLMs (large language models).

In another research endeavor, Chung is “customizing (fine-tuning) LLMs for robotic-assisted surgery,” he said. “For example, LLMs for prostate cancer surgeries can be used to extract data and medical information about the surgery and the patient in real time via the doctor’s natural language voice commands. Surgeons can use the information during the surgery to make necessary decisions in real time. My ultimate goal is to get a da Vinci robotic surgical system. The LLMs can be connected to the system.”
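The article describes the intended architecture (voice command in, grounded answer out) rather than an implementation. As a rough, hypothetical sketch of that pipeline, the Python below stubs out the three stages Chung mentions: transcribing the surgeon's spoken question, retrieving the relevant patient or procedure notes, and passing both to a fine-tuned LLM. Every function body here is a placeholder; no real speech, retrieval, or model API is being invoked.

# Conceptual sketch of the pipeline described above. All function bodies are
# placeholders; the actual speech, retrieval, and LLM components are not
# described in the article.
from dataclasses import dataclass

@dataclass
class PatientRecord:
    name: str
    notes: list[str]   # imaging reports, labs, prior procedures, etc.

def transcribe(audio_bytes: bytes) -> str:
    """Placeholder for a speech-to-text model (e.g., an on-prem ASR system)."""
    return "What was the PSA level on the last lab report?"

def retrieve(query: str, record: PatientRecord) -> list[str]:
    """Naive keyword overlap standing in for a real retrieval step."""
    terms = {t.lower() for t in query.split()}
    return [n for n in record.notes if terms & set(n.lower().split())]

def ask_llm(query: str, context: list[str]) -> str:
    """Placeholder for a call to a fine-tuned, locally hosted LLM."""
    return f"(model answer grounded in {len(context)} retrieved note(s))"

def handle_voice_command(audio_bytes: bytes, record: PatientRecord) -> str:
    query = transcribe(audio_bytes)
    context = retrieve(query, record)
    return ask_llm(query, context)

if __name__ == "__main__":
    # Toy record with a single synthetic note, for demonstration only.
    record = PatientRecord("demo patient", ["2024-05 lab report: PSA value on file"])
    print(handle_voice_command(b"", record))

Grounding the model's answer in retrieved records, rather than letting it answer from memory, is what would make such a system usable in a setting where real-time decisions depend on it.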

“How can they [domestic robots] support the elderly and people with disabilities so they can continue to live in their homes safely with the use of a mobile manipulator?”

– DR. ERIC MARTINSON
   COLLEGE OF ARTS AND SCIENCES

Martinson is also interested in solving some of the current problems with domestic robots, particularly related to health care: how they can support the elderly and people with disabilities so they can continue to live in their homes safely with the use of a mobile manipulator. This means moving beyond vacuuming to other household activities. His students are currently investigating how to program LTU’s Stretch 2 robot to pick up laundry from the floor and to retrieve recyclable bottles and return them to the recycling bin. “One of the big issues with trying to get these robots into the home,” said Martinson, “is deploying machine learning models to these platforms, because we have to rely on publicly available data to make it happen.” For instance, the thousands of t-shirt images found in public datasets show the shirt laid open and flat, which is not what a crumpled blue t-shirt looks like in a pile of other objects.


Dr. Eric Martinson (far left) poses with the Stretch 2 robot and students (l to r) Devson Butani, Jacob Hallett, Annalia Schoenherr, and Kaushik Mehta.

More generally, Martinson explained that the images available online are often not very good for training robots to recognize and interact with generic household objects. But new methods for simulating environments can help. “Neural Radiance Fields (NeRFs),” he said, “[are] a system that allows you to capture a video of your home and tries to reconstruct images, not only in color but in depth.” Coupled with 3D models or other generative methods, he proposes to create training data customized to an individual’s home. In the future, Martinson envisions domestic robots, like medical assistants, that can augment their existing machine learning solutions with new models generated automatically during unboxing.
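As a conceptual sketch of how such a data-generation loop might look, the code below assumes two hypothetical helpers: one that fits a radiance-field model to a walkthrough video, and one that renders color and depth from an arbitrary camera pose. Sampling many poses from the fitted scene would yield a training set customized to that particular home. None of these functions correspond to a real library or to LTU's implementation.

# Sketch of the idea above: fit a NeRF-style model to a home video, then render
# novel color+depth views from sampled camera poses to build a custom dataset.
# fit_scene and render_view are hypothetical placeholders, not a real API.
import math
import random

def fit_scene(video_path: str):
    """Placeholder for fitting a radiance-field model to a walkthrough video."""
    return {"video": video_path}  # stands in for learned scene parameters

def render_view(scene, pose):
    """Placeholder for rendering an RGB image and a depth map at a camera pose."""
    rgb = [[0.0]]    # would be an HxWx3 array from the renderer
    depth = [[1.0]]  # would be an HxW depth map
    return rgb, depth

def sample_pose():
    """Random camera pose inside the reconstructed room (toy parameterization)."""
    return {
        "position": (random.uniform(-2, 2), random.uniform(-2, 2), 1.5),
        "yaw": random.uniform(0, 2 * math.pi),
    }

def build_dataset(video_path: str, n_views: int = 100):
    scene = fit_scene(video_path)
    samples = []
    for _ in range(n_views):
        pose = sample_pose()
        rgb, depth = render_view(scene, pose)
        samples.append({"pose": pose, "rgb": rgb, "depth": depth})
    return samples

if __name__ == "__main__":
    dataset = build_dataset("walkthrough.mp4", n_views=5)
    print(len(dataset), "synthetic views")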


In a joint effort with the College of Engineering, the College of Arts and Sciences has purchased robot dogs, the “Big Dogs,” quadrupeds that are programmed to move around in human environments. “They can climb stairs and go over rocks,” Martinson said. Chung and his students have also been studying and programming a cadre of small robot dogs as research tools for some time. Videos of synchronized dancing robot dogs and soccer-playing robot dogs demonstrate the capabilities of these smaller robots.

“Advances in artificial intelligence coupled with advances in mechatronics (the combination of machines, control systems, and software),” said Martinson, “that are in play today will enable us to bring the types of safety and convenience we are talking about into our homes.”
