For the first time, the assistance robot GARMI demonstrates that it can combine specific skills directly and support seniors throughout the day. With the help of a digital twin, artificial intelligence and ChatGPT, the care assistant from the Technical University of Munich (TUM) performs caregiving tasks such as bringing water and breakfast to the bedside, booking medical appointments and setting up and facilitating telemedical examinations. It also helps care recipients get out of bed and do rehabilitation exercises.
The GARMI assistance robot is becoming increasingly versatile and intelligent. As researchers from TUM's Munich Institute of Robotics and Machine Intelligence (MIRMI) demonstrated at the 2024 International Conference on Robotics and Automation (ICRA) in Yokohama, Japan, the robot not only understands various commands via ChatGPT but also autonomously executes a wide range of tasks and skills, such as grasping objects, maneuvering safely and communicating with patients. In addition, it books appointments with doctors for telemedical examinations. "GARMI is now able to perform the various individual skills that we have taught him over the last few years, securely and on demand, via ChatGPT," explains Geriatronics project lead Dr. Abdeldjallil Naceri.
Human-robot interaction: tools must be reliable and safe
To achieve this, the researchers are combining several technological innovations. Before attempting real-world human interactions, a digital twin is used to avoid collisions and make sure that the robot's movements are safe. Artificial intelligence (AI) helps GARMI grasp and hand over cups and glasses without spilling liquid. And ChatGPT acts as a link between the robot, patients, physiotherapists and doctors. Naceri draws a parallel between the innovations in GARMI and autonomous driving: "Before a new function like autonomous parking assistance is made available to real-world drivers, many development steps are necessary," says the researcher. "The same is true for care robotics. Because this technology will be used where people are present, it must be 100% safe and reliable."
Care assistant with ChatGPT, digital twin and human-like hands
TUM researchers have made significant progress in three areas in particular:
1. Dexterity: gripping and moving precisely from a distance
Gripping: The researchers have combined a camera, a robotic arm with seven joints, an artificial hand and artificial intelligence to enable GARMI to imitate the way humans grip objects. First, the camera takes a picture of the object to be grasped, and neural networks identify it as a cup, cylinder or ball. Because the camera only sees the object from one side, the system reconstructs the hidden areas, such as the back of a cup, by comparing what it sees with other images and completing it into a full 3D object. A color-graded heat map indicates how likely each candidate reconstruction is to match the object's actual shape. This makes it possible to decide on the ideal hand position for gripping a cup, for example. The complex system now manages to do this correctly nine times out of ten. "After it works with one cup, our system can transfer the methodology to all other cup shapes," says Naceri.
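The minimal Python sketch below illustrates this pipeline in schematic form: a shape-completion step fills in the hidden side of a partially observed object, and candidate hand poses are scored so that the most probable grasp can be chosen. Function names such as `complete_shape` and `score_grasp_candidates`, and the mirroring and scoring heuristics, are illustrative placeholders rather than the actual MIRMI implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def complete_shape(partial_points: np.ndarray) -> np.ndarray:
    """Placeholder for the neural shape-completion step: here we simply
    mirror the visible half to stand in for the hidden side."""
    mirrored = partial_points * np.array([-1.0, 1.0, 1.0])
    return np.vstack([partial_points, mirrored])

def score_grasp_candidates(full_points: np.ndarray, candidates: np.ndarray) -> np.ndarray:
    """Placeholder scoring: prefer hand positions close to the object's
    centroid, standing in for the probability heat map over grasps."""
    centroid = full_points.mean(axis=0)
    distances = np.linalg.norm(candidates - centroid, axis=1)
    return np.exp(-distances)  # higher score = more likely to succeed

# One-sided "camera view" of a cup-like object (random stand-in data).
visible = rng.normal(size=(200, 3)) * [0.03, 0.03, 0.10] + [0.05, 0.0, 0.0]
reconstructed = complete_shape(visible)

# Sample candidate hand positions around the object and pick the best one.
candidates = reconstructed.mean(axis=0) + rng.normal(scale=0.05, size=(50, 3))
scores = score_grasp_candidates(reconstructed, candidates)
best = candidates[np.argmax(scores)]
print("chosen grasp position:", np.round(best, 3))
```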
Moving objects from a distance:
The researchers devised a special experimental setup to investigate whether doctors can work with patients remotely via teleoperation. To do this, they drew simple shapes on a digital graphics tablet. GARMI was equipped with a pen in one hand and a camera in the other. One room away, GARMI's task was to transfer the researchers' drawings onto a screen - in other words, to project a simple drawing into a complex robotic system. It turned out that the best circles, squares and triangles were created when GARMI used the camera autonomously. This finding will feed into future collaboration between physicians and patients, where it is essential, for example, to position ultrasound probes as precisely as possible and to guide movements correctly during rehabilitation exercises.
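As a rough illustration of how a drawing made on a tablet could become motion targets for the pen-holding hand, the sketch below maps 2D tablet pixels onto a drawing plane in the robot's base frame. The frame origin, scaling factor and the circle test stroke are assumptions made for illustration, not details of the published avatar controller.

```python
import numpy as np

def tablet_to_workspace(stroke_px: np.ndarray,
                        origin: np.ndarray,
                        scale_m_per_px: float = 0.0005) -> np.ndarray:
    """Map pixel coordinates from the graphics tablet to 3D waypoints
    on a vertical drawing plane in the robot base frame."""
    x = origin[0] + stroke_px[:, 0] * scale_m_per_px  # horizontal axis
    z = origin[2] - stroke_px[:, 1] * scale_m_per_px  # tablet y grows downward
    y = np.full_like(x, origin[1])                    # constant pen-contact depth
    return np.stack([x, y, z], axis=1)

# A circle drawn on the tablet (in pixels), sampled as evenly spaced points.
t = np.linspace(0.0, 2.0 * np.pi, 100)
circle_px = np.stack([300 + 200 * np.cos(t), 300 + 200 * np.sin(t)], axis=1)

waypoints = tablet_to_workspace(circle_px, origin=np.array([0.4, 0.3, 0.8]))
print(waypoints[:3])  # first few Cartesian targets for the pen-holding hand
```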
Perceiving and navigating surroundings: In a new research paper, the researchers show how tools can be maneuvered around objects. The challenge is to keep an eye on distances to obstacles while correctly assessing the maneuverability of the robot arm with all its joints. If this succeeds, the robot can even evade balls thrown at it.
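A common baseline for this kind of reactive maneuvering is an artificial potential field, sketched below: the tool is pulled toward its goal and pushed away from obstacles that come within a set influence radius. This is a generic textbook approach shown only for illustration, not the wrench-field method from the RETOM paper, and all gains and distances are made-up values.

```python
import numpy as np

def reactive_velocity(tool_pos, goal_pos, obstacles,
                      k_attract=1.0, k_repel=0.05, influence_radius=0.3):
    """Attractive pull toward the goal plus repulsive pushes away from
    obstacles that are closer than `influence_radius` (meters)."""
    v = k_attract * (goal_pos - tool_pos)
    for obs in obstacles:
        offset = tool_pos - obs
        d = np.linalg.norm(offset)
        if 1e-6 < d < influence_radius:
            # Repulsion grows as the tool gets closer to the obstacle.
            v += k_repel * (1.0 / d - 1.0 / influence_radius) * offset / d**2
    return v

tool = np.array([0.0, 0.0, 0.5])
goal = np.array([0.6, 0.0, 0.5])
obstacles = [np.array([0.3, 0.02, 0.5])]  # e.g. a ball entering the tool's path
print(reactive_velocity(tool, goal, obstacles))
```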
2. Safety: The tactile robot has a 1-millisecond reaction time
GARMI processes information at a cycle time of 1 millisecond (ms). This applies equally to perception, interaction and navigation. The force sensors on the robot's arms register the slightest contact and react immediately: if a human accidentally bumps into the robot's arm, it stops within a millisecond for safety reasons. Humans and robots initially meet as digital twins in a virtual environment to rule out accidents. This is essential, as the assistance robot can theoretically reach speeds of up to 20 km/h in a care home. In the computer simulation, GARMI's Safety Motion Unit uses sensor data to register when a person comes too close and slows the robot down; when the person moves away, GARMI speeds up again.
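The speed-limiting behavior can be pictured with the simple sketch below, which scales the permitted speed with the distance to the nearest person. The 20 km/h ceiling comes from the text; the stop and slow-down thresholds and the linear ramp are assumptions, not the actual Safety Motion Unit logic.

```python
MAX_SPEED_KMH = 20.0    # theoretical maximum mentioned in the article
STOP_DISTANCE_M = 0.5   # assumed: stop completely below this distance
SLOW_DISTANCE_M = 2.0   # assumed: start slowing down below this distance

def allowed_speed(distance_to_person_m: float) -> float:
    """Return the permitted speed in km/h for the current human distance."""
    if distance_to_person_m <= STOP_DISTANCE_M:
        return 0.0
    if distance_to_person_m >= SLOW_DISTANCE_M:
        return MAX_SPEED_KMH
    # Linear ramp between the stop and slow-down thresholds.
    fraction = (distance_to_person_m - STOP_DISTANCE_M) / (SLOW_DISTANCE_M - STOP_DISTANCE_M)
    return MAX_SPEED_KMH * fraction

# Example: one evaluation per control cycle using simulated sensor readings.
for distance in (3.0, 1.2, 0.4):
    print(f"{distance:.1f} m -> {allowed_speed(distance):.1f} km/h")
```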
3. Language: ChatGPT uses a list of commands
The AI tool ChatGPT functions as a translator between technology and humans. It has learned various commands such as "Start rehab," "Show me tomorrow's weather," or "Call the doctor." GARMI uses this tool to communicate with patients. The researchers currently have a list of 15 to 20 commands that trigger certain actions. "Potentially, we can expand it as much as we like," says robotics researcher Naceri. "This will make MIRMI one of the first institutes where robots and humans interact using ChatGPT."
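A minimal sketch of this translator role is shown below: the language model is constrained to answer with exactly one entry from a fixed command list, and the chosen entry is then dispatched to a robot skill. The command names, prompt, model name and dispatcher are illustrative assumptions, not MIRMI's actual command set or integration.

```python
from openai import OpenAI  # assumes the `openai` package and an API key are configured

COMMANDS = ["start_rehab", "show_weather", "call_doctor", "bring_water", "none"]

def interpret(utterance: str) -> str:
    """Ask the model to pick one command from the fixed list."""
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": "Map the user's request to exactly one of these commands "
                        f"and reply with that word only: {', '.join(COMMANDS)}"},
            {"role": "user", "content": utterance},
        ],
    )
    command = response.choices[0].message.content.strip()
    return command if command in COMMANDS else "none"  # reject anything off-list

def dispatch(command: str) -> None:
    """Placeholder dispatcher standing in for GARMI's skill execution layer."""
    print(f"executing skill: {command}")

dispatch(interpret("Could you please start my rehabilitation exercises?"))
```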
GARMI still learning to cope with the human environment
The new universal GARMI is now active in a model apartment in Garmisch-Partenkirchen. The main field of research will be the further development of hands capable of performing even more refined tasks. It will be several years before GARMI is finally used in care homes. "It's like the autonomous car," says Naceri. "A lot of progress has already been made, but a few details are still missing before it is ready for the human environment."
Publications
- RETOM: Leveraging Maneuverability for Reactive Tool Manipulation using Wrench-Fields; Felix Eberle, Riddhiman Laha, Haowen Yao, Abdeldjallil Naceri, Luis F.C. Figueredo, Sami Haddadin; ICRA 2024
- Autonomous and Teleoperation Control of a Drawing Robot Avatar; Lingyun Chen, Abdeldjallil Naceri, Abdalla Swikir, Sandra Hirche, Sami Haddadin; ICRA 2024
- Safe-By-Design Digital Twins for Human-Robot Interaction: A Use Case for Humanoid Service Robots; Jon Škerlj, Mazin Hamad, Jean Elsner, Abdeldjallil Naceri, Sami Haddadin; ICRA 2024
- Anthropomorphic Grasping with Neural Object Shape Completion; Diego Hidalgo-Carvajal, Hanzhi Chen, Gemma C. Bettelani, Jaesug Jung, Melissa Zavaglia, Laura Busse, Abdeldjallil Naceri, Stefan Leutenegger, Sami Haddadin; ICRA 2024
- Video: A Day with GARMI: Robot Home Assistance in the Lives of Senior Citizens; S. Bing, J. Skerlj, F. Eberle, A. Teimoorzadeh, X. Chen, M. Forouhar, H. Seghedian, A. Naceri, S. Haddadin; ICRA 2024; https://youtu.be/mL-50vtXHdo