Knowledge Technology at HRI '24
19 March 2024
Our group members Fares Abawi and Di Fu attended the HRI '24 conference in Boulder, CO, USA (March 12-13, 2024), where they presented the following papers:
"Wrapyfi: A Python Wrapper for Integrating Robots, Sensors, and Applications across Multiple Middleware"
Authors: Fares Abawi, Philipp Allgeuer, Di Fu, and Stefan Wermter
Abstract: Message-oriented and robotics middleware play an important role in facilitating robot control, abstracting complex functionality, and unifying communication patterns between sensors and devices. However, using multiple middleware frameworks presents a challenge when integrating different robots within a single system. To address this challenge, we present Wrapyfi, a Python wrapper supporting multiple message-oriented and robotics middleware, including ZeroMQ, YARP, ROS, and ROS 2. Wrapyfi also provides plugins for exchanging deep learning framework data without additional encoding or preprocessing steps. Using Wrapyfi eases the development of scripts that run on multiple machines, thereby enabling cross-platform communication and workload distribution. Finally, we present the three communication schemes that form the cornerstone of Wrapyfi's communication model, along with examples that demonstrate their applicability.
DOI: Link
The code for this paper can be found here.
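The core idea the abstract describes, decoupling user code from the chosen middleware behind a single uniform interface, can be sketched in plain Python. Note that this is an illustrative pattern only: the class, decorator, and backend names below are hypothetical and do not reflect Wrapyfi's actual API; see the paper and repository for the real interface.

```python
# Sketch of a middleware-agnostic wrapper: a decorator routes a function's
# return value through whichever backend was registered under a given name.
# All names here are hypothetical, chosen only to illustrate the pattern.

class Communicator:
    backends = {}

    @classmethod
    def register_backend(cls, name):
        """Register a backend class (e.g. a ZeroMQ or YARP adapter) under a name."""
        def wrap(backend_cls):
            cls.backends[name] = backend_cls()
            return backend_cls
        return wrap

    @classmethod
    def publish(cls, topic, backend):
        """Decorator: after the function runs, send its result to the backend."""
        def wrap(func):
            def inner(*args, **kwargs):
                msg = func(*args, **kwargs)
                cls.backends[backend].send(topic, msg)
                return msg
            return inner
        return wrap


@Communicator.register_backend("inproc")
class InProcBackend:
    """A stand-in backend that just records messages in memory."""
    def __init__(self):
        self.log = []

    def send(self, topic, msg):
        self.log.append((topic, msg))


@Communicator.publish("/hello", backend="inproc")
def greet():
    return {"note": "hello from one process"}


greet()
print(Communicator.backends["inproc"].log)
# → [('/hello', {'note': 'hello from one process'})]
```

Swapping `backend="inproc"` for a different registered name would reroute the same user code through another transport without changing the decorated function, which is the kind of decoupling the abstract attributes to Wrapyfi.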
"Human Impression of Humanoid Robots Mirroring Social Cues"
Authors: Di Fu, Fares Abawi, Philipp Allgeuer, and Stefan Wermter
Abstract: Mirroring non-verbal social cues such as affect or movement can enhance human-human and human-robot interactions in the real world. The robotic platform and control method also shape people's perception of human-robot interaction. However, few studies have compared robot imitation across different platforms and control methods. Our research addresses this gap with two experiments comparing people's perception of affective mirroring between the iCub and Pepper robots, and of movement mirroring between vision-based and Inertial Measurement Unit (IMU)-based control of the iCub. We found that the iCub robot was perceived as more humanlike than the Pepper robot when mirroring affect, and that the vision-controlled iCub outperformed the IMU-controlled one in the movement mirroring task. Our findings suggest that the robotic platform affects people's perception of a robot's mirroring during HRI, and that the control method contributes to the robot's mirroring performance. Our work sheds light on the design and application of different humanoid robots in the real world.
DOI: Link
You can find the YouTube video for this paper here.