New Equipment: Eye Tracker
14 April 2022

Photo: UHH/Knowledge Technology
The eye tracker will be used to record human eye movements, including fixations, scanpaths, and pupil dilations, to train computational models of attention prediction and to support human-robot interaction studies (see the sketch after the publication list below). This work will contribute to the "Cross-Modal Learning" (CML) project A5: Neurorobotic models for crossmodal joint attention and social interaction. Please see our recent work in this direction:
- Fu, D., Abawi, F., Carneiro, H., Kerzel, M., Chen, Z., Strahl, E., Liu, X., Wermter, S.: A trained humanoid robot can perform human-like crossmodal social attention conflict resolution. arXiv preprint arXiv:2111.01906 (2022). (Video Demo)
- Abawi, F., Weber, T., Wermter, S.: GASP: Gated attention for saliency prediction. In Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence, IJCAI-21, pp. 584–591. International Joint Conferences on Artificial Intelligence Organization (2021). (Video Demo)
- Carneiro, H., Weber, C., Wermter, S.: FaVoA: Face-Voice association favours ambiguous speaker detection. In Proceedings of the 30th International Conference on Artificial Neural Networks (ICANN 2021), Lecture Notes in Computer Science, vol. LNCS 12891, pp. 439–450. ENNS, Springer Nature (2021).
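
As a rough illustration of how such recordings are typically used, the sketch below converts a set of fixation points from an eye-tracking session into a smooth saliency map, a common form of training target for attention prediction models like GASP. This is a minimal, hypothetical example; the function name, coordinate convention, and blur width are assumptions, not part of the announced setup.

```python
"""Illustrative sketch: turning recorded fixation points into a saliency map
that can serve as a training target for an attention prediction model.
All names and parameter values here are assumptions for illustration."""

import numpy as np
from scipy.ndimage import gaussian_filter


def fixations_to_saliency_map(fixations, frame_shape, sigma=25.0):
    """Build a continuous saliency map from discrete fixation points.

    fixations   -- iterable of (x, y) pixel coordinates from the eye tracker
    frame_shape -- (height, width) of the viewed stimulus frame
    sigma       -- Gaussian blur in pixels, roughly approximating foveal extent
    """
    height, width = frame_shape
    fixation_map = np.zeros((height, width), dtype=np.float32)

    # Mark each fixation location that falls inside the frame.
    for x, y in fixations:
        col, row = int(round(x)), int(round(y))
        if 0 <= row < height and 0 <= col < width:
            fixation_map[row, col] += 1.0

    # Blur the discrete fixation map into a smooth saliency distribution.
    saliency = gaussian_filter(fixation_map, sigma=sigma)

    # Normalise to a probability distribution over pixels.
    total = saliency.sum()
    return saliency / total if total > 0 else saliency


if __name__ == "__main__":
    # Hypothetical fixations for a 480x640 stimulus frame.
    example_fixations = [(320, 240), (100, 80), (500, 400)]
    saliency_map = fixations_to_saliency_map(example_fixations, (480, 640))
    print(saliency_map.shape, saliency_map.sum())
```

In practice, such maps would be computed per video frame and aggregated over participants before being used as supervision for a saliency or joint-attention model.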