Santosh Thoduka, Nico Hochgeschwender, Juergen Gall and Paul G. Plöger
Abstract: An object handover between a robot and a human is a coordinated action that is prone to failure for reasons such as miscommunication, incorrect actions, and unexpected object properties. Existing work on handover failure detection and prevention focuses on preventing failures due to object slip or external disturbances. However, there is a lack of datasets and evaluation methods that consider unpreventable failures caused by the human participant. To address this deficit, we present the multimodal Handover Failure Detection dataset, which consists of failures induced by the human participant, such as ignoring the robot or not releasing the object. We also present two baseline methods for handover failure detection: (i) a video classification method using 3D CNNs and (ii) a temporal action segmentation approach that jointly classifies the human action, the robot action, and the overall outcome. The results show that video is an important modality, but incorporating force-torque data and gripper position helps improve failure detection and action segmentation accuracy.
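The abstract notes that combining video with force-torque and gripper-position data improves detection. As a minimal illustration of one common way to combine modalities (score-level late fusion, a sketch for intuition only, not the paper's actual architecture; the scores and weights below are hypothetical):

```python
def fuse_scores(modality_scores, weights):
    """Late fusion: weighted average of per-modality class probabilities.

    modality_scores: dict mapping modality name -> list of class probabilities
    weights: dict mapping modality name -> relative weight (normalized below)
    """
    num_classes = len(next(iter(modality_scores.values())))
    total_w = sum(weights[m] for m in modality_scores)
    fused = [0.0] * num_classes
    for m, probs in modality_scores.items():
        w = weights[m] / total_w
        for i, p in enumerate(probs):
            fused[i] += w * p
    return fused

# Hypothetical per-modality probabilities over three example outcomes:
# [success, human-induced failure, object slip]
scores = {
    "video":        [0.2, 0.7, 0.1],
    "force_torque": [0.1, 0.6, 0.3],
    "gripper_pos":  [0.3, 0.5, 0.2],
}
weights = {"video": 0.6, "force_torque": 0.25, "gripper_pos": 0.15}
fused = fuse_scores(scores, weights)
prediction = max(range(len(fused)), key=fused.__getitem__)  # index 1: failure
```

In practice each modality's probabilities would come from its own network (e.g. a 3D CNN for video), and fusion can also be done at the feature level; this sketch only shows the score-level variant.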
Please cite the paper as follows:
@inproceedings{thoduka2024_icra,
  author    = {Thoduka, Santosh and Hochgeschwender, Nico and Gall, Juergen and Pl\"{o}ger, Paul G.},
  title     = {{A Multimodal Handover Failure Detection Dataset and Baselines}},
  booktitle = {2024 IEEE International Conference on Robotics and Automation (ICRA)},
  year      = {2024},
  pages     = {17013--17019},
  doi       = {10.1109/ICRA57147.2024.10610143}
}