Toward Learning to Detect and Predict Contact Events on Vision-based Tactile Sensor
Published in Conference on Robot Learning (CoRL 2019), 2019
Recommended citation: Y. Zhang, W. Yuan, Z. Kan, M. Wang. "Toward Learning to Detect and Predict Contact Events on Vision-based Tactile Sensor." Conference on Robot Learning, CoRL 2019.
Abstract
In essence, a successful grasp boils down to correct responses to multiple contact events between the fingertips and the object. In most scenarios, direct tactile sensing is sufficient to distinguish these contact events. However, due to the high dimensionality of tactile information, classifying spatiotemporal tactile signals with conventional model-based methods is difficult. In this work, we propose to predict and classify tactile signals using deep learning methods, seeking to enhance the adaptability of the robotic grasping system to external event changes that may lead to grasping failure. We develop a deep learning framework, collect over 3000 trials of tactile image sequences with a vision-based tactile sensor, and integrate the trained neural network into a contact event-based robotic grasping system. In grasping experiments, contact detection yields a 48\% increase in object-lifting success rate, and slip prediction provides significantly higher robustness under unexpected loads than open-loop grasps, demonstrating that integrating the proposed framework into a robotic grasping system substantially improves picking success and the capability to withstand external disturbances.
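To illustrate the kind of spatiotemporal classifier the abstract describes, below is a minimal sketch of a network that encodes each tactile image with a small CNN and aggregates the sequence with an LSTM before classifying the contact event. This is not the paper's architecture: the layer sizes, the three-class label set (e.g., no-contact / contact / slip), and the input resolution are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class ContactEventNet(nn.Module):
    """Per-frame CNN encoder followed by an LSTM over the tactile image
    sequence; the final hidden state is classified into contact-event
    categories. The class set and all hyperparameters are assumptions
    for illustration, not the authors' published design."""

    def __init__(self, num_classes=3, feat_dim=128, hidden_dim=64):
        super().__init__()
        # Small convolutional encoder applied independently to each frame.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, feat_dim), nn.ReLU(),
        )
        # Temporal model over the per-frame features.
        self.lstm = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, seq):
        # seq: (batch, time, 3, H, W) tactile image sequence.
        b, t = seq.shape[:2]
        feats = self.encoder(seq.flatten(0, 1)).view(b, t, -1)
        _, (h, _) = self.lstm(feats)
        return self.head(h[-1])  # logits over contact-event classes

# Example: a batch of 8 sequences of 10 RGB tactile frames.
logits = ContactEventNet()(torch.randn(8, 10, 3, 64, 64))
print(logits.shape)  # torch.Size([8, 3])
```

The CNN-then-LSTM split is one common way to handle the high dimensionality the abstract mentions: the encoder compresses each tactile image to a low-dimensional feature, and the recurrent layer captures the temporal pattern (e.g., the onset of slip) across frames.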