Effective estimation of contact force and torque using vision-based tactile sensors.

In this project, we propose a method that utilizes the Helmholtz-Hodge Decomposition (HHD) to decompose the displacement field of a soft skin as it makes contact with objects. The decomposition yields curl-free, divergence-free, and harmonic fields. Governed by multiple physical constraints, the surface forces, including tangential and normal forces as well as torsion, are computed by a model trained on a small dataset in a supervised fashion. In comparison, our decomposition results in a more data-efficient computation pipeline, an advantage given the difficulty of collecting large contact datasets.
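The abstract does not describe how the decomposition is implemented. As a rough, hypothetical sketch only, an FFT-based HHD of a 2D displacement field could look like the following, assuming the field is sampled on a regular grid and treated as periodic; on a real, bounded tactile skin the harmonic component must instead come from boundary conditions, and all names below are illustrative rather than taken from this work.

```python
import numpy as np

def hhd_2d(u, v):
    """FFT-based Helmholtz-Hodge decomposition of a 2D displacement field.

    u, v : 2D arrays holding the x- and y-components of the field (assumed periodic).
    Returns the curl-free, divergence-free, and harmonic components, each as an (x, y) pair.
    """
    ny, nx = u.shape
    kx = np.fft.fftfreq(nx).reshape(1, nx)
    ky = np.fft.fftfreq(ny).reshape(ny, 1)
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0  # avoid division by zero; the k = 0 mode is handled separately

    U, V = np.fft.fft2(u), np.fft.fft2(v)

    # Project each non-zero mode onto its wavevector -> curl-free (irrotational) part
    proj = (kx * U + ky * V) / k2
    Uc, Vc = kx * proj, ky * proj
    Uc[0, 0] = Vc[0, 0] = 0.0

    # The remainder of the non-zero modes is the divergence-free (solenoidal) part
    Ud, Vd = U - Uc, V - Vc
    Ud[0, 0] = Vd[0, 0] = 0.0

    # On a periodic domain only the constant (k = 0) mode is harmonic
    Uh, Vh = np.zeros_like(U), np.zeros_like(V)
    Uh[0, 0], Vh[0, 0] = U[0, 0], V[0, 0]

    to_real = lambda A, B: (np.fft.ifft2(A).real, np.fft.ifft2(B).real)
    return to_real(Uc, Vc), to_real(Ud, Vd), to_real(Uh, Vh)
```

In such a sketch the curl-free component relates to compressive (normal-load) deformation, the divergence-free component to shear/torsional deformation, which is one plausible reason the decomposition can make force and torque regression more data-efficient.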


Toward learning to detect and predict contact events on vision-based tactile sensors.

In essence, a successful grasp boils down to correct responses to multiple contact events between the fingertips and the object. In most scenarios, direct tactile sensing is adequate to distinguish these contact events. However, due to the high dimensionality of tactile information, classifying spatiotemporal tactile signals with conventional model-based methods is difficult. In this work, we propose to predict and classify tactile signals using deep learning, seeking to enhance the adaptability of a robotic grasping system to external events that may lead to grasping failure. We develop a deep learning framework, collect over 3000 trials of tactile image sequences with a vision-based tactile sensor, and integrate the trained neural network into a contact-event-based robotic grasping system. In grasping experiments, contact detection yields a 48% increase in object lifting success rate, and slip prediction provides significantly higher robustness under unexpected loads compared with open-loop grasps, demonstrating that integrating the proposed framework into a robotic grasping system substantially improves picking success rates and the capability to withstand external disturbances.
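The abstract does not specify the network architecture. As one plausible sketch, and not the model used in this work, a per-frame CNN encoder followed by an LSTM over the tactile image sequence could classify contact events; the class count, feature dimension, and input size below are all hypothetical.

```python
import torch
import torch.nn as nn

class TactileEventClassifier(nn.Module):
    """CNN encoder per tactile frame, then an LSTM over the frame sequence.

    Input:  (batch, time, 3, H, W) tactile image sequences
    Output: (batch, num_events) logits over contact-event classes
            (e.g. no contact / contact made / slip), purely illustrative.
    """
    def __init__(self, num_events=3, feat_dim=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim), nn.ReLU(),
        )
        self.lstm = nn.LSTM(feat_dim, feat_dim, batch_first=True)
        self.head = nn.Linear(feat_dim, num_events)

    def forward(self, x):
        b, t, c, h, w = x.shape
        feats = self.encoder(x.reshape(b * t, c, h, w)).reshape(b, t, -1)
        _, (h_n, _) = self.lstm(feats)   # last hidden state summarizes the sequence
        return self.head(h_n[-1])        # contact-event logits per sequence


if __name__ == "__main__":
    model = TactileEventClassifier()
    seq = torch.randn(4, 10, 3, 120, 160)   # 4 sequences of 10 tactile frames
    print(model(seq).shape)                 # -> torch.Size([4, 3])
```

In a contact-event-based grasping loop, the controller would poll such a classifier on a sliding window of recent frames and trigger a closing, lifting, or regrasping action when the predicted event changes.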

Hybrid Jamming for Bio-inspired Soft Robotic Fingers

This work presents a novel design for bio-inspired soft robotic fingers based on a hybrid jamming principle that integrates layer jamming and particle jamming. The finger combines a fiber-reinforced soft pneumatic actuator with a hybrid jamming substrate. Taking advantage of the different characteristics of layer jamming and particle jamming, the substrate is designed with three chambers filled with layers (functioning as bones) and two chambers filled with particles (functioning as joints). The layer regions and particle regions interlock with each other to guarantee load transfer from the fixed end of the finger to the fingertip. With the proposed design, the finger is endowed with both bending shape control and variable stiffness capabilities. Theoretical analysis is conducted to predict the stiffness variation of the proposed finger at different vacuum levels, and experimental tests are performed to evaluate the finger's shape control and stiffness tuning effectiveness. Experimental results show that the proposed finger can achieve a 5.52-fold stiffness enhancement at its primary position. Lastly, we fabricate a gripper and perform grasping demonstrations on several objects. Results show that the gripper is able to transition between a low-stiffness state for adaptive grasping and a high-stiffness state for robust holding.
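For intuition only, and not the stiffness model derived in this work, a textbook layer-jamming estimate treats the layer stack as an Euler-Bernoulli beam: when unjammed, the n layers bend independently; when vacuum pressure locks them together, they act as a single beam of n times the thickness, giving an idealized n-squared stiffness ratio. The real finger is limited by friction, the vacuum level, and the particle-filled joints, which is why the measured enhancement (5.52x) is far below this bound. All parameter values below are hypothetical.

```python
# Illustrative layer-jamming estimate only; not the model used in this work.

def bending_stiffness(E, b, h, n, jammed):
    """Flexural rigidity EI of an n-layer stack (simple Euler-Bernoulli beam)."""
    if jammed:
        return E * b * (n * h) ** 3 / 12.0   # layers locked: one beam of thickness n*h
    return n * E * b * h ** 3 / 12.0         # layers slide freely: n independent beams


if __name__ == "__main__":
    E, b, h, n = 2.0e9, 0.02, 0.2e-3, 5      # hypothetical layer material/geometry (Pa, m, m)
    soft = bending_stiffness(E, b, h, n, jammed=False)
    stiff = bending_stiffness(E, b, h, n, jammed=True)
    print(f"ideal stiffness enhancement ~ {stiff / soft:.1f}x")   # -> n^2 = 25x upper bound
```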