AnyRotate: Gravity Invariant In-Hand Object Rotation with Sim-to-Real Touch

University of Bristol
Conference on Robot Learning (CoRL) 2024

Multi-axis in-hand object rotation invariant to gravity direction using dense featured sim-to-real touch.

Abstract

In-hand manipulation is an integral component of human dexterity. Our hands rely on tactile feedback for stable and reactive motions to ensure objects do not slip away unintentionally during manipulation. For a robot hand, this level of dexterity requires extracting and utilizing rich contact information for precise motor control. In this paper, we present AnyRotate, a system for gravity-invariant multi-axis in-hand object rotation using dense featured sim-to-real touch. Our policy is trained in simulation with rich tactile feedback and transferred zero-shot to the real world. We introduce an approach for bridging the sim-to-real gap for a dense tactile representation. Our experiments highlight the benefit of detailed contact information when handling objects with varying properties. In the real world, we demonstrate a successful transfer of the dense tactile policy, generalizing to a diverse range of objects across various rotation axes and hand directions and outperforming other forms of low-dimensional touch. Interestingly, despite having no explicit slip detection, rich multi-fingered tactile sensing can implicitly detect object movement within a grasp and provide reactive behavior that improves the robustness of the policy, highlighting the importance of information-rich tactile sensing for in-hand manipulation.

Hardware


We use the Allegro Hand (16 DoF) with tactile sensors attached to the fingertips.

Tactile Sim-to-Real via Feature Extraction


We train observation models to bridge the tactile sim-to-real gap. The observation models are trained on contact data collected against a force/torque (F/T) sensor: we move the tactile sensor over a flat stimulus mounted above the F/T sensor and collect tactile images along with the corresponding robot poses and contact forces as labels. We then train a CNN to extract the same contact features used in simulation. The dense tactile features we consider are continuous readings of contact pose and contact force.
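To make the pipeline concrete, below is a minimal sketch of such an observation model in PyTorch. The framework, network sizes, image resolution, and label dimensions are illustrative assumptions, not the paper's exact architecture: a small CNN regresses continuous contact pose and contact force from a single tactile image, supervised by the robot poses and F/T readings gathered during data collection.

import torch
import torch.nn as nn

class TactileFeatureNet(nn.Module):
    # CNN regressor: tactile image -> (contact pose, contact force).
    # pose_dim=2 (e.g. two contact angles) and force_dim=1 are assumptions.
    def __init__(self, pose_dim: int = 2, force_dim: int = 1):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(128, pose_dim + force_dim)
        self.pose_dim = pose_dim

    def forward(self, x: torch.Tensor):
        z = self.head(self.encoder(x))
        return z[:, :self.pose_dim], z[:, self.pose_dim:]  # pose, force

# One training step on dummy data; real labels come from the robot poses
# (contact pose) and the F/T sensor readings (contact force).
model = TactileFeatureNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
images = torch.rand(8, 1, 64, 64)   # batch of tactile images (dummy data)
pose_labels = torch.rand(8, 2)      # contact pose labels from robot poses
force_labels = torch.rand(8, 1)     # contact force labels from the F/T sensor
pose_pred, force_pred = model(images)
loss = nn.functional.mse_loss(pose_pred, pose_labels) \
     + nn.functional.mse_loss(force_pred, force_labels)
opt.zero_grad(); loss.backward(); opt.step()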

A visualization of the dense tactile representation used for in-hand object rotation. The center of the shaded region on the dome represents the contact pose, and the size of the shaded region represents the contact force.
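For illustration, here is a minimal sketch of how such a dense tactile observation might be packed for a four-fingered hand. The two-angle pose parameterization and the resulting dimensions are assumptions chosen for the example, not a specification of the paper's observation space.

import numpy as np

def dense_tactile_obs(poses, forces):
    # poses  : (4, 2) array, contact pose on each fingertip dome (e.g. two
    #          contact angles) -- the center of the shaded region above.
    # forces : (4,) array, contact force magnitude -- the size of the region.
    return np.concatenate([np.asarray(poses, dtype=float).ravel(),
                           np.asarray(forces, dtype=float).ravel()])

obs = dense_tactile_obs(np.zeros((4, 2)), np.zeros(4))  # -> 12-dim tactile obs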

Unified policy for any chosen rotation axis
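A hedged sketch of what a unified policy means in practice, assuming the commanded rotation axis is appended to the policy observation as a goal vector so a single policy can rotate about any chosen axis. The exact observation contents (16 joint positions, a 12-dim tactile vector) are illustrative assumptions consistent with the hardware above.

import numpy as np

def policy_obs(joint_pos, tactile_obs, rot_axis):
    # Goal-conditioned observation: one policy, any commanded rotation axis.
    axis = np.asarray(rot_axis, dtype=float)
    axis /= np.linalg.norm(axis)  # unit rotation axis in the hand frame
    return np.concatenate([joint_pos, tactile_obs, axis])

obs = policy_obs(np.zeros(16), np.zeros(12), [0.0, 0.0, 1.0])  # rotate about hand z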

Generalizing to unseen objects

Robust to grasp disturbances

Robust to changing gravity directions

Related Articles

The future lies in a pair of tactile hands

Nathan F. Lepora

Science Robotics 9 (91)


Sim-to-Real Model-Based and Model-Free Deep Reinforcement Learning for Tactile Pushing

Max Yang, Yijiong Lin, Alex Church, John Lloyd, Dandan Zhang, David A.W. Barton, Nathan F. Lepora

IEEE Robotics and Automation Letters (RA-L) 2023
