Tactile dual-arm robot achieves bimanual tasks using AI

Using AI, an innovative bimanual robot displays tactile sensitivity approaching human-level dexterity.


Bimanual robotic manipulation with tactile feedback will be key to human-level robot dexterity. Compared with a single-arm setup, two arms offer greater maneuverability, flexibility, and a larger workspace, making them a useful and natural way of manipulating larger or more difficult objects. However, the bimanual setting remains less explored, partly because of the limited availability of suitable hardware and the complexity of designing effective controllers for tasks with relatively large state-action spaces.

Researchers at the University of Bristol have developed a new Bi-Touch system that allows robots to carry out manual tasks by sensing what to do from a digital helper.

The research shows how an AI agent interprets its environment through tactile and proprioceptive feedback and then controls the robots’ behaviors, enabling precise sensing, gentle interaction, and effective object manipulation to accomplish robotic tasks.

“With our Bi-Touch system, we can easily train AI agents in a virtual world within a couple of hours to achieve bimanual tasks that are tailored towards the touch. And more importantly, we can directly apply these agents from the virtual world to the real world without further training,” lead author Yijiong Lin from the Faculty of Engineering explained. “The tactile bimanual agent can solve tasks even under unexpected perturbations and manipulate delicate objects in a gentle way.”

The tactile dual-arm robotic system was designed using recent advances in AI and robotic tactile sensing. Researchers constructed a virtual world (simulation) containing two robot arms equipped with tactile sensors. They then designed reward functions and a goal-update mechanism that encourage the robot agents to learn to achieve the bimanual tasks.
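As a loose illustration of that idea (not the authors' actual reward design), a distance-based reward paired with a goal-update rule might look like the short Python sketch below; the pose variables, waypoint list, and tolerance value are hypothetical.

    import numpy as np

    # Hypothetical sketch: reward is the negative distance to the current goal,
    # and a goal-update step advances to the next waypoint once the object is
    # close enough. Names and the tolerance value are illustrative only.

    GOAL_TOLERANCE = 0.01  # metres (assumed)

    def distance_reward(object_pose, goal_pose):
        """Closer to the goal means a higher (less negative) reward."""
        return -np.linalg.norm(np.asarray(object_pose) - np.asarray(goal_pose))

    def maybe_update_goal(object_pose, goal_waypoints, goal_index):
        """Advance to the next goal waypoint once the current one is reached."""
        if distance_reward(object_pose, goal_waypoints[goal_index]) > -GOAL_TOLERANCE:
            goal_index = min(goal_index + 1, len(goal_waypoints) - 1)
        return goal_index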

In addition, the robot uses Deep Reinforcement Learning (Deep-RL) to learn bimanual skills. It enables the robot to make decisions, discover effective ways to perform tasks, and learn from trial and error, akin to training a dog with rewards and punishments.

When the robot succeeds in performing a task, it gets a reward, and when it fails, it learns what not to do. The AI agent is visually blind, relying only on proprioceptive feedback (a body's ability to sense its own movement, action, and location) and tactile feedback.
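The underlying pattern is the familiar reinforcement-learning loop: observe, act, receive a reward, and update. The self-contained Python sketch below illustrates only that loop; the toy environment, observation sizes, and random policy are placeholders, not the Bi-Touch code.

    import numpy as np

    # Toy stand-in for the trial-and-error loop described above. The environment
    # and the random policy are illustrative placeholders; only the
    # observe -> act -> reward -> learn pattern is the point.

    class ToyTactileEnv:
        """Stands in for a simulated dual-arm setup with touch sensing, no vision."""
        def reset(self):
            # Observation: tactile readings plus joint positions (proprioception).
            return {"tactile": np.zeros(128), "proprio": np.zeros(14)}

        def step(self, action):
            obs = {"tactile": np.random.rand(128), "proprio": np.random.rand(14)}
            reward = -float(np.linalg.norm(action))   # placeholder reward signal
            done = np.random.rand() < 0.01            # placeholder episode end
            return obs, reward, done, {}

    env = ToyTactileEnv()
    obs = env.reset()
    replay_buffer = []

    for step in range(1000):
        action = np.random.uniform(-1.0, 1.0, size=12)  # random policy stand-in
        next_obs, reward, done, _ = env.step(action)
        replay_buffer.append((obs, action, reward, next_obs, done))
        # A real agent would update its policy here from the stored transitions,
        # e.g. with an off-policy deep-RL algorithm; success earns higher reward.
        obs = env.reset() if done else next_obs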

“Our Bi-Touch system showcases a promising approach with affordable software and hardware for learning bimanual behaviors with touch in simulation, which can be directly applied to the real world,” co-author Professor Nathan Lepora said in a statement. “Our developed tactile dual-arm robot simulation allows further research on more different tasks as the code will be open-source, which is ideal for developing other downstream tasks.”

The new development could revolutionize industries such as fruit picking and domestic service, and could eventually help recreate the sense of touch in artificial limbs.

“Our Bi-Touch system allows a tactile dual-arm robot to learn solely from simulation and to achieve various manipulation tasks in a gentle way in the real world,” Yijiong concluded. “And now we can easily train AI agents in a virtual world within a couple of hours to achieve bimanual tasks that are tailored towards the touch.”

Journal reference:

  1. Yijiong Lin, Alex Church, Max Yang, Haoran Li, John Lloyd, Dandan Zhang, Nathan F. Lepora. Bi-Touch: Bimanual Tactile Manipulation With Sim-to-Real Deep Reinforcement Learning. IEEE Robotics and Automation Letters, 2023; DOI: 10.1109/LRA.2023.3295991
