Neuroinformatics Group


Robots Exploring their Bodies Autonomously (REBA)

Motivated by the need for complex anthropomorphic robots to manage sophisticated spatial relationships between parts of their body and the environment, together with recent findings from the neurosciences about how the brain solves this challenge by means of a highly adaptable “body schema”, the present proposal pursues the goal of endowing anthropomorphic robots with the ability to adaptively form and maintain a highly flexible body schema.
 
We will use state-of-the-art anthropomorphic robot platforms to explore and analyze representations and learning algorithms for synthesizing body schemas at increasing levels of sophistication. A central, overarching element of the approach is that learning takes the form of self-exploration of the robot's sensory-body apparatus and can therefore proceed fully autonomously.

 

Essential parts of our proposal will be:

  • statistical and geometric learning algorithms specialized for kinematic learning,
  • a hierarchical decomposition of the body schema that allows simulated development by sequentially learning less complex parts,
  • the multimodal inclusion of senses such as vision and haptics, as well as proprioceptive sensors such as joint positions and dynamics,
  • and exploration strategies that are driven by simulated curiosity and mimic human behaviour.

 

Topic 1: Tactile Servoing Framework

 

Introduction: 

The advent of sensor arrays providing tactile feedback with high spatial and temporal resolution calls for new control strategies to exploit this important and valuable sensory channel for grasping and manipulation tasks.

In this topic, we introduce a control framework to realize a whole set of tactile servoing tasks, i.e. control tasks that aim to realize a specific tactile interaction pattern. These range from simple tasks, such as tracking a touched object or maintaining both contact location and contact force, to more elaborate tasks, such as tracking an object's pose or tactile object exploration.
Exploiting methods known from image processing, we introduce robust feature-extraction methods to estimate the 2D contact position, the contact force, and the orientation of an object edge in contact with the sensor. The flexible control framework allows us to adapt the PID-type controller to a large range of different tasks by specifying a projection matrix that toggles individual control components on and off.
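The image-moment feature extraction and projection-based PID control law described above can be sketched as follows. This is a minimal illustration under our own assumptions, not the published implementation; the function names, the gain structure, and the moment-based edge-orientation estimate are all choices made for this sketch.

```python
import numpy as np

def tactile_features(P):
    """Extract contact features from a tactile pressure array P (e.g. 16x16).

    Returns (cx, cy): pressure-weighted contact centroid [taxel units],
            f:        total normal force proxy (sum of taxel pressures),
            theta:    orientation of a line/edge contact, estimated from
                      the central second-order image moments [rad].
    """
    P = np.asarray(P, dtype=float)
    f = P.sum()
    ys, xs = np.mgrid[0:P.shape[0], 0:P.shape[1]]
    cx = (xs * P).sum() / f
    cy = (ys * P).sum() / f
    # central second moments -> principal axis of the contact blob
    mu20 = (P * (xs - cx) ** 2).sum() / f
    mu02 = (P * (ys - cy) ** 2).sum() / f
    mu11 = (P * (xs - cx) * (ys - cy)).sum() / f
    theta = 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)
    return cx, cy, f, theta

def tactile_servo(error, S, Kp, Ki, Kd, state, dt=0.01):
    """PID-type control command; the diagonal projection/selection matrix S
    toggles individual task components (position, force, orientation) on/off."""
    integral, prev_error = state
    integral = integral + error * dt
    deriv = (error - prev_error) / dt
    u = S @ (Kp * error + Ki * integral + Kd * deriv)
    return u, (integral, error)
```

Zeroing a diagonal entry of S removes the corresponding error component from the command, which is how a single controller can cover tasks from pure force regulation to combined position-and-orientation tracking.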

We demonstrate and evaluate the capabilities of the proposed control framework in a series of experiments employing a 16×16 tactile sensor array attached to a KUKA LWR as a large fingertip, as well as on the iCub platform.
 

Experiment: 

  • Video link: 

www.youtube.com/watch

  • Related paper:

Qiang Li, Carsten Schürmann, Robert Haschke, Helge Ritter, "A Control Framework for Tactile Servoing", oral presentation at Robotics: Science and Systems (RSS) 2013

 

 

 

Topic 2: Visuo-Tactile Servoing Framework

 

Introduction:


We present a novel hierarchical control framework that unifies our previous work on tactile servoing with visual-servoing approaches to allow for robust manipulation and exploration of unknown objects, including – but not limited to – robust grasping, online grasp optimization, in-hand manipulation, and exploration of object surfaces. The control framework is divided into three layers: a joint-level position control layer, a tactile servoing control layer, and a high-level visual servoing control layer. While the middle layer provides “blind” surface exploration skills, maintaining desired contact patterns, the visual layer monitors and controls the actual object pose, providing high-level fingertip motion commands that are merged with the tactile-servoing control commands.
Because a high-spatial-resolution tactile array and the tactile servoing method are used, the robot end-effector can actively perform sliding, rolling, and twisting motions to improve the contact quality with the unknown object, relying on tactile feedback alone.
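One plausible way to merge the two command streams in such a hierarchy is to give the tactile layer priority along the contact normal (where it maintains the desired contact force) and let the visual layer act in the tangent plane. The sketch below makes that assumption explicit; it is an illustration of the layering idea, not the controller from the paper.

```python
import numpy as np

def merge_commands(v_tactile, v_visual, n):
    """Merge a high-level visual twist with a tactile-servoing twist (both
    6D: [vx, vy, vz, wx, wy, wz]).

    The tactile layer keeps authority along the contact normal n, so the
    visual linear velocity is projected onto the contact tangent plane
    before the two commands are summed. Angular parts are simply added.
    """
    v_tactile = np.asarray(v_tactile, dtype=float)
    v_visual = np.asarray(v_visual, dtype=float)
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    Pt = np.eye(3) - np.outer(n, n)          # tangent-plane projector
    lin = v_tactile[:3] + Pt @ v_visual[:3]  # tactile owns the normal axis
    ang = v_tactile[3:] + v_visual[3:]
    return np.concatenate([lin, ang])
```

With this split, a visual command pushing into the surface cannot override the tactile force regulation, while tangential sliding commands from the visual layer pass through unchanged.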

Our control method can be considered an alternative to vision-force shared control and vision-force-tactile control methods, which depend heavily on a 3D force/torque sensor to perform fine end-effector manipulation after contact occurs.

We illustrate the efficiency of the proposed framework using a series of manipulation actions performed with two KUKA LWR arms equipped with a tactile sensor array as a “sensitive fingertip”. The two considered objects are unknown to the robot, i.e. neither their shape nor their friction properties are available.
 

Experiment:

  • Video link:

 www.youtube.com/watch

  • Related paper:

Qiang Li, Robert Haschke, Helge Ritter, "A Visuo-Tactile Control Framework for Manipulation and Exploration of Unknown Objects", IEEE Humanoids 2015

 

 

Topic 3: Body Schema Learning Based on Visuo-Tactile Servoing

Introduction: 


Striving for autonomous self-exploration, in which robots learn their own body schema, i.e. body shape and appearance, kinematic and dynamic parameters, the association of tactile stimuli with specific body locations, etc., we developed a tactile-servoing feedback controller that allows a robot to continuously acquire self-touch information while sliding a fingertip across its own body. In this manner, one can quickly acquire a large amount of training data representing the body shape.

We compare three approaches to track the common contact point observed when one robot arm touches the other in a bimanual setup: pure feedforward control, relying solely on coarse CAD-based kinematics, performs worst; a purely feedback-based controller typically lags behind; and only the combination of both approaches yields satisfactory tracking results.
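The combined scheme can be sketched as a feedforward velocity predicted from the (coarse) model, corrected by proportional feedback on the measured contact-point error. This is a schematic illustration of the combination, with a hypothetical gain, not the controller evaluated in the paper.

```python
import numpy as np

def tracking_command(x_ref, x_meas, v_ff, Kp=2.0):
    """Combined contact-point tracking: the feedforward velocity v_ff,
    predicted from the coarse CAD-based kinematics, is corrected by
    proportional feedback on the measured contact-point error."""
    x_ref = np.asarray(x_ref, dtype=float)
    x_meas = np.asarray(x_meas, dtype=float)
    return np.asarray(v_ff, dtype=float) + Kp * (x_ref - x_meas)
```

The feedforward term keeps the tracker from lagging during fast motion, while the feedback term absorbs the error introduced by the inaccurate kinematic model.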

As a first, preliminary application, we use this self-touch capability to calibrate the closed kinematic chain formed by the two arms touching each other. The obtained homogeneous transform, describing the relative mounting pose of both arms, improves end-effector position estimates by an order of magnitude.
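Given self-touch contact points expressed in both arms' base frames, one standard way to estimate such a homogeneous transform is a least-squares rigid alignment (the Kabsch algorithm). The sketch below shows that approach under our own assumptions; the paper's calibration procedure may differ.

```python
import numpy as np

def estimate_mount_transform(p_left, p_right):
    """Least-squares rigid transform (Kabsch algorithm) mapping contact
    points expressed in the right arm's base frame onto the same points
    expressed in the left arm's base frame, i.e. an estimate of the
    relative mounting pose as a 4x4 homogeneous transform."""
    p_left = np.asarray(p_left, dtype=float)    # N x 3
    p_right = np.asarray(p_right, dtype=float)  # N x 3
    cl, cr = p_left.mean(axis=0), p_right.mean(axis=0)
    H = (p_right - cr).T @ (p_left - cl)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # guard against a reflection in the least-squares solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cl - R @ cr
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T
```

Each self-touch event contributes one point correspondence, so sliding a fingertip along the other arm quickly yields an over-determined, well-conditioned estimation problem.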
 
 
Experiment:
 
  • Video link:
  • Related paper:
Qiang Li, Robert Haschke, Helge Ritter, "Towards Body Schema Learning using Training Data Acquired by Continuous Self-touch", IEEE Humanoids 2015

 

 

Related People

Li, Qiang (Contact)
Haschke, Robert (Supervisor)
Ritter, Helge (Supervisor)