Neuroinformatics Group

Universität Bielefeld, Technische Fakultät

Dextrous Manipulation

Over the last decades, researchers and engineers have made huge advances in constructing anthropomorphic robot hands, which have become more and more sophisticated. Together with these developments, we face the question of how to dexterously control such complex robots with up to 20 degrees of freedom distributed over up to five fingers and a wrist. Implementing fixed grasp and manipulation programs does not lead to satisfying results: it is very time-consuming on the one hand and, on the other, does not generalise to variations in the grasping or manipulation situation.

Dextrous Grasping

In the domain of dextrous grasping, most published approaches focus on one of the two most obvious aspects of the grasping task: either the object geometry or the tactile impressions during the grasp. Concerning the geometry aspect, several approaches have been presented that incorporate explicit object geometry models to calculate (optimal) contact points and plan grasp postures that realise them. Grasping approaches involving tactile information, on the other hand, do not plan beforehand, but close the fingers around the object solely based on the tactile feedback of the hand until stable object contact is detected.

In recent years, alternative ways of thinking about grasping have been presented that aim at finding less complex representations of the grasping problem. To this end, eigengrasps have been considered, as have lower-dimensional manifolds embedded in the hand posture space.

The Grasp Manifold

In the latter domain, we presented a new approach to dextrous robot grasping that combines the advantages of geometry-based and tactile-driven grasping by employing an implicit representation of grasping experience in a self-organising map (SOM). The SOM lattice is trained with previously recorded hand postures that led to successful grasps. In this manner, it forms a discrete approximation of a smooth Grasp Manifold representing the subspace of the hand joint angle space described by the training data and thus by the set of known grasp postures. Using tactile information to infer implicit knowledge about the object position and shape, the algorithm dynamically exploits the SOM to adapt the grasping motion to the actual situation: according to the observed finger contacts, the most suitable hand posture is selected from the grasp manifold represented by the SOM nodes' reference vectors.
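The idea above can be sketched in a few lines of Python. The sketch below is not the actual system: it uses synthetic "successful grasp postures" instead of recorded data, a plain 1D SOM with hand-picked learning parameters, and a simplified stand-in for the tactile matching step (comparing only the joint components of fingers assumed to be in contact). It merely illustrates how a trained SOM can serve as a discrete grasp manifold from which a full posture is selected given partial contact information.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for recorded successful grasp postures:
# 20 joint angles per posture, sampled around 3 prototype grasps.
prototypes = rng.uniform(0.0, 1.5, size=(3, 20))
postures = np.vstack([p + 0.05 * rng.standard_normal((50, 20)) for p in prototypes])

# Train a small 1D SOM lattice on the postures.
n_nodes = 10
W = postures[rng.choice(len(postures), n_nodes)]  # reference vectors
n_steps = 2000
for t in range(n_steps):
    x = postures[rng.integers(len(postures))]
    winner = np.argmin(np.linalg.norm(W - x, axis=1))
    lr = 0.5 * (1 - t / n_steps)                       # decaying learning rate
    sigma = max(2.0 * (1 - t / n_steps), 0.5)          # decaying neighbourhood
    h = np.exp(-((np.arange(n_nodes) - winner) ** 2) / (2 * sigma**2))
    W += lr * h[:, None] * (x - W)                     # pull nodes towards x

# Simplified grasp selection: suppose tactile sensing tells us which joints
# belong to fingers already touching the object (hypothetical indices), and
# we know their current angles.  Pick the node whose reference vector best
# matches those observed components, and adopt its full posture.
observed_idx = np.array([0, 1, 2, 3])        # joints of contacting fingers
observed = postures[0, observed_idx]         # their current angles
best = np.argmin(np.linalg.norm(W[:, observed_idx] - observed, axis=1))
target_posture = W[best]                     # full 20-DOF target posture
print(target_posture.shape)  # prints (20,)
```

Because the SOM nodes approximate the manifold of known grasps, matching only the observed components still yields a complete, plausible hand posture for the remaining joints.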

The Manipulation Manifold

Starting from this work on dextrous grasping, we are currently working on a related representation of manipulation movements, i.e. dextrous manipulation. In our work, we advocate a manifold representation of such movements recorded from human demonstration. The main idea is to construct manifolds embedded in the finger joint angle space which represent the subspace of hand postures associated with a specific manipulation movement. In contrast to our grasping approach, we thus aim at representing sequences of movements in the manifold instead of final grasp postures. In addition, we want to construct these manifolds such that specific movement parameters, and especially the advance in time, are explicitly represented by specific and distinct manifold dimensions. Once successfully generated, such a manifold allows for very easy and deliberate navigation within it.

Turning a Bottle Cap

For our initial experiments, we focus on the manipulation movement of turning a bottle cap. Here, we incorporate the advance in time and the cap radius as manipulation parameters. The training data consist of a set of finger joint angle vectors generated in a physics-based simulation using a data glove as input device. As an initial step towards this scenario, we presented the manifold construction using Unsupervised Kernel Regression (UKR) and a way of applying it for manipulation in a physics-based simulation. For a short demonstration, please visit the (external) movie page.
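To make the role of the two manipulation parameters concrete, the following sketch shows the forward mapping used in UKR, i.e. Nadaraya-Watson kernel regression from a latent space to the posture space. It is a simplification under stated assumptions: the training postures are synthetic (smooth functions of time and radius rather than data-glove recordings), the latent coordinates are fixed by construction, whereas full UKR also optimises them by minimising the reconstruction error, and the bandwidth is hand-picked. The point is only how fixing the radius coordinate and advancing the time coordinate navigates the manifold through one turning movement.

```python
import numpy as np

# Synthetic stand-in for recorded cap-turning postures: 20-DOF hand
# postures varying smoothly with the advance in time t and cap radius r.
T = 30
t = np.linspace(0.0, 1.0, T)
radii = np.array([0.02, 0.03, 0.04])                 # cap radii in metres
X = np.array([[ti, r] for r in radii for ti in t])   # latent coords (t, r)
Y = np.column_stack([np.sin(2 * np.pi * X[:, 0] + k) * (1 + 10 * X[:, 1])
                     for k in range(20)])            # postures, one row each

def ukr_map(x, X, Y, h=(0.05, 0.01)):
    """Nadaraya-Watson regression from latent space to posture space:
    f(x) = sum_i K(x - x_i) y_i / sum_j K(x - x_j), Gaussian kernel with
    per-dimension bandwidths h (time, radius)."""
    d2 = np.sum(((X - x) / np.asarray(h)) ** 2, axis=1)
    k = np.exp(-0.5 * d2)
    return (k @ Y) / np.sum(k)

# Navigate the manifold: hold the radius fixed and advance in time,
# producing the posture sequence of one turning movement for that cap.
trajectory = np.array([ukr_map(np.array([ti, 0.03]), X, Y)
                       for ti in np.linspace(0.0, 1.0, 10)])
print(trajectory.shape)  # prints (10, 20)
```

Because time and radius occupy distinct latent dimensions, the movement can be replayed at a different speed (resampling the time axis) or adapted to a different cap (shifting the radius coordinate) without touching the joint-space data.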

Related publications