Neuroinformatics Group

Universität Bielefeld, Technische Fakultät


From action capture to a database of physics-based manual interaction

While language provides us with a concise code that captures much of the movement complexity of our mouth, we still lack a comparable representation for the movement of our hands. This project aims to create a database of human hand-interaction patterns from a variety of multimodal data sources. An associated goal is to develop methods for clustering captured trajectory data into physics-based models of manual interaction. We hope that the resulting database can contribute to a better grounding of control strategies for anthropomorphic robot hands and offer robotics a utility similar to that of the WordNet database for linguistics.

read more »
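The clustering step mentioned above can be illustrated with a toy sketch: resample variable-length trajectories to a common number of points so they become comparable vectors, then group them with a naive k-means. Function names, the resampling scheme, and all parameters here are illustrative assumptions, not the project's actual pipeline.

```python
import numpy as np

def resample(traj, n=20):
    """Resample a (T, D) trajectory to n points by linear interpolation,
    so trajectories of different lengths become comparable vectors."""
    t = np.linspace(0.0, 1.0, traj.shape[0])
    tn = np.linspace(0.0, 1.0, n)
    return np.column_stack([np.interp(tn, t, traj[:, d]) for d in range(traj.shape[1])])

def cluster_trajectories(trajs, k=2, iters=50):
    """Naive k-means over resampled, flattened trajectories."""
    X = np.stack([resample(tr).ravel() for tr in trajs])
    centers = X[:k].copy()                       # naive init: first k trajectories
    for _ in range(iters):
        dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = dist.argmin(axis=1)             # assign to nearest center
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels
```

In a real pipeline the flat Euclidean metric would be replaced by something physics-aware, but the sketch shows how raw capture data of varying length can be mapped into a common space for grouping.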

Augmented Reality based Brain-Computer Interfaces

For a long time, Brain-Computer Interfaces (BCIs) were confined to acting as pure spelling devices that enabled paralyzed people to communicate by thought alone. Our current projects aim to extend the scope of these devices and develop novel techniques for brain-robot interaction. Successfully applying BCIs to robotic devices will have the tremendous advantage that users are no longer limited to pure communication tasks but can also manipulate their surroundings directly, simply by imagining actions.

read more »

Manipulating Paper

Manipulating paper is a rich domain of manual intelligence that we encounter in many daily tasks. The present project attempts to analyse and implement the "web" of visuo-motor coordination skills needed to endow an anthropomorphic robot hand with the ability to manipulate paper (and paper-like objects) in situations of increasing complexity. This includes aspects such as modeling interaction with compliant objects, action-based representation, and bimanual coordination to enable object transformations such as tearing and folding.

read more »

Gestalt Learning as a Basis for Adaptive Alignment

What principles enable rapid and adaptive alignment in coordination?

This project investigates Gestalt principles and their generalization from the perceptual domain into the action/cooperation domain as a model of adaptive alignment and its functional replication in human-robot cooperation. Starting from learning algorithms for dynamic Gestalt formation in layered recurrent networks (the Competitive Layer Model, CLM), we develop a hybrid, hierarchical architecture for adaptive alignment in cooperation that integrates elements of connectionist and symbol-based representations. We evaluate its performance in a human-robot cooperation scenario involving two anthropomorphic hands mounted on a bimanual robot platform.

read more »
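The grouping mechanism of a Competitive Layer Model can be sketched in a strongly simplified, hypothetical form: columnar competition forces each feature to commit its activity to one layer, while lateral compatibility pulls mutually compatible features into the same layer. The update rule and the constants `J`, `h`, and `eps` below are illustrative assumptions, not the model as used in the project.

```python
import numpy as np

def clm_step(x, f, h=1.0, J=2.0, eps=0.05):
    """One projected-gradient step of a simplified Competitive Layer Model.

    x : (R, L) array -- activity of R feature neurons in each of L layers
    f : (R, R) symmetric lateral compatibility between features

    Columnar competition (strength J) pushes each feature's summed activity
    toward h, while lateral support via f favors placing compatible
    features in the same layer -- this performs the grouping.
    """
    total = x.sum(axis=1, keepdims=True)      # per-feature activity over layers
    drive = J * (h - total) + f @ x           # competition + lateral support
    return np.maximum(0.0, x + eps * drive)   # enforce non-negative activities
```

Iterating this step on two mutually compatible features (positive off-diagonal `f`) drives both into whichever layer their initial activities slightly favored, while their activity in all other layers decays to zero.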

Co-evolution of neural and morphological development for grasping

The goal of this project is to investigate the principles underlying co-evolution of a body shape and its neural controller. As a specific model system, we consider a robot hand that is controlled by a neural network. In contrast to existing work, we focus on the genetic regulation of neural circuits and morphological development. We are particularly interested in the facilitatory potential of co-evolution for the emergence of complex new functions, the interplay between development and evolution, the response of different genetic architectures to changing environments, and the role of important boundary constraints such as wiring and tissue costs.

read more »

Learning Control Behaviour within the Control Basis Framework

The Control Basis Framework (Grupen et al., 1998) is a powerful approach to closed-loop control. This project aims to provide a library implementing the Control Basis Framework and possibly extending it to concurrent execution. Further research is planned to investigate how machines can learn to exploit the control affordances provided by synthesized controllers.

read more »
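The composition idea behind such frameworks can be sketched in a strongly simplified, hypothetical form: primitive controllers descend artificial potentials, and a subordinate controller runs "subject to" a superior one by acting only orthogonally to the superior's output. The `subject_to` helper and its scalar projection are simplifying assumptions for illustration, not the framework's actual API, which composes closed-loop controllers via nullspace projections of task Jacobians.

```python
import numpy as np

def grad_controller(grad_fn):
    """Primitive controller: descend the gradient of an artificial potential."""
    return lambda q: -grad_fn(q)

def subject_to(superior, subordinate):
    """Hypothetical 'subject-to' composition: the subordinate's output is
    projected orthogonal to the superior's, so it cannot undo progress on
    the superior objective (a strong simplification of the framework)."""
    def combined(q):
        u1 = superior(q)
        u2 = subordinate(q)
        n = np.linalg.norm(u1)
        if n > 1e-9:
            d = u1 / n
            u2 = u2 - d * (d @ u2)   # remove the component along u1
        return u1 + u2
    return combined
```

Composing a goal-reaching controller with a subordinate posture preference in this way still drives the state to the goal; the secondary objective only shapes the transient, which is the essential point of prioritized controller composition.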