Neuroinformatics Group

Multimodal time series modeling and segmentation

In this project we investigate approaches for the unsupervised segmentation of interaction sequences based on multimodal data. The proposed procedure estimates segment borders across all modalities in a single pass.

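The project description does not spell out the segmentation algorithm itself. As a minimal illustrative sketch of the idea, the following assumes a single left-to-right pass of change-point detection over per-modality-normalised feature streams, so that a border must be supported jointly by all modalities; function names, the windowing scheme, and the threshold are hypothetical, not the group's actual method.

```python
# Illustrative sketch only: a single-pass change-point baseline over
# multimodal data, NOT the project's actual algorithm.
import numpy as np

def segment_multimodal(streams, window=20, threshold=3.0):
    """Estimate segment borders shared by all modalities.

    streams   : list of (T, d_m) arrays, one per modality, same length T
    window    : half-width of the comparison windows (in frames)
    threshold : a border is placed where the windowed mean shift exceeds
                `threshold` standard errors
    returns   : list of frame indices marking segment borders
    """
    # Normalise each modality and stack them into one feature matrix,
    # so that a border must be supported by the joint signal.
    normed = [(s - s.mean(axis=0)) / (s.std(axis=0) + 1e-8) for s in streams]
    x = np.hstack(normed)                      # shape (T, sum of d_m)

    borders = []
    t = window
    while t < len(x) - window:                 # single left-to-right pass
        left = x[t - window:t]
        right = x[t:t + window]
        # Pooled standard error of the difference of window means
        se = np.sqrt((left.var(axis=0) + right.var(axis=0)) / window) + 1e-8
        shift = np.abs(left.mean(axis=0) - right.mean(axis=0)) / se
        if shift.mean() > threshold:           # joint evidence across modalities
            borders.append(t)
            t += window                        # skip ahead: one border per region
        else:
            t += 1
    return borders

# Example: two synthetic modalities sharing a regime change at t = 100
rng = np.random.default_rng(0)
motion = np.concatenate([rng.normal(0, 1, (100, 3)), rng.normal(4, 1, (80, 3))])
audio  = np.concatenate([rng.normal(0, 1, (100, 2)), rng.normal(-3, 1, (80, 2))])
print(segment_multimodal([motion, audio]))     # one border near t = 100
```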

A Brain-Robot Interface for Controlling ASIMO

Acquiring profound knowledge of the cognitive processes underlying human-robot interaction is essential for better exploiting the measurable components in brain-robot interfaces. The better these processes are understood, the better the EEG components originating from them can be used. A systematic evaluation of these components in the context of human-robot interaction is still missing. It therefore appears worthwhile to take a closer, impartial look at what is actually happening at the cognitive level, as far as this can be determined from EEG signals.


Autonomous Exploration of Manual Interaction Space

We gradually increase our manual competence by exploring manual interaction spaces for many different kinds of objects. This is an active process that is very different from the passive perception of "samples". The availability of humanoid robot hands offers the opportunity to investigate different strategies for such active exploration in realistic settings. In the present project, these strategies are investigated from the perspective of "multimodal proprioception": correlating joint angles, partial contact information from touch sensors, and joint torques, as well as visual information about changes in finger and object position, in such a way as to make predictions about "useful aspects" for shaping the ongoing interaction.

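As a concrete, hedged illustration of "correlating modalities to make predictions", the sketch below fits a simple linear forward model that maps one time step of proprioceptive and visual features to the resulting object displacement. The feature layout, dimensions, and function names are hypothetical and the data is synthetic; the project's actual exploration strategies and predictive targets are not specified here.

```python
# Illustrative sketch, not the project's actual model: a linear forward model
# correlating proprioceptive and visual features gathered during exploration
# with the resulting object motion. All names and shapes are hypothetical.
import numpy as np

def make_features(joint_angles, joint_torques, tactile, visual_delta):
    """Concatenate one time step of multimodal proprioception into a vector."""
    return np.concatenate([joint_angles, joint_torques, tactile, visual_delta])

def fit_forward_model(X, y, ridge=1e-2):
    """Ridge regression: predict object displacement y from features X."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + ridge * np.eye(d), X.T @ y)

# Synthetic exploration log: 200 interaction steps on a toy hand
rng = np.random.default_rng(1)
steps = [make_features(rng.normal(size=20),   # joint angles
                       rng.normal(size=20),   # joint torques
                       rng.random(12),        # tactile contact pattern
                       rng.normal(size=6))    # visual finger/object motion
         for _ in range(200)]
X = np.stack(steps)
true_w = rng.normal(size=(X.shape[1], 3))
y = X @ true_w + 0.1 * rng.normal(size=(200, 3))   # object displacement (x, y, z)

w = fit_forward_model(X, y)
prediction = X[:1] @ w          # predicted displacement for the first step
print(prediction.shape)         # (1, 3)
```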

Co-evolution of neural and morphological development for grasping

The goal of this project is to investigate the principles underlying co-evolution of a body shape and its neural controller. As a specific model system, we consider a robot hand that is controlled by a neural network. In contrast to existing work, we focus on the genetic regulation of neural circuits and morphological development. Our interest is directed at a better understanding of the facilitatory potential of co-evolution for the emergence of complex new functions, the interplay between development and evolution, the response of different genetic architectures to changing environments, as well as the role of important boundary constraints, such as wiring and tissue costs.

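To make the notion of co-evolving a body and its controller concrete, here is a deliberately simplified sketch in which a single genome encodes both morphology parameters (finger lengths) and controller weights, and both are mutated and selected together. It only illustrates why neither part can be optimised in isolation; it does not model the genetic regulation of development that the project itself studies, and the fitness function, genome layout, and constants are invented for the example.

```python
# Minimal co-evolution sketch: one genome encodes both morphology parameters
# (finger lengths) and controller weights, evolved jointly. Illustration only.
import numpy as np

rng = np.random.default_rng(2)
N_FINGERS, N_WEIGHTS, POP, GENERATIONS = 3, 12, 30, 50

def fitness(genome):
    """Toy grasp score: morphology and controller must match to score well."""
    lengths = np.abs(genome[:N_FINGERS]) + 1e-6        # morphology genes
    weights = genome[N_FINGERS:]                        # controller genes
    closure = np.tanh(weights.reshape(N_FINGERS, -1).sum(axis=1))
    # Reward finger closures proportional to finger length, so neither the
    # body nor the controller can be optimised on its own.
    return -np.sum((closure - lengths / lengths.max()) ** 2)

population = [rng.normal(size=N_FINGERS + N_WEIGHTS) for _ in range(POP)]
for gen in range(GENERATIONS):
    scored = sorted(population, key=fitness, reverse=True)
    parents = scored[:POP // 3]                         # truncation selection
    population = [p + 0.1 * rng.normal(size=p.size)     # Gaussian mutation
                  for p in parents for _ in range(3)]
print("best fitness:", fitness(max(population, key=fitness)))
```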

NEATfields: Evolution of large neural networks

In recent decades, many researchers have used evolutionary algorithms to adapt the topology and connection weights of recurrent neural networks for various control tasks, and this has become a useful machine learning technique. Because handling large genomes is difficult, however, these evolved networks typically contain only a few neurons. If the genome contains a recipe for constructing the network instead of the network itself, it can be much smaller. We have developed a method that does exactly this and performs very well on a number of different problems.

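The following sketch illustrates the core "genome as recipe" idea in the simplest possible form: a handful of genome parameters describe one small module and how often to repeat it, and a development step expands this into a network with orders of magnitude more connections. It is not the NEATfields algorithm itself; the genome fields, the expansion rule, and the lateral-connection scheme are assumptions made for illustration.

```python
# Simplified illustration of indirect encoding: a compact genome describing
# one module plus a repetition recipe is "developed" into a large network.
# This is NOT the actual NEATfields method, only a sketch of the principle.
import numpy as np

def develop(genome):
    """Expand a compact genome into the full connection-weight matrix."""
    module = genome["module_weights"]             # small (k, k) weight matrix
    rows, cols = genome["field_size"]             # how often to repeat it
    k = module.shape[0]
    n = rows * cols * k                           # phenotype size >> genome size
    W = np.zeros((n, n))
    for r in range(rows):
        for c in range(cols):
            i = (r * cols + c) * k
            W[i:i + k, i:i + k] = module          # identical module at each field position
            if c + 1 < cols:                      # lateral link to the next module
                W[i + k, i + k - 1] = genome["lateral_weight"]
    return W

genome = {
    "module_weights": np.array([[0.0, 0.8], [-0.5, 0.1]]),  # 2-neuron module
    "field_size": (10, 10),                                  # repeated 100 times
    "lateral_weight": 0.3,
}
W = develop(genome)
print("genome parameters:", 4 + 1, "-> network weights:", W.size)  # 5 -> 40000
```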

Vision-based Grasping

Unlike most existing approaches to the grasp selection task for anthropomorphic robot hands, this vision-based project aims for a solution that does not depend on an a priori known 3D shape of the object. Instead, it uses a decomposition of the object view (obtained from mono or stereo cameras) into local, grasping-relevant shape primitives, whose optimal grasp type and approach direction are known or learned beforehand. Based on this decomposition, a list of possible grasps can be generated and ordered according to the anticipated overall grasp quality.

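To show what "generate and rank grasp candidates from primitives" could look like in code, the sketch below assumes each detected primitive already carries a pre-associated grasp type and approach direction and is scored by a toy quality heuristic. The data structure fields, the scoring formula, and all values are hypothetical and do not describe the project's actual pipeline.

```python
# Illustrative sketch of candidate generation and ranking from detected
# shape primitives; fields, values, and the quality heuristic are made up.
from dataclasses import dataclass

@dataclass
class Primitive:
    kind: str                  # e.g. "cylinder", "box", "sphere"
    grasp_type: str            # grasp pre-associated with this primitive kind
    approach: tuple            # approach direction in the camera frame
    graspable_width: float     # extent of the primitive along the grasp axis (m)
    visibility: float          # how well the primitive was observed (0..1)

def anticipated_quality(p: Primitive, hand_opening: float = 0.10) -> float:
    """Toy quality score: prefer well-observed primitives that fit the hand."""
    fit = max(0.0, 1.0 - abs(p.graspable_width - 0.5 * hand_opening) / hand_opening)
    return 0.6 * fit + 0.4 * p.visibility

def rank_grasps(primitives):
    """Generate one grasp candidate per primitive and order by quality."""
    candidates = [(anticipated_quality(p), p.grasp_type, p.approach) for p in primitives]
    return sorted(candidates, reverse=True)

# Example: primitives extracted from a decomposed object view (values made up)
detected = [
    Primitive("cylinder", "power grasp",     (0, 0, -1), 0.06, 0.9),
    Primitive("box",      "precision grasp", (1, 0, 0),  0.03, 0.7),
    Primitive("sphere",   "spherical grasp", (0, 1, 0),  0.12, 0.8),
]
for quality, grasp_type, approach in rank_grasps(detected):
    print(f"{quality:.2f}  {grasp_type:15s} approach={approach}")
```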