Neuroinformatics Group

What does a hand-over tell? - Comparative analysis of kinematic data

What relations exist between the properties of animals or people and their kinematic patterns? For example, can we tell who performed a hand-over, with which kind of object, and under which conditions, just by looking at the sequence of joint angles? We try to find answers to these questions by employing a 3D motion tracking system.

read more »
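
A rough sketch of how such a question could be posed computationally (not the project's actual pipeline): summarize each recorded joint-angle sequence with a few simple statistics and let a standard classifier try to recover who performed the trial. The data below are synthetic placeholders, and the feature set and classifier choice are illustrative assumptions.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def trial_features(trial):
    """Summarize one trial (T x J array of joint angles) with per-joint
    statistics: mean posture, range of motion, and mean angular speed."""
    speed = np.abs(np.diff(trial, axis=0)).mean(axis=0)
    return np.concatenate([trial.mean(axis=0),
                           trial.max(axis=0) - trial.min(axis=0),
                           speed])

# Placeholder data: 120 hand-over trials, 200 samples, 20 joint angles, 4 subjects.
rng = np.random.default_rng(0)
trials = rng.normal(size=(120, 200, 20))
subject = rng.integers(0, 4, size=120)   # label: who performed the hand-over

X = np.stack([trial_features(t) for t in trials])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, subject, cv=5).mean())   # ~0.25 (chance) on random data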

CITEC SummerSchool 2011 - Module "Manual Action and Intelligence for Robots"

Based on findings from cognitive motor science about how humans represent grasping and more complex movement sequences, the goal of this module is to introduce approaches to the robotic realization of those concepts.

read more »

Sensory-motor representations & error learning - experimental analysis of manual intelligence in first-order & virtual reality

One central issue in the cognitive control of movement is the compensation of errors and the learning processes that improve error-compensation mechanisms. This is especially true for very precise movements such as many manual actions. The present project combines conventional experimental methods and settings (first-order reality) with approaches from Virtual Reality and Augmented Reality to embed subjects in interaction loops in which the occurrence and perception of errors can be manipulated and studied in novel ways. In this way we hope to gain new clues about error-correction mechanisms, error-compensation learning, and their replication in technical systems such as robots.

read more »
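
One common way of manipulating perceived error in such interaction loops (assumed here purely as an illustration, not necessarily this project's paradigm) is to perturb the visual feedback of the tracked hand before it is rendered, e.g. by a rotation or gain change, so that the error the subject sees differs from the error actually made. A minimal sketch:

import numpy as np

def displayed_hand(hand_xy, rotation_deg=15.0, gain=1.0):
    """Map the tracked hand position onto the position rendered in the virtual
    scene. A rotation or gain mismatch makes the error the subject sees differ
    from the error the motor system actually produced."""
    a = np.radians(rotation_deg)
    R = np.array([[np.cos(a), -np.sin(a)],
                  [np.sin(a),  np.cos(a)]])
    return gain * (R @ np.asarray(hand_xy))

target = np.array([0.30, 0.00])      # reach target in the workspace plane (m)
hand = np.array([0.29, 0.01])        # tracked fingertip position (m)
seen = displayed_hand(hand)          # fingertip position shown to the subject
print("actual error:", np.linalg.norm(hand - target))
print("seen error:  ", np.linalg.norm(seen - target))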

From Cognitive Representation to Technical Synthesis of Manual Action

What insights can we gain from psychological measurements of biomechanical parameters and subjective judgements of manual actions (such as object grasping) about the structure of the underlying cognitive representations? In this project, we will bring together statistical methods (such as structure-dimensional analysis and principal component analysis) with connectionist approaches employing artificial neural networks to test different hypotheses about the cognitive structure of manual actions. A major goal will be to emulate and control grasping behavior for a broad range of objects in kinematic simulations and, as a longer-term objective, in real physics on a robot platform.

read more »
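
As an illustration of the statistical side of such an analysis, the sketch below runs a principal component analysis over placeholder joint-angle data and asks how many components are needed to capture most of the postural variance; the data and dimensions are assumptions, not project results.

import numpy as np
from sklearn.decomposition import PCA

# Placeholder data: final hand postures (20 joint angles) of 300 recorded grasps.
rng = np.random.default_rng(1)
postures = rng.normal(size=(300, 20))

pca = PCA()
scores = pca.fit_transform(postures)              # each grasp in component space
cumulative = np.cumsum(pca.explained_variance_ratio_)
print("components for 90% of postural variance:",
      int(np.searchsorted(cumulative, 0.90)) + 1)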

From action capture to a database of physics-based manual interaction

While language provides us with a concise code that captures much of the movement complexity of our mouth, we still lack a comparable representation for the movements of our hands. This project aims to create a database of human hand interaction patterns from a variety of multimodal data sources. An associated goal is to develop methods for clustering captured trajectory data into physics-based models of manual interaction. We hope that the resulting database can contribute to a better grounding of control strategies for anthropomorphic robot hands and develop a utility for robotics similar to the one the WordNet database has for linguistics.

read more »
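
A minimal sketch of the trajectory-clustering step, under the assumption that captured fingertip trajectories are resampled to a fixed length and grouped with k-means; the data are synthetic placeholders and the concrete method used in the project may differ.

import numpy as np
from sklearn.cluster import KMeans

def resample(traj, n=50):
    """Resample a (T x 3) fingertip trajectory to n points so trajectories of
    different durations become comparable fixed-length vectors."""
    t_old = np.linspace(0.0, 1.0, len(traj))
    t_new = np.linspace(0.0, 1.0, n)
    return np.column_stack([np.interp(t_new, t_old, traj[:, d]) for d in range(3)])

# Placeholder capture data: 80 trajectories of varying duration.
rng = np.random.default_rng(2)
trajectories = [rng.normal(size=(int(rng.integers(60, 120)), 3)).cumsum(axis=0)
                for _ in range(80)]

X = np.stack([resample(t).ravel() for t in trajectories])    # one row per trajectory
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(labels))                                   # cluster sizes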

Learning Control Behaviour within the Control Basis Framework

The Control Basis Framework (Grupen et al., 1998) is a powerful approach to closed-loop control. This project aims at providing a library implementing the Control Basis Framework idea and possibly extending it to concurrent execution. Additional research is planned to investigate how machines can learn to utilize the control affordances provided by synthesized controllers.

read more »
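
A minimal sketch of the core idea behind the framework: controllers descend potential functions, and a lower-priority controller is projected into the null space of a higher-priority one ("subject-to" composition), so it can never disturb it. The concrete task, gains, and numbers below are illustrative assumptions, not part of the planned library.

import numpy as np

def null_space_projector(J):
    """Projector onto the null space of task Jacobian J, i.e. I - J^+ J."""
    return np.eye(J.shape[1]) - np.linalg.pinv(J) @ J

def composite_step(x, gain=0.3):
    """One closed-loop update: a primary controller keeps the hand on the
    plane y = 0, while a secondary controller (move to x = 0.8 along the
    plane) acts only in the primary's null space."""
    J1 = np.array([[0.0, 1.0]])          # primary task: the y coordinate
    u1 = -np.array([0.0, x[1]])          # gradient descent on 0.5*y^2
    u2 = -np.array([x[0] - 0.8, 0.0])    # gradient descent on 0.5*(x0 - 0.8)^2
    return x + gain * (u1 + null_space_projector(J1) @ u2)

x = np.array([0.0, 0.5])
for _ in range(50):
    x = composite_step(x)
print(np.round(x, 3))   # converges to roughly [0.8, 0.0]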