
Organisation CITEC -- Center of Excellence Cognitive Interaction Technology, Bielefeld University
Projects MINDA, CORTESMA
Projects Description MINDA is creating an incrementally growing database of manual interactions to help put manual intelligence research on a firmer empirical basis. This involves the study of manual interactions in humans using a multi-sensing approach. The database contains geometry information, tactile sensor information, vision information and sound information. Using these multimodal information sources allows us to build models that can help robots carry out complex tasks of the kind that humans perform with ease.
CORTESMA investigated hand kinematics and mental representations of grasping movements directed towards real and virtual spherical objects systematically varying in size. Results suggest that grasping movements are influenced by object size at an early stage of the movement for both real and virtual objects. The analyses of mental representations (via SDA) and of motor synergies (via PCA) reveal a separation of the three smallest objects from the larger ones, pointing towards a conceptual influence on the grasping movement.
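As a rough illustration of the PCA-based synergy analysis mentioned above, the sketch below extracts principal components from a single joint-angle recording. The file name, CSV layout, and the 95% variance threshold are assumptions for illustration only, not the procedure used in the CORTESMA analysis.

```python
# Minimal sketch of a PCA-based motor-synergy analysis (illustrative only;
# the file name, array layout and the 95% variance threshold are assumptions).
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical (n_frames, 22) array of CyberGlove joint angles for one trial
joint_angles = np.loadtxt("grasp_trial.csv", delimiter=",")

pca = PCA()
scores = pca.fit_transform(joint_angles)            # project each frame onto the components
explained = np.cumsum(pca.explained_variance_ratio_)

# Number of "synergies" needed to explain 95% of the postural variance
n_synergies = int(np.searchsorted(explained, 0.95) + 1)
print(f"{n_synergies} components explain 95% of joint-angle variance")
```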
Title Kinematic data of grasping movements directed towards virtual and real objects
Version v1.0.0
Date 2011-07-20
Creators Jonathan Maycock, Bettina Bläsing, Till Bockemühl, Helge Ritter, Thomas Schack
Contributors -
License MINDA, CORTESMA: 'Kinematic data of grasping movements directed towards virtual
and real objects' Copyright (c) Jonathan Maycock, Bettina Bläsing, Till Bockemühl,
Helge Ritter and Thomas Schack http://opendata.cit-ec.de/projects/virtual-and-real-grasping
There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
The development of this database was supported by the Excellence Cluster EXC 277 Cognitive
Interaction Technology. The Excellence Cluster EXC 277 is a grant of the Deutsche
Forschungsgemeinschaft (DFG) in the context of the German Excellence Initiative. 'Kinematic data of grasping movements directed towards virtual and real objects' is made available under the Open Database License: http://opendatacommons.org/licenses/odbl/1.0/. Any rights in individual contents of the database are licensed under the Database Contents License: http://opendatacommons.org/licenses/dbcl/1.0/
Description Eleven right-handed subjects (age: 24-39 years, 4 women) participated in a series of three experiments. All subjects had normal or corrected-to-normal vision and had no known impairments related to arm or hand movement. All subjects gave written informed consent to be part of the study. The experiment was carried out according to the principles laid out in the 1964 Declaration of Helsinki. Subjects performed all three experiments in the same order, starting with Experiment 1, directly followed by Experiment 2, and then Experiment 3.
The experiments were carried out at the Manual Intelligence Lab, making use of its sophisticated multimodal set-up for investigating manual interaction (Maycock et al., 2010). During the data collection, the subjects stood in front of a table (with dimensions 210 x 130 x 100 cm). Subjects wore an Immersion CyberGlove II wireless data glove (Immersion Corp., San Jose, CA; data acquisition rate: 100 Hz; sensor resolution: <1°) on the right hand that allowed for the recording of whole hand kinematics (22 DOF). In front of the subject (at a distance of 40 cm), a holding device for spherical objects (golf tee) was positioned on the table. A laptop computer screen was positioned behind the holding device. A small round bowl (10 cm in diameter) located 40 cm to the right of the holding device served as target for placing the objects. A 14-camera Vicon digital optical motion capture system (Vicon, Los Angeles, CA) mounted around the table was used to monitor the trajectories of the hand movements via three retro-reflective markers placed on the back of the data glove.
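The three retro-reflective markers define a rigid body on the back of the glove, from which a hand position and orientation can be derived. The sketch below shows one way to do this for a single frame; the marker ordering, axis convention, and example coordinates are illustrative assumptions, not part of the recorded data format.

```python
# Minimal sketch: derive a hand position and orientation frame from the three
# back-of-glove markers (marker ordering and axis convention are assumptions).
import numpy as np

def hand_frame(m1, m2, m3):
    """Return (origin, 3x3 rotation matrix) for one frame of three markers."""
    m1, m2, m3 = (np.asarray(m, dtype=float) for m in (m1, m2, m3))
    origin = (m1 + m2 + m3) / 3.0          # centroid as hand position
    x = m2 - m1
    x /= np.linalg.norm(x)                 # first in-plane axis
    n = np.cross(x, m3 - m1)
    n /= np.linalg.norm(n)                 # normal of the marker plane
    y = np.cross(n, x)                     # completes a right-handed frame
    return origin, np.column_stack((x, y, n))

# Example call with made-up marker coordinates in mm (the Vicon default unit)
pos, rot = hand_frame([0, 0, 0], [80, 0, 0], [40, 60, 0])
```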
Keywords manual action, grasping, hand kinematics, virtual objects, motor synergies, mental representation of movement
Structure Time series of joint angles and hand positions. Note that the CyberGlove captures at approximately 90 Hz and the Vicon system at exactly 200 Hz.
Format XML, CSV, C3D
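Because the CyberGlove and Vicon streams are recorded at different rates (about 90 Hz vs. exactly 200 Hz), joint analyses typically require resampling onto a common timeline. The sketch below does this with linear interpolation; the file names and column layout (first column a timestamp in seconds) are assumptions for illustration, not a documented schema of the released files.

```python
# Minimal sketch: resample the ~90 Hz CyberGlove joint angles onto the 200 Hz
# Vicon timeline via linear interpolation. File names and column layout are
# assumptions, not a documented schema.
import numpy as np

glove = np.loadtxt("cyberglove_trial.csv", delimiter=",")  # hypothetical: t, 22 joint angles
vicon = np.loadtxt("vicon_trial.csv", delimiter=",")       # hypothetical: t, x, y, z hand position

t_glove, angles = glove[:, 0], glove[:, 1:]
t_vicon = vicon[:, 0]

# Interpolate each joint-angle channel at the Vicon timestamps
angles_200hz = np.column_stack(
    [np.interp(t_vicon, t_glove, angles[:, j]) for j in range(angles.shape[1])]
)

aligned = np.hstack((vicon, angles_200hz))  # one row per 200 Hz frame
```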
Relation Maycock J, Bläsing B, Bockemühl T, Ritter H, Schack T (2010) Motor synergies and object representations in virtual and real grasping. In: 1st International Conference on Applied Bionics and Biomechanics (ICABB). Venice, Italy: IEEE.
Acknowledgements The development of this database was supported by the Excellence Cluster EXC 277 Cognitive Interaction Technology. The Excellence Cluster EXC 277 is a grant of the Deutsche Forschungsgemeinschaft (DFG) in the context of the German Excellence Initiative.

KinematicGraspingMovements.xml - XML-based description (6.782 KB) Florian Lier, 2011-09-02 14:05

KinematicGraspingMovements.xml - V2 (6.782 KB) Florian Lier, 2013-02-04 16:57