Dataset Name: UNIPI Dataset (September 2013)
Research Groups: UNIPI
Hand Type: Human Hand
Data Type: Human Motion, Human Postures
Data Structure: Joint Angles (rad)
Data Format: .mat
Action Type: Reach and Grasp
Sampling Rate: >100 Hz (480 Hz)
Objects Type: Imagined Objects
Kin. Model # DoFs: >20 (24)
Equipment: Motion Capture System -> Phase Space
# of Actions/Hand Configurations: >20 (3694)
# of Subjects: 1
Year: 2013
Dataset Information:
The data contain the postural angles of the hand of one female right-handed subject performing grasping actions on a number of imagined objects. The images of the objects were displayed to the subject and are included in the submission material. The subject, comfortably seated with the flat hand on her leg, was asked to move the hand as if to grasp the imagined object in front of her and then to return to the rest position. The experimental protocol, markerization and angle-estimation techniques are described in [1]. The submitted material is organised as follows:
– the folder \images_object\ contains the images of the objects displayed to the subject;
– in the folder \object_ordered\, the cell \files_sorted\ contains the order in which the objects were presented to the subject;
– the folder \postural_angles\ contains the estimated joint angles of the human hand, filtered with a moving-average filter. These data comprise the grasp postures related to the imagined objects, presented in the order described in the folder \object_ordered\, and also include the pre-grasp phases and the movements from and back to the rest position (flat hand on the subject's leg);
– the folder \Additional_Material_Video\ contains a video of the reconstructed postures.
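A minimal sketch of how the postural-angle files might be loaded and inspected in Python, assuming each .mat file in \postural_angles\ stores a samples-by-DoF array of joint angles in radians sampled at 480 Hz (24 DoFs, per the metadata above). The file name, variable handling and the moving-average window length below are placeholders, not documented parts of the dataset:

import numpy as np
from scipy.io import loadmat

# Hypothetical path: inspect the keys of the loaded dictionary before indexing,
# since the actual file and variable names are not documented here.
mat = loadmat("postural_angles/trial_001.mat")
var_names = [k for k in mat if not k.startswith("__")]
print("variables stored in the file:", var_names)

# Assumption: each trial stores an (n_samples x 24) array of joint angles in radians.
angles = mat[var_names[0]]
n_samples, n_dofs = angles.shape
print(f"{n_dofs} DoFs, {n_samples} samples, ~{n_samples / 480.0:.2f} s of motion")

# Moving-average smoothing of the kind described above (the window length used
# by the authors is not stated; 15 samples is an arbitrary, illustrative choice).
window = 15
kernel = np.ones(window) / window
smoothed = np.apply_along_axis(
    lambda col: np.convolve(col, kernel, mode="same"), 0, angles
)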
How to Cite:
[1] M. Gabiccini, G. Stillfried, H. Marino, and M. Bianchi. A data-driven kinematic model of the human hand with soft-tissue artifact compensation mechanism for grasp synergy analysis. In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2013), Tokyo, Japan, 2013. In press.
Additional References:
[2] G. Stillfried, U. Hillenbrand, M. Settles, and P. van der Smagt. MRI-based skeletal hand movement model. In R. Balaraman and V. Santos, editors, The human hand – a source of inspiration for robotic hands. Springer Tracts in Advanced Robotics, 2013. In press.
[3] M. Santello, M. Flanders, and J. F. Soechting. Postural hand synergies for tool use. The Journal of Neuroscience, 18(23):10105–10115, 1998.
Related Video: