Dataset Name: UNIPI Dataset (June 2014)
Research Groups: UNIPI
Hand Type: Human Hand
Data Type: Human Motion
Data Structure: Marker Coordinates (mm)
Data Format: .dat
Sampling Rate: >100 Hz (480 Hz)
Action Type: Free Space
Objects Type: No Object
Kin. Model # DoFs: >20 (24)
Equipment: Motion Capture System -> Phase Space
# of Actions/Hand Configurations: >20
# of Subjects: 1
Year: 2014
Dataset Information:
The dataset contains the Kapandji kinematic model for one subject. The file contains raw data acquired with the Phase Space optical tracking system. Data are organized in blocks: each block holds the marker coordinates for a given sample time. More information regarding the model can be found in the accompanying .pdf and .ppt files.
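Reading sketch (assumptions, not part of the dataset documentation): the exact .dat layout is described only in the accompanying files, so the snippet below assumes a plain-text file in which blocks (one per sample time) are separated by blank lines and each row holds the x, y, z coordinates of one marker in millimetres. The file name is hypothetical; adjust the parsing to the real format before use.

# Minimal sketch for reading marker blocks from the .dat file.
# Assumes whitespace-separated x, y, z values (mm), one marker per row,
# and a blank line between sample-time blocks.
import numpy as np

def load_marker_blocks(path):
    """Return a list of (n_markers, 3) arrays, one per sample time."""
    blocks, current = [], []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:                      # blank line ends a block
                if current:
                    blocks.append(np.array(current, dtype=float))
                    current = []
                continue
            current.append([float(v) for v in line.split()[:3]])
    if current:                               # last block (no trailing blank line)
        blocks.append(np.array(current, dtype=float))
    return blocks

# Example usage (hypothetical file name):
# frames = load_marker_blocks("unipi_kapandji_2014.dat")
# print(len(frames), frames[0].shape)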
How to cite:
[1] M. Gabiccini, G. Stillfried, H. Marino, M. Bianchi. A data-driven kinematic model of the human hand with soft-tissue artifact compensation mechanism for grasp synergy analysis. In: IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2013. Tokyo, Japan; In Press.
Additional References:
[2] G. Stillfried, U. Hillenbrand, M. Settles, and P. van der Smagt. MRI-based skeletal hand movement model. In R. Balaraman and V. Santos, editors, The human hand – a source of inspiration for robotic hands. Springer Tracts in Advanced Robotics, 2013. In print.
[3] M. Santello, M. Flanders, and J. F. Soechting. Postural hand synergies for tool use. The Journal of Neuroscience, 18(23):10105-10115, 1998.