|Title: ||Emulating human perception of motion similarity|
|Authors: ||Tang, Jeff Kai Tai (鄧啟泰)|
Shum, Hubert P. H.
|Department: ||Department of Computer Science (Tang, K. T.; Leung, H.); School of Informatics, University of Edinburgh (Komura, Taku; Shum, P. H.)|
|Issue Date: ||Sep-2008|
|Award: ||Won the Best Paper (Runner-up) Award at the International Conference on Computer Animation and Social Agents 2008 (CASA 2008), organized by the Graduate School of Culture Technology, Korea Advanced Institute of Science and Technology (KAIST) and the Computer Graphics Society|
|Subjects: ||3D human motion similarity|
|Abstract: ||Evaluating the similarity of motions is useful for motion retrieval, motion blending, and performance analysis of dancers and athletes. The Euclidean distance between corresponding joints has been widely adopted for measuring the similarity of postures, and hence of motions. However, such a measure does not necessarily conform to the human perception of motion similarity. In this paper, we propose a new similarity measure based on machine learning techniques. We make use of the results of questionnaires from subjects answering whether arbitrary pairs of motions appear similar or not. Using the relative distances between the joints as the basic features, we train the system to compute the similarity of an arbitrary pair of motions. Experimental results show that our method outperforms methods based on the Euclidean distance between corresponding joints. Our method is applicable to content-based retrieval of human motion for large-scale database systems. It is also applicable to e-Learning systems that automatically evaluate the performance of dancers and athletes by comparing the subjects' motions with those of experts.|
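To make the abstract's distinction concrete, the sketch below contrasts the baseline posture measure (mean Euclidean distance between corresponding joints) with the kind of relative inter-joint distance features the abstract names as the basic input to the learned measure. This is a minimal illustration, not the authors' trained model; the function names and the toy 4-joint postures are assumptions for the example.

```python
import numpy as np
from itertools import combinations

def euclidean_posture_distance(p, q):
    # Baseline measure: mean Euclidean distance between corresponding
    # joints of two postures (each an (n_joints, 3) array).
    return float(np.mean(np.linalg.norm(p - q, axis=1)))

def relative_distance_features(p):
    # Basic features named in the abstract: the pairwise distances
    # between all joints within a single posture.
    n = len(p)
    return np.array([np.linalg.norm(p[i] - p[j])
                     for i, j in combinations(range(n), 2)])

# Toy postures: 4 joints in 3-D; b is a pure translation of a.
rng = np.random.default_rng(0)
a = rng.standard_normal((4, 3))
b = a + 0.1

# The baseline reports a nonzero distance for the translated posture,
# while the relative-distance features are unchanged by translation.
print(euclidean_posture_distance(a, b))
print(np.allclose(relative_distance_features(a),
                  relative_distance_features(b)))
```

The translation example shows one reason joint-wise Euclidean distance can disagree with perception: the same pose performed in a different place scores as dissimilar, whereas relative inter-joint distances describe the pose itself and would feed a learned similarity function unchanged.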
|Remarks: ||The Institutional Repository only contains the news announcement of this item|
|Appears in Collections:||Student Works With External Awards|
Items in CityU IR are protected by copyright, with all rights reserved, unless otherwise indicated.