CityU Institutional Repository

Please use this identifier to cite or link to this item: http://hdl.handle.net/2031/3872

Title: Automatic generation of hand action codes (scripts) from key frame images for hand gesture synthesis and animation
Other Titles: Cong guan jian zheng zhong zi dong chan sheng shou bu dong zuo bian ma (wen ben) bing jin xing shou shi he cheng he dong hua xian shi
從關鍵幀中自動産生手部動作編碼 (文本) 並進行手勢合成和動畫顯示
Authors: Lam, Maria Sau Wai (林秀慧)
Department: Dept. of Computer Science
Degree: Doctor of Philosophy
Issue Date: 2002
Publisher: City University of Hong Kong
Subjects: Computer animation
Human locomotion -- Computer simulation
Notes: CityU Call Number: QA76.9.C65 L36 2002
Includes bibliographical references (leaves 174-184)
Thesis (Ph.D.)--City University of Hong Kong, 2002
x, 184 leaves : ill. (some col.) ; 30 cm.
Type: Thesis
Abstract: Hand gesture interpretation and hand motion animation are receiving increasing attention in both the computer vision and computer graphics communities because of their potential applications in advanced man-machine interfaces and in content production for multimedia entertainment. The human hand is capable of a large variety of functions, from pointing at and grasping objects of various shapes to tactile exploration, expressing feelings, and communicating with others. In this work, visualizing natural hand movement involves three major processes: motion data acquisition and analysis, conversion of the data into motion parameters, and motion generation by a synthetic hand model. For hand modeling and motion animation, we extended a previous work that is based on human hand anatomy and controlled by a hand gesture coding system, the Hand Action Coding System (HACS), which codifies hand motion in terms of hand muscle action units (HAUs). The HACS, together with the anatomy-based hand model, allows complex sequences of hand gestures to be animated from high-level textual scripts. Much time and human effort can be saved if the system can generate the HAU script for hand motion animation directly from an analysis of real hand motion. This is the objective of the project, and this thesis presents the techniques developed to achieve it. After a review of the anatomy-based approach, the hand model was refined and made more realistic. To show that the HAU list can be identified from the orientations of the bone segments in a hand configuration, and that the anatomy-based approach simulates the in-between animation sequence naturally, an interactive solution is developed that illustrates the techniques for generating an HAU list from target gesture images. To build a fully automatic gesture input and analysis system, hand motion tracking, visual feature extraction, feature occlusion, and other image-processing problems then have to be addressed. Because the human hand has no naturally distinctive features, color markers are attached to a gloved hand, rather than using a bare hand, to provide the necessary features. Techniques and algorithms are then used to select the key frames, detect the color landmarks, and determine the HAU script through analysis of the spatial and anatomical constraint relationships of these landmarks. Other contributions of the proposed approach are that (1) there is no limitation on the input gestures: images of any arbitrary gesture can be taken as input; (2) the occlusion problem is resolved, or reduced to a minimum, through analysis of the spatial and anatomical constraint relationships of the visible landmarks; (3) no high-speed motion tracker is needed, since the system can generate and animate the intermediate motion from only the key gesture frames appearing in the video sequence of hand motion; (4) any section of the video sequence of hand motion can be extracted and reused without redoing the whole motion capture process; and (5) a gesture database can be built incrementally and automatically. As a by-product, the gesture analysis results can potentially be applied to inferring which muscle or group of muscles is involved in a gesture.
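The abstract only summarizes these steps; the thesis's actual algorithms are not reproduced in this record. As a rough illustration of what the color-landmark detection stage could look like, the following Python sketch thresholds hypothetical marker colors in HSV space with OpenCV and returns the centroid of each visible marker. The marker names, HSV ranges, and area threshold are assumptions made for illustration, not values or methods taken from the thesis.

```python
import cv2
import numpy as np

# Hypothetical HSV ranges for coloured glove markers (assumed, not from the thesis).
MARKER_RANGES = {
    "thumb_tip":  ((0, 120, 80),   (10, 255, 255)),   # red-ish marker
    "index_tip":  ((50, 120, 80),  (70, 255, 255)),   # green-ish marker
    "middle_tip": ((100, 120, 80), (130, 255, 255)),  # blue-ish marker
}

def detect_landmarks(frame_bgr):
    """Return {marker_name: (x, y)} centroids of the visible colour markers.

    Markers whose blobs are too small (for example, occluded ones) are simply
    omitted; in the approach described above, missing landmarks are handled by
    analysing the spatial and anatomical constraints among the visible ones.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    landmarks = {}
    for name, (lo, hi) in MARKER_RANGES.items():
        mask = cv2.inRange(hsv, np.array(lo, np.uint8), np.array(hi, np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            continue                       # marker not found in this key frame
        blob = max(contours, key=cv2.contourArea)
        if cv2.contourArea(blob) < 30:     # likely noise or a mostly occluded marker
            continue
        m = cv2.moments(blob)
        landmarks[name] = (m["m10"] / m["m00"], m["m01"] / m["m00"])
    return landmarks

# Example usage on a single key frame image:
#   landmarks = detect_landmarks(cv2.imread("key_frame.png"))
# A later stage would map the landmark geometry to hand muscle action units and
# emit the corresponding HACS script entries.
```

In the pipeline the abstract describes, such per-frame detection would run only on the selected key frames; the anatomy-based hand model then generates the in-between motion, which is why no high-speed motion tracker is required.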
Online Catalog Link: http://lib.cityu.edu.hk/record=b1761109
Appears in Collections:CS - Doctor of Philosophy

Files in This Item:

File            Size    Format
fulltext.html   157 B   HTML
abstract.html   157 B   HTML

Items in CityU IR are protected by copyright, with all rights reserved, unless otherwise indicated.

 
