Run Run Shaw Library, City University of Hong Kong

Please use this identifier to cite or link to this item: http://dspace.cityu.edu.hk/handle/2031/9555
Full metadata record
dc.contributor.author: Hon, Hing Ting [en_US]
dc.date.accessioned: 2023-03-15T10:12:06Z
dc.date.available: 2023-03-15T10:12:06Z
dc.date.issued: 2022 [en_US]
dc.identifier.other: 2022cshht782 [en_US]
dc.identifier.uri: http://dspace.cityu.edu.hk/handle/2031/9555
dc.description.abstract: In recent years, wearable technologies have stimulated the development of various Internet of Things (IoT) applications that assess user status through pervasive physiological data processing. However, existing touch-based smartwatches still suffer from two-handed operation and finger-occlusion problems, and the usability of smartwatch interfaces is limited by the tiny screen and by interaction approaches that are inconvenient and demand excessive physical effort. Hence, this project presents a gesture-based, touch-free interaction system that leverages the wrist-mounted nature of smartwatches, using the built-in accelerometer and gyroscope to capture arm, hand, and finger motion data for gesture recognition. The proposed system employs a deep learning model to classify the motion data and recognize single-handed gestures and in-air writing, letting users perform the watch's built-in operations on a small interface without external equipment. The project proposes a lightweight multi-head one-dimensional convolutional neural network architecture with a gesture motion detection mechanism, which interprets the different sensor data sources and performs robust gesture recognition locally on a standalone smartwatch. The system captures real-time hand motion from any starting point and recognizes a vocabulary of 15 gestures plus 5 in-air writing gestures. The project also constructed gestural models with different network configurations to compare and evaluate the performance of the proposed architecture. With appropriate data augmentation, the proposed model achieves a user-independent gesture recognition accuracy of 98.46%, the largest improvement among the compared network designs, and generalizes well to the designed gestures across different users. The project further demonstrates two working applications, a music player and a messaging application, built on the gestural interaction system to show its convenience and feasibility under different layout settings. Users can directly activate specific functions or make precise selections with single-handed gestures, achieving intuitive touch-free interaction on a tiny smartwatch interface. By eliminating the need for precise physical interaction when controlling smartwatch applications, the system preserves full interface visibility and accurate selection of layout elements. [en_US]
(A hedged sketch of the multi-head network named in this abstract follows the metadata record below.)
dc.rights: This work is protected by copyright. Reproduction or distribution of the work in any format is prohibited without written permission of the copyright owner. [en_US]
dc.rights: Access is restricted to CityU users. [en_US]
dc.title: Gestures-based Touch-Free smartwatch development with deep learning [en_US]
dc.contributor.department: Department of Computer Science [en_US]
dc.description.supervisor: Supervisor: Dr. Xu, Weitao; First Reader: Dr. Zhu, Kening; Second Reader: Prof. Liang, Weifa [en_US]
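
The abstract names a lightweight multi-head one-dimensional CNN over accelerometer and gyroscope streams, but this record does not publish the layer configuration. The sketch below is a minimal, hypothetical Keras rendering of such a model: the 100-sample window (assumed 2 s at 50 Hz), the filter counts, and the layer names are all assumptions; only the 20 output classes (15 gestures plus 5 in-air writings) come from the abstract.

# Hypothetical sketch of the multi-head 1D CNN described in the abstract.
# Window length, sampling rate, filter counts, and layer sizes are assumptions;
# the record does not publish the actual architecture.
from tensorflow.keras import layers, Model

WINDOW = 100       # assumed: 2 s of motion sampled at 50 Hz
NUM_CLASSES = 20   # per the abstract: 15 gestures + 5 in-air writings

def sensor_head(name):
    """One convolutional head per sensor stream (accelerometer or gyroscope)."""
    inp = layers.Input(shape=(WINDOW, 3), name=name)  # 3 axes per sensor
    x = layers.Conv1D(32, 5, activation="relu")(inp)
    x = layers.MaxPooling1D(2)(x)
    x = layers.Conv1D(64, 3, activation="relu")(x)
    x = layers.GlobalAveragePooling1D()(x)            # keeps the head lightweight
    return inp, x

acc_in, acc_feat = sensor_head("accelerometer")
gyr_in, gyr_feat = sensor_head("gyroscope")

# Fuse the per-sensor features, then classify the gesture.
merged = layers.concatenate([acc_feat, gyr_feat])
merged = layers.Dense(64, activation="relu")(merged)
out = layers.Dense(NUM_CLASSES, activation="softmax")(merged)

model = Model(inputs=[acc_in, gyr_in], outputs=out)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()

For on-watch inference, as the abstract describes, a model in this style would typically be converted to TensorFlow Lite, with a simple motion-energy threshold on the accelerometer deciding when a candidate gesture window begins; both details are assumptions rather than anything stated in this record.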
Appears in Collections: Computer Science - Undergraduate Final Year Projects

Files in This Item:
File: fulltext.html (148 B, HTML)
Items in Digital CityU Collections are protected by copyright, with all rights reserved, unless otherwise indicated.
