Please use this identifier to cite or link to this item:
http://dspace.cityu.edu.hk/handle/2031/9555
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Hon, Hing Ting | en_US |
dc.date.accessioned | 2023-03-15T10:12:06Z | - |
dc.date.available | 2023-03-15T10:12:06Z | - |
dc.date.issued | 2022 | en_US |
dc.identifier.other | 2022cshht782 | en_US |
dc.identifier.uri | http://dspace.cityu.edu.hk/handle/2031/9555 | - |
dc.description.abstract | In recent years, wearable technologies have driven the development of various Internet of Things (IoT) applications that monitor user status through pervasive physiological data processing. However, existing touch-based smartwatches occupy both hands and suffer from finger occlusion, and their usability is further limited by tiny screens and an interaction style that demands excessive physical effort. This project therefore presents a gesture-based, touch-free interaction system that exploits the wrist-mounted nature of the smartwatch, capturing arm, hand, and finger motion with the built-in accelerometer and gyroscope for gesture recognition. The system employs a deep learning model to classify the motion data and recognize single-handed gestures and in-air writing, letting users perform the usual operations on a small interface without external equipment. The project proposes a lightweight multi-head one-dimensional convolutional neural network with a gesture motion detection mechanism that interprets the different sensor data sources and performs robust gesture recognition locally on a standalone smartwatch. The system captures real-time hand motion from any starting point and recognizes a vocabulary of 15 gestures plus 5 in-air writing gestures. Gestural models built with alternative network configurations were also constructed to evaluate the proposed architecture against them. With appropriate data augmentation, the proposed model achieves a user-independent gesture recognition accuracy of 98.46%, the largest improvement among the network designs compared, and generalizes well to the designed gestures performed by different users. Two working applications, a music player and a messaging application, demonstrate the convenience and feasibility of the gestural interaction system under different layout settings. Users can directly activate specific functions or make precise selections with single-handed gestures, achieving intuitive touch-free interaction on a tiny smartwatch interface. By eliminating precise physical touch, the system preserves full interface visibility and accurate selection of layout elements when controlling smartwatch applications. (An illustrative sketch of such a multi-head network follows this record.) | en_US |
dc.rights | This work is protected by copyright. Reproduction or distribution of the work in any format is prohibited without written permission of the copyright owner. | en_US |
dc.rights | Access is restricted to CityU users. | en_US |
dc.title | Gesture-based touch-free smartwatch development with deep learning | en_US |
dc.contributor.department | Department of Computer Science | en_US |
dc.description.supervisor | Supervisor: Dr. Xu, Weitao; First Reader: Dr. Zhu, Kening; Second Reader: Prof. Liang, Weifa | en_US |
Appears in Collections: Computer Science - Undergraduate Final Year Projects
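The abstract describes a lightweight multi-head one-dimensional CNN in which the accelerometer and gyroscope streams each pass through their own convolutional head before fusion, plus a motion detection mechanism that lets recognition start from any point in the sensor stream. Below is a minimal PyTorch sketch of that idea, not the author's actual network: the window length, channel widths, classifier sizes, and the variance-threshold trigger are all assumptions introduced here for illustration.

```python
import torch
import torch.nn as nn

WINDOW = 128        # assumed samples per gesture window
NUM_CLASSES = 20    # 15 gesture vocabularies + 5 in-air writing gestures


class ConvHead(nn.Module):
    """Per-sensor head: two 1D conv blocks over one 3-axis stream."""

    def __init__(self, in_channels: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # global pooling keeps the head lightweight
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x).squeeze(-1)  # (batch, 64)


class MultiHeadGestureNet(nn.Module):
    """Fuse separate accelerometer and gyroscope heads, then classify."""

    def __init__(self, num_classes: int = NUM_CLASSES):
        super().__init__()
        self.accel_head = ConvHead()
        self.gyro_head = ConvHead()
        self.classifier = nn.Sequential(
            nn.Linear(128, 64),
            nn.ReLU(),
            nn.Dropout(0.3),
            nn.Linear(64, num_classes),
        )

    def forward(self, accel: torch.Tensor, gyro: torch.Tensor) -> torch.Tensor:
        # Each sensor stream goes through its own head; features are
        # concatenated before the shared classifier.
        fused = torch.cat([self.accel_head(accel), self.gyro_head(gyro)], dim=1)
        return self.classifier(fused)


def motion_detected(accel_window: torch.Tensor, threshold: float = 1.5) -> bool:
    """Assumed trigger: classify only when accelerometer variance across the
    window exceeds a threshold, so a gesture may begin at any point in the
    continuous stream rather than at fixed segment boundaries."""
    return accel_window.var(dim=-1).sum().item() > threshold


# Usage: batches shaped (batch, 3 axes, WINDOW samples) per sensor.
model = MultiHeadGestureNet()
logits = model(torch.randn(8, 3, WINDOW), torch.randn(8, 3, WINDOW))
print(logits.shape)  # torch.Size([8, 20])
```

Giving each sensor its own head lets the network learn filters matched to that stream's dynamics before fusion, which is the usual motivation for a multi-head design over stacking all six axes into a single convolution.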
Files in This Item:
File | Size | Format |
---|---|---|
fulltext.html | 148 B | HTML |
Items in Digital CityU Collections are protected by copyright, with all rights reserved, unless otherwise indicated.