|Title:||Action Based Robot Controlling System|
|Authors:||Tong, Ka Po|
|Department:||Department of Computer Science|
|Supervisor:||Dr. Lam, Kam Yiu|
|Abstract:||Due to rapid advancements in the software and electronics industries, the development of non-industrial robots (referred to simply as robots in the remainder of this report) has received great interest in recent years. Since these robots can be used in many daily-living applications, e.g., home assistants and security robots, and most of their users are ordinary people, including the elderly, it is important to design a simple and efficient way to control the robots' operations. In this project, we have investigated the design and development of a control system called the Action Based Robot Controlling System (hereafter referred to as ABRCS) to control a robot to perform simple tasks, e.g., grabbing a specific object located within an indoor environment. In ABRCS, we adopt a tap-to-interact interface that allows the user to select "which" object and "what" action the robot should perform, using the camera embedded in a smartphone. To achieve action-based control, ABRCS is integrated with an Object Map Constructor that recognizes the location of the selected object within sight of the robot's camera. To improve accuracy, the Object Map Constructor uses the machine learning library TensorFlow to identify objects in images captured by a camera installed on the robot. The images and the features gathered from them are then analyzed and placed in a 3D data structure for comparison with the object images obtained through the smartphone. This allows the robot and the mobile phone to share the same object pool, so the robot knows which object the user wants to interact with.|
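The shared object pool described in the abstract can be sketched as follows. This is an illustrative assumption, not the project's actual implementation: detections from the robot's camera (label, confidence, and an estimated 3D position) are stored in a pool, and the label of the object the user tapped on the smartphone is matched against that pool to find the target. All names (`DetectedObject`, `find_target`) are hypothetical.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

# Hypothetical sketch of the "shared object pool" idea: the robot's
# detector (e.g., a TensorFlow model) fills the pool; the smartphone
# side supplies the label of the object the user tapped.

@dataclass
class DetectedObject:
    label: str                          # class name from the object detector
    confidence: float                   # detector confidence in [0, 1]
    position: Tuple[float, float, float]  # estimated (x, y, z) in the robot's frame

def find_target(pool: List[DetectedObject], tapped_label: str,
                min_confidence: float = 0.5) -> Optional[DetectedObject]:
    """Return the most confident detection matching the user's selection."""
    candidates = [d for d in pool
                  if d.label == tapped_label and d.confidence >= min_confidence]
    return max(candidates, key=lambda d: d.confidence) if candidates else None

# Example: the robot has detected two cups and a book; the user taps "cup".
pool = [
    DetectedObject("cup", 0.91, (0.4, 0.1, 0.8)),
    DetectedObject("book", 0.87, (1.2, -0.3, 0.8)),
    DetectedObject("cup", 0.62, (2.0, 0.5, 0.8)),
]
target = find_target(pool, "cup")  # picks the higher-confidence cup
```

In a full system, matching would compare image features (not just labels) between the smartphone photo and the robot's detections, but the pool-lookup structure is the same.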
|Appears in Collections:||Computer Science - Undergraduate Final Year Projects |
Items in Digital CityU Collections are protected by copyright, with all rights reserved, unless otherwise indicated.