
Please use this identifier to cite or link to this item: http://hdl.handle.net/1812/813

Title: Stereo vision and path finding for indoor/outdoor robot navigation: identification of object location
Authors: Wong, Mun Tat
Keywords: Stereo vision
Path finding
Robot navigation
Issue Date: Oct-2009
Publisher: University of Malaya
Abstract: In this project, we implemented stereo vision and path finding for indoor robot navigation. In the past, robots relied on basic sensors such as ultrasound, infrared, and laser for navigation, and they were not as intelligent as expected. Stereo vision and path finding can reduce the limitations of those earlier systems and support robust, real-time navigation, so we applied them to indoor robot navigation. Two separate e-puck robots are used in the system to provide stereo vision and to navigate within an arena enclosed by white walls. During navigation, they search for a path to the target while avoiding obstacles.

The Introduction discusses the overview, problem statement, objectives and aims, scope and limitations, project outcome, and timeline. The articles studied and the resulting review are presented in the Literature Review section, while the overview, tools, and scope of work are covered in the Methodology section. In the Identification of the Object Location stage of the scope of work, techniques such as color detection, HSV detection, and edge detection are used to identify the objects: RED (target), BLUE (obstacle), GREEN (robot), and WHITE (background). Identification is possible because each color has a distinct color value, and primary colors are used for the objects because they are pure and easier to identify. Relative distance is determined from the threshold count of white edge pixels of the obstacle and from the 15 image columns directly in front of the robot. Edge detection is used to find the white edge pixels of an object: if the count of white edge pixels in the upper 20 rows of the image exceeds a threshold, the robot is considered near the object; otherwise it is considered far from it. Color detection is also used to detect red pixels within the 15 columns in front of the robot: if the count of red pixels in those columns exceeds a threshold, the robot is near the RED target.

The System Design section presents the interface, the system flow chart, and the configuration and testing. System Testing discusses a simple navigation process and several scenarios. Problems and solutions, the aims and objectives achieved, the system's strengths and weaknesses, and future enhancements are discussed in the Discussion and System Evaluation section. Finally, a conclusion is drawn and a list of related references is attached.
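
The following sketch (not taken from the thesis itself) illustrates one way the color-based identification and proximity checks described in the abstract could be realized, assuming OpenCV in Python; the HSV ranges, thresholds, region sizes, and function names are illustrative assumptions rather than the author's actual values.

    import cv2
    import numpy as np

    # Hypothetical HSV ranges (OpenCV hue is 0-179) for the primary-color markers
    # named in the abstract: RED = target, BLUE = obstacle, GREEN = robot.
    HSV_RANGES = {
        "red":   ((0, 120, 70),   (10, 255, 255)),
        "green": ((40, 70, 70),   (80, 255, 255)),
        "blue":  ((100, 120, 70), (130, 255, 255)),
    }

    def classify_colors(frame_bgr):
        """Return one binary mask per color class using HSV thresholding."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        return {name: cv2.inRange(hsv, np.array(lo), np.array(hi))
                for name, (lo, hi) in HSV_RANGES.items()}

    def near_obstacle(frame_gray, top_rows=20, edge_pixel_threshold=200):
        """Edge-detect the frame and count white edge pixels in the upper rows;
        a count above the threshold is taken to mean the object is close."""
        edges = cv2.Canny(frame_gray, 100, 200)  # edge pixels appear as white (255)
        return int(np.count_nonzero(edges[:top_rows, :])) > edge_pixel_threshold

    def near_target(red_mask, front_cols=15, red_pixel_threshold=100):
        """Count red pixels in the columns assumed to lie directly in front of
        the robot (here, the central 15 columns of the image)."""
        mid = red_mask.shape[1] // 2
        strip = red_mask[:, mid - front_cols // 2 : mid + front_cols // 2 + 1]
        return int(np.count_nonzero(strip)) > red_pixel_threshold

In such a setup, the masks from classify_colors would feed the path-finding step, with near_obstacle and near_target supplying the coarse near/far cues described above.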
Description: Academic Exercise (B.C.S.), Faculty of Computer Science & Information Technology, University of Malaya, 2009.
URI: http://dspace.fsktm.um.edu.my/handle/1812/813
Appears in Collections:Academic Exercise (Bachelor of Computer Science)

Files in This Item:

File           Description  Size       Format
Chapter 4.pdf  Chapter 4    400.2 kB   Adobe PDF
Chapter 8.pdf  Chapter 8    224.49 kB  Adobe PDF
Chapter 6.pdf  Chapter 6    270.18 kB  Adobe PDF
Chapter 7.pdf  Chapter 7    221.64 kB  Adobe PDF
Chapter 2.pdf  Chapter 2    437.41 kB  Adobe PDF
Chapter 1.pdf  Chapter 1    500.74 kB  Adobe PDF
Chapter 3.pdf  Chapter 3    404.28 kB  Adobe PDF
Chapter 5.pdf  Chapter 5    3.21 MB    Adobe PDF


This item is protected by original copyright



