Autonomous Robot Navigation System Research

Document Type: Research Paper

Subject Area: Engineering


The system achieves navigation by perceiving the environment through sensors and by storing images of the environments it has passed through before. In doing so, the system learns and is able to apply the acquired knowledge later. The sensors comprise proximity sensors and vision systems able to interpret images so that the appropriate actions can be taken. The system is able to distinguish between human beings and buildings, to identify whether objects are moving or stationary, and to estimate how long it will take to get close to an object. It is also taken into consideration that the environment contains obstacles both static and moving, and the robot is expected to avoid colliding with them. The various design procedures are explained in detail. Prior work in this area of robotics is also reviewed broadly, to illustrate how that knowledge has been instrumental in developing this project. Fuzzy logic, an area of artificial intelligence, has been put to deep use: the programming and running of the system are based on fuzzy logic with a small integration of semantic networks. These are basically used to represent knowledge in a way that makes implementation easy; with that representation, the programming becomes easier. The system has been made autonomous by deploying learning agents that are able to perceive the environment.
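As an illustration of the fuzzy-logic control the paper relies on, the sketch below maps an obstacle distance to a speed command through two triangular membership sets ("near" and "far"). The set centres, widths and output speeds are illustrative assumptions, not values taken from the paper.

```cpp
#include <algorithm>
#include <cmath>
#include <cassert>

// Triangular membership function: degree (0..1) to which x belongs to
// the fuzzy set centred at c with half-width w.
double triangular(double x, double c, double w) {
    return std::max(0.0, 1.0 - std::abs(x - c) / w);
}

// Minimal fuzzy speed controller: weight the "slow" and "fast" output
// centres by how strongly the distance is "near" or "far".
// Centres/widths below are made-up example numbers.
double fuzzySpeed(double distanceCm) {
    double near = triangular(distanceCm, 0.0, 90.0);    // fully "near" at 0 cm
    double far  = triangular(distanceCm, 120.0, 90.0);  // fully "far" at 120 cm
    const double slow = 0.0, fast = 100.0;              // crisp output centres
    double wsum = near + far;
    if (wsum == 0.0) return fast;                       // beyond all sets
    return (near * slow + far * fast) / wsum;           // weighted defuzzification
}
```

An obstacle right in front yields speed 0, a distant one full speed, and intermediate distances blend smoothly between the two rules.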


This is done by giving the system some standards. Information from the sensors is fed to the system and passes through the critic; the learning element then updates the performance element. The various components and the methodology that make up the design of the system are well covered in this paper. The components are discussed in detail to give a reader who is not conversant with them the background information needed to appreciate their use in this project; it is through understanding these components that their selection is justified. Finally, mapping, localization, path planning and perception are discussed in this paper.

Acknowledgments
Firstly, I give great thanks to the Almighty for giving me wisdom and strength to work on this project.

Contents
Statement of the problem


Environmental characteristics
2 Literature survey
3 Methodology and design
3. Tasks and subtasks
3. Design of the control system
3. Mechanical design of the moving components
3. Motion control
3. Perception, localization and cognition
3. Mapping
3. Path planning
3. Implementation
Conclusion
References

List of Figures
Figure 1: Control system of an autonomous navigation robotic system
Figure 2: Behaviour-based control illustration
Figure 3: Flowchart on how obstacles will be avoided by the robot

List of Tables
Table 1: Values of the distance to the landmark in cm
Table 2: Values of the angle of orientation to the landmark in degrees
Table 3: Arduino board hardware specification

1 Introduction
This will define the control systems that will be used and the mapping in place. With industrialization being key in most countries, replacing human labour with robots has been seen as highly economical by many industries (Saffiotti and Alessandro).


This is because industries are able to perform many processes in a given day without the parameters being exhausted. Moving materials on the factory floor has become so important that it requires heavy machinery, which can readily be provided by robots. In the motor-vehicle field, many people desire self-driven cars, which could reduce the incidence of accidents and also save on manpower (World Health Organization, Dept. of Violence). Research showed that more than 90% of these accidents were caused by human error; only 5% are documented as having been caused by defective vehicles. This brings the necessity of replacing human-driven vehicles with robot-assisted vehicles, since they are not affected by human error.


These errors usually take the form of drunk driving or simple carelessness. This makes it important to look at the various characteristics of the environment we are going to work in before the design. The environment may be structured or unstructured. In a structured environment everything is defined and always predictable; there are always rules on how things should be arranged. An unstructured environment is unpredictable, and the robot must use its sensors to determine what to do. The system was further upgraded to use landmark beacons, mainly known as "follow me" and "walking by gates". These two algorithms are specifically designed to work together to achieve a common goal.


The follow-me algorithm basically tries to align the tower and the robot body with the centre of the object it first recognized. This object should be recorded in the list the robot has been tracking in its memory. If the object is recognized as far away, the robot is made to approach it; if it is near, the robot is forced to roll away. The sensor values should be returned quickly, in real time, so that the robot's path can be adjusted. The advantage of this system is its ability to capture information with the ultrasonic sensors and determine which obstacles are static and which are dynamic, without having to depend on previously recorded data or knowledge (Grzonka, Slawomir and Grisetti).
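A minimal sketch of the follow-me alignment behaviour described above, assuming simple proportional control: steer toward the tracked object's horizontal offset from the image centre, and drive forward or backward to hold a target distance. The gains, the 80 cm setpoint and the `followMe` name are hypothetical, not from the paper.

```cpp
#include <cassert>

// Steering and drive command for a proportional "follow me" behaviour.
struct Command {
    double turn;   // positive = turn right
    double drive;  // positive = forward
};

// offsetPx: object's horizontal offset from image centre (pixels).
// distanceCm: measured distance to the object (cm).
Command followMe(double offsetPx, double distanceCm) {
    const double kTurn = 0.5;       // illustrative steering gain
    const double targetCm = 80.0;   // illustrative following distance
    const double kDrive = 1.0;      // illustrative drive gain
    Command c;
    c.turn  = kTurn * offsetPx;                  // centre the object
    c.drive = kDrive * (distanceCm - targetCm);  // too far: approach; too near: back off
    return c;
}
```

With the object centred at the target distance, both commands go to zero, which matches the "align, approach if far, roll away if near" description above.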


It will then in turn be able to select which obstacles to avoid while in motion. Smartphones have been used to some extent to develop navigation systems. In one such system, Bluetooth was used as a link between the smartphone and the robot under control. One piece of software handled the control, another the drive, and the last one the sensors (Mataric). The interface for the Windows platform was built using Visual Studio. A case-based reasoning (CBR) system was brought into existence to solve the problems associated with robot navigation, in this case heavily based on an unknown, semi-structured environment. To solve the issue, a CBR agent was integrated as one of the agents in the robot.
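The CBR agent mentioned above works by retrieving the stored case most similar to the current situation and reusing its action. A minimal nearest-neighbour retrieval sketch, with a made-up three-value situation descriptor (e.g. obstacle distances left/front/right); the types and action codes are illustrative assumptions.

```cpp
#include <vector>
#include <cstddef>
#include <cassert>

// A stored navigation case: a situation descriptor and the action
// that worked in that situation before.
struct Case {
    double situation[3];  // e.g. obstacle distances left/front/right (cm)
    int action;           // e.g. 0 = forward, 1 = turn left, 2 = turn right
};

// Retrieve the action of the stored case nearest to the query
// (squared Euclidean distance over the descriptor).
int retrieveAction(const std::vector<Case>& base, const double query[3]) {
    std::size_t best = 0;
    double bestDist = 1e300;
    for (std::size_t i = 0; i < base.size(); ++i) {
        double d = 0.0;
        for (int k = 0; k < 3; ++k) {
            double diff = base[i].situation[k] - query[k];
            d += diff * diff;
        }
        if (d < bestDist) { bestDist = d; best = i; }
    }
    return base[best].action;
}
```

A full CBR cycle would also revise and retain new cases; retrieval is the part that drives the navigation decision.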


They can basically be discussed in this proposal as follows.

Design of the control system
Here the inputs to the system have to be established along with the relevant outputs. It is in this section that the inputs are used to control the outputs. The inputs come from the sensors, while the outputs are realised through the actuators. The project is going to use a mixture of planning and reactive control, because there could be step functions that influence the overall output. With behaviour-based (also known as reactive) control, the robot is equipped with behaviours that are simple but numerous. Each behaviour is associated with its own sensors, which determine the sensor data to be evaluated.
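The scheme above, many simple behaviours plus a coordinator, can be sketched as priority-ordered (subsumption-style) arbitration: behaviours are checked in priority order and the first one that wants control wins. The `Behaviour` and `MotorCmd` types and the winner-takes-all rule are illustrative assumptions, not the paper's exact design.

```cpp
#include <vector>
#include <functional>
#include <cassert>

// Differential-drive motor command.
struct MotorCmd {
    double left;
    double right;
};

// A behaviour reads its own sensor data and either fills in a command
// (returning true) or declines control (returning false).
struct Behaviour {
    std::function<bool(MotorCmd&)> act;
};

// Coordination: scan behaviours from highest to lowest priority; the
// first behaviour that fires decides the motor command.
MotorCmd coordinate(const std::vector<Behaviour>& byPriority) {
    MotorCmd cmd{0.0, 0.0};            // default: stop
    for (const auto& b : byPriority)
        if (b.act(cmd)) return cmd;    // higher-priority behaviour subsumes the rest
    return cmd;
}
```

For example, an "avoid obstacle" behaviour placed before a "cruise" behaviour takes over the motors whenever its sensor condition fires, which resolves interactions without any behaviour knowing about the others.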


Coordination is used to resolve the interactions that exist among the behaviours. Gears, wheels, the robot leg and the robot arm are equally important to the design (Choset). The design should be made in such a way that their operation is swift.

Software design of the system
This will entail coming up with the software that will run the whole machine. Programming is essential since it lowers the need for complex circuits; some circuits are more easily implemented through programming. The library's use in this project is to enable capturing images with the camera (Arkin). The library also provides the capability to process these images and to draw maps, so as to extract specific characteristics that will help in the navigation of the robot.


Power supply
It is necessary to supply power to the motors and the other electric circuits. The various components require different ratings: some motors are rated at 12 V while others run at 5 V.

Components description
We look at the materials we are going to work with so that we can establish their requirements and avoid failures during their use. Knowing the power consumption of each component helps establish the most efficient way of working.

Arduino development board
This is an open-source microcontroller board made up of hardware and software components that can be programmed. Both the hardware and the software are licensed as open source under the GNU Lesser General Public License, which permits anyone to develop both (Mataric).


It is made up of different pins, defined as input/output, that may be connected to external circuitry. The gears change the direction of movement and also control the speed.

Circuit board and circuit elements
After finishing the design, it is very important to manufacture the boards that will ensure the embedded system works. Apart from the microcontroller, we need other components such as the resistors, capacitors and transistors that will operate the sensors. The output from the microcontroller is fed to the transistors via the resistors.

Block diagram and flowchart
Below, a flowchart illustrates the way the system will avoid obstacles. Once the system has learned, it will be able to identify where it is in relation to the elements in the environment.
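The flowchart's obstacle-avoidance logic can be sketched as a single decision step: keep driving while the ultrasonic range ahead is clear, otherwise turn toward the freer side. The 30 cm threshold and the action codes are assumptions for illustration, not values from the paper.

```cpp
#include <cassert>

// Possible outcomes of one avoidance decision.
enum Action { FORWARD, TURN_LEFT, TURN_RIGHT };

// One step of the obstacle-avoidance logic: front/left/right are
// ultrasonic ranges in cm. The clearance threshold is illustrative.
Action avoidStep(double frontCm, double leftCm, double rightCm) {
    const double safeCm = 30.0;             // assumed clearance threshold
    if (frontCm > safeCm) return FORWARD;   // path clear: keep going
    return (leftCm >= rightCm) ? TURN_LEFT  // blocked: turn toward freer side
                               : TURN_RIGHT;
}
```

Calling this in the robot's main loop and feeding the result to the motor driver reproduces the loop structure the flowchart describes.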


Since the system is to work in various environments with different characteristics, this learning will be important.

Microprocessors/microcontrollers
This is the brain of the robot; it is here that decisions are made for controlling the robot. All the sensors act as inputs to this part. Updates are made possible through the recognition of special features in the landmarks, probabilistic models and sensor data. It is through localization that the robot can determine what should be done next, which makes it key in the design of autonomous robots. A robot has to be given access to both relative and absolute measurements; with this information it can try to establish its location as accurately as possible (Negenborn). There are two scenarios: in the first, the robot does not have the location of the target; in the second, the robot has the location of the target.
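Combining relative and absolute measurements, as described above, can be sketched with a one-dimensional complementary filter: predict with the odometry increment (relative), then correct a fraction of the way toward the absolute fix (e.g. a landmark observation). The gain value and function name are illustrative assumptions.

```cpp
#include <cassert>

// One fusion step for a 1-D position estimate.
// estimate:    current position estimate
// odomDelta:   relative measurement (odometry increment since last step)
// absoluteFix: absolute measurement (e.g. position from a landmark)
// gain:        how strongly to trust the absolute fix (0..1, illustrative)
double fuse(double estimate, double odomDelta, double absoluteFix,
            double gain = 0.2) {
    double predicted = estimate + odomDelta;              // relative update
    return predicted + gain * (absoluteFix - predicted);  // absolute correction
}
```

Odometry alone drifts, and absolute fixes alone are noisy and intermittent; blending the two is the reason the text insists the robot needs access to both kinds of measurement.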


In the first case the robot is expected to navigate through the environment, identifying features as it tries to find the target goal. In the second case, the robot is given the target goal and moves towards it while identifying the landmark features that exist in the environment. Several factors determine the hardness of the mapping problem: the size of the environment in which the robot has to identify the landmarks; the noise that affects perception and actuation; and different locations appearing much alike to the robot.

Table 1: Values of the distance to the landmark in cm
Landmark position | Orientation measured in degrees | Absolute error in degrees | Percentage relative error
0 11 22 -22 2.

Table 2: Values of the angle of orientation to the landmark in degrees


Path planning
Path planning involves looking ahead at the possible outcomes of actions before the actions are performed. Here we try to find the possible path the robot will follow to reach the desired goal (Garcia, Porta and Montiel). When planning, much concern is placed on the complexity and size of the environment, which also dictate the cost. Together, these two forces enable the robot to avoid the obstacles successfully while remaining on course to attain the goal.

Implementation
The implementation of the autonomous robot navigation system involves coding the Arduino board and compiling the code with the software designed for Arduino programming. This board is very popular in the design of microcontroller projects since it is easy to use.
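The look-ahead search that path planning performs can be sketched as a breadth-first search over an occupancy grid: explore outward from the start, never entering an occupied cell, until the goal is reached. This is a generic planner for illustration, not the paper's specific method.

```cpp
#include <vector>
#include <queue>
#include <utility>
#include <cassert>

// Breadth-first search on a 4-connected occupancy grid. Cells with
// value 1 are obstacles. Returns the number of steps on a shortest
// obstacle-free path from start to goal, or -1 if unreachable.
int shortestPath(const std::vector<std::vector<int>>& grid,
                 std::pair<int,int> start, std::pair<int,int> goal) {
    int rows = grid.size(), cols = grid[0].size();
    std::vector<std::vector<int>> dist(rows, std::vector<int>(cols, -1));
    std::queue<std::pair<int,int>> q;
    dist[start.first][start.second] = 0;   // start assumed to be a free cell
    q.push(start);
    const int dr[] = {1, -1, 0, 0}, dc[] = {0, 0, 1, -1};
    while (!q.empty()) {
        auto [r, c] = q.front(); q.pop();
        if (r == goal.first && c == goal.second) return dist[r][c];
        for (int k = 0; k < 4; ++k) {
            int nr = r + dr[k], nc = c + dc[k];
            if (nr >= 0 && nr < rows && nc >= 0 && nc < cols &&
                grid[nr][nc] == 0 && dist[nr][nc] == -1) {
                dist[nr][nc] = dist[r][c] + 1;  // first visit is the shortest
                q.push({nr, nc});
            }
        }
    }
    return -1;  // goal unreachable
}
```

The environment size drives the cost directly here: the search visits at most every free cell once, which is why larger and more cluttered maps make planning more expensive, as the text notes.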


The board is built around a programmable microcontroller, which saves on circuitry; the microcontroller is then connected to other circuits that enable its operation. c which can run bigger motors. The desired transistor for the switch is the 2N2222.

Conclusion
In summary, there are various ways in which we can develop systems that are more autonomous and self-reliant; this depends largely on the technology we are using. An Arduino board has been used in this project as the microcontroller, but various microprocessors can be used to build autonomous systems through programming.

References
Maximum Lego NXT: Building Robots with Java Brains. Variant Press, 2007.
Bonin-Font, et al. "Visual navigation for mobile robots: A survey." Journal of Intelligent and Robotic Systems 53.
Grzonka, et al. "Towards a navigation system for autonomous indoor flying."


Robotics and Automation. Ed. IEEE.
Master's thesis. Utrecht: Utrecht University, 2003.
Pearlmutter and Barak. "Learning state space trajectories in recurrent neural networks." Neural Computation 1.
IEEE Transactions on Industrial Electronics 58.
World Health Organization, Dept. of Violence, et al. "Global status report on road safety: time for action."

Robot.beginTFT();
Robot.begin();

// draw "lg0.bmp" and "lg1.bmp" on the screen
Robot.displayLogos();

// draw init3.

    int keyPressed = Robot.keyboardRead();  // read the button values
    switch (keyPressed) {
      case BUTTON_LEFT:     // display previous picture
        if (--i < 1) i = NUM_PICS;
        return;
      case BUTTON_MIDDLE:   // do nothing
      case BUTTON_RIGHT:    // display next picture
        if (++i > NUM_PICS) i = 1;
        return;
      case BUTTON_UP:       // change mode
        changeMode(-1);
        return;
      case BUTTON_DOWN:     // change mode
        changeMode(1);
        return;
    }
  }
}

// if controlling by the compass
void compassControl(int change) {
  // rotate the robot to change the pictures
  while (true) {
    // read the value of the compass
    int oldV = Robot.compassRead();
    // get the change of angle
    int diff = Robot.compassRead() - oldV;
    if (diff > 180)
      diff -= 360;
    else if (diff < -180)
      diff += 360;
    if (abs(diff) > change) {
      if (++i > NUM_PICS) i = 1;
      return;
    }
    // change modes, if buttons are pressed
    int keyPressed = Robot.keyboardRead();
    switch (keyPressed) {
      case BUTTON_UP:
        changeMode(-1);
        return;
      case BUTTON_DOWN:
        changeMode(1);
        return;
    }
    delay(10);
  }
}

// Change the control mode and display it on the LCD
void changeMode(int changeDir) {
  // alternate modes
  mode += changeDir;
  if (mode < 0) {
    mode = 1;
  } else if (mode > 1) {
    mode = 0;
  }
  // display the mode on screen
  Robot.fill(255, 255, 255);
  Robot.
