
Autonomous SLAM Robot

Project Report MECHENG 706

Group 17

Shuchen Huang

Cathryne Herrera

Huijun Wang

Yi Lin Huang


Executive Summary

This report discusses the development of a simultaneous localization and mapping robot that navigates itself around a room for the purpose of automatic vacuum cleaning. Specifications include that it must avoid obstacles and walls, cover the maximum area in the minimum amount of time, and produce a map of the environment and its path. A robot platform already exists that the project builds upon. It consists of four Mecanum wheels providing omnidirectional movement, and is controlled by an Arduino Mega microcontroller with a sensor shield add-on.

The strategy used to accomplish this task was to have the robot move back and forth along the length of the room, stepping a small distance along the width at each end. A quick way to describe this is that it produces a movement pattern identical to that found on foil strain gauges. The method is intuitive yet effective. Starting off in an unknown position, the robot finds and aligns to a wall. From there it traverses the perimeter of the room to determine its dimensions and arrive at a corner facing along the length. It then proceeds with room coverage using the method described.

Three IR sensors mounted on the front provide complete coverage of potential obstacle positions. The robot avoids obstacles by strafing to the side to move around them, during which it needs to monitor the side distances so as to not strafe into something else. It does this with an additional sensor on one side and by rotating one of its front sensors, mounted on a servo motor, to face the other side.

An MPU-9150 chip containing an accelerometer and gyroscope is used to obtain yaw measurements to keep the robot at desired angles. Data from the two sensors are fused with the Kalman filter. The MPU’s magnetometer is not used because of local magnetic fields on the robot platform. A proportional controller continually adjusts the robot’s angle and is very effective (errors only arise from gyroscope errors).

The robot uses dead reckoning and the MPU’s yaw readings to estimate its position. It encodes the mapping information with prefixes and sends them to the Bluetooth serial port. An app written in C# WPF extracts mapping information from the serial port and graphically displays the room, obstacles, and the robot’s path.

At the end of the project the robot could consistently achieve over 80% room coverage in under 2 minutes, and produce a map displaying all the features with a reasonable degree of accuracy.


Table of Contents

Executive Summary
1.0 Introduction
2.0 Problem Description and Project Specifications
  2.1 Problem Description
  2.2 Project Specifications
3.0 Systems Design and Integration
  3.1 IR and Sonar Sensor
  3.2 MPU
  3.3 Motion Control
  3.4 State Diagram
  3.5 Initialisation
  3.6 Field Detection
  3.7 Path of Room Coverage
  3.8 Obstacle Avoidance
  3.9 Localisation and Mapping
4.0 Testing and Results
  4.1 Sensor Testing
  4.2 Field Detection Testing
  4.3 Room Coverage Testing
  4.4 Obstacle Avoidance Testing
5.0 Discussion and Future Improvements
6.0 Conclusions
7.0 References
Appendix A
Appendix B
Appendix C


Table of Figures

Figure 1 Sensor Placement 1
Figure 2 Sensor Placement Concept 2
Figure 3 Final Sensor Placement
Figure 4 Servomotor and Sensor 3
Figure 5 Capacitor filter
Figure 6 Angle control effort on wheel speeds for clockwise difference
Figure 7 Angle control effort on wheel speeds for counter-clockwise difference
Figure 8 Overall State Diagram
Figure 9 Algorithm for initialisation
Figure 10 Robot rotating anti-clockwise when right distance further than left
Figure 11 Algorithm for field detection
Figure 12 Zigzag path
Figure 13 Spiral path
Figure 14 Algorithm for room coverage
Figure 15 Path of movement for room coverage
Figure 16 Obstacle avoidance algorithm
Figure 17 General obstacle avoidance
Figure 18 Obstacle avoidance with boundary on the right
Figure 19 Obstacle avoidance with boundary on the left
Figure 20 Boundary avoidance with anti-clockwise rotation
Figure 21 Boundary avoidance with clockwise rotation
Figure 22 Boundary avoidance followed by obstacle avoidance
Figure 23 Obstacle avoidance followed by boundary avoidance
Figure 24 Obstacle detection
Figure 25 Boundary detection
Figure 26 Clearing obstacle on the side
Figure 27 Map produced from a full run
Figure 28 Demonstration robot path recorded by camera
Figure 29 Green triangle marking the robot's current position
Figure 30 Initial IR sensor performance
Figure 31 Sonar sensor measurements
Figure 32 Range of sonar sensor
Figure 33 IR sensor performance after the implementation of filter

List of Tables

Table 1 Analysis on sensor performance


1.0 Introduction

This project aims to design and build an autonomous Simultaneous Localisation and Mapping (SLAM) robot to perform the task of room vacuum cleaning. A SLAM robot has the capability of building a map of an unknown environment while simultaneously localising itself relative to that environment model [1]. A robotic vacuum cleaner can relieve people of tedious and time-consuming domestic cleaning tasks [2]. The two most widely known robotic vacuum cleaners are the iRobot Roomba and the Neato. The Roomba is intelligent enough to avoid obstacles such as furniture and reach areas such as wall corners when cleaning the floor. However, it moves randomly and therefore takes longer to complete the cleaning task. In addition to avoiding obstacles, the Neato applies a SLAM algorithm to map the house, localise the robot and plan its path, which saves a lot of cleaning time [3].

This report describes the problem and outlines the project specifications at the start. The initial concepts and developments of different systems including sensors, motion control, initialisation, field detection, room coverage and obstacle avoidance are explained afterwards. Finally, the testing and trial run results are discussed and recommendations for future developments are made.

2.0 Problem Description and Project Specifications

2.1 Problem Description
The SLAM vacuum cleaning robot aims to navigate and map a platform containing three obstacles without collision, whilst also achieving the maximum amount of platform coverage possible.

In the development of this project, expertise in motion actuation, sensors and microcontroller programming are required along with the skills of systems thinking and integration.

A pre-assembled robot chassis and a selection of sensors have been provided to implement the project objectives.

2.2 Project Specifications

2.2.1 Performance Specifications

2.2.1.1 Motion
• The robot should be autonomous.
• The robot should move as fast as possible (without affecting other performance requirements).
• The robot should move in the demanded directions without deviation or overshoot.

2.2.1.2 Room Coverage
• The robot should be able to find and start at the proper starting position for room coverage.
• The robot should be capable of deciding its path (i.e. whether to move along the long or short wall of the room).
• The robot should cover the entire field of the room.
• The robot should stop automatically when room coverage is completed.

2.2.1.3 Obstacle Avoidance
• The robot should be able to detect and avoid obstacles in its path of motion.


• The robot should be able to return to its original track after obstacle avoidance.
• The robot should be able to pick the shortest track in obstacle avoidance.
• The robot should not crash into any wall.

2.2.1.4 Localisation and Mapping
• The robot should be able to map the field dimensions.
• The localisation of the robot in relation to the field should be tracked in real time.
• The robot should be able to map the locations of obstacles.

2.2.2 Design Specifications
• The robot should only use the 5 given sonar and IR sensors.
• The robot should leave enough space for paper attachment during the demonstration.

3.0 Systems Design and Integration

3.1 IR and Sonar Sensor

3.1.1 Initial Concepts
The first concept developed (see figure 1) was to program the robot to move purely translationally, so that during room coverage it would not require turning and could save running time. To achieve this, four long-range IR sensors would be installed, one on each side of the robot, for obstacle detection and for localising the robot's current position. An ultrasonic sensor would be placed in the middle, attached to a servomotor and a gear box so that it could rotate 360 degrees to detect any obstacle in the path of motion.

Figure 1 Sensor Placement 1

Testing results showed instability in the long-range IR sensor readings, which would cause inaccuracy in localisation. These sensors also have a long dead band of 12 cm, which would create a blind spot around the robot's perimeter. Sonar readings require further interpretation to identify the actual obstacle location, and sonar signals are exposed to large interference, as discussed in the sonar sensor testing. Therefore this idea was rejected.

The second concept (see figure 2) was to have two long-range IR sensors on the left side to keep the robot aligned to the walls of the field, with one medium-range IR sensor on the left side and one on the end, intended for obstacle detection when the robot strafes to the right or reverses. The sonar sensor was planned to be attached to a servomotor and rotated continuously to detect obstacles near the robot.

Figure 2 Sensor Placement Concept 2

This idea was abandoned because it did not leave enough sensors for obstacle avoidance and used too many sensors for aligning to the wall, a task the gyroscope could perform instead.

3.1.2 Sensor Placement
Sensors are positioned so that the robot can detect any obstacle in its path. The details of the sensor placement are shown in figure 3. The overall system features are shown in Appendix A.


Figure 3 Final Sensor Placement

The front-facing sensors S1, S2 and S3 (front is relative to the robot's motion) are used for boundary (wall) detection and obstacle detection. S1 and S2 are long-range IR sensors placed at the very edges (left and right sides) of the vacuum cleaner. This arrangement prevents structures extending out from the main body from colliding with obstacles or walls. S3 is a short-range IR sensor placed in the middle of the robot to detect obstacles in the blind spot between S1 and S2. The dead bands of sensors S1, S2 and S3 are 10 cm, 10 cm and 4 cm respectively. These distances are offset on the robot, so there is no blind spot in front of it.

Medium-range IR sensor S4, placed on the right, detects walls on the right side of the robot and identifies whether the robot has passed an obstacle. The details are discussed in Section 3.8. For the same purpose, sensor S3 is mounted on a servomotor so that it can be rotated 90 degrees to face the left side, as shown in figure 4. The detailed mechanical drawing of the structure holding the servomotor is shown in Appendix B.

To reduce measurement errors caused by the moving direction of objects, all IR sensors are mounted vertically.

A Sonar sensor is placed on the left side of the robot to detect walls on the left of robot.

3.1.3 Filters
Initial sensor testing showed that the signals contain high-frequency noise. This noise had to be filtered to obtain more accurate and reliable readings. A number of software and hardware filters, such as a moving average filter and a capacitor, were installed and tested. Based on the testing results, the moving average software filter was chosen. This filter averages the most recent 18 readings, reducing and distributing errors evenly, which eliminated the effect of high-frequency noise. After filter integration, the sensor readings update every 40 ms.
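As an illustration of the software filter described above, a moving average over the most recent 18 readings could be implemented roughly as shown below; the analog pin, sampling rate and variable names are illustrative assumptions rather than the project code.

    // Moving-average filter over the most recent 18 readings (illustrative sketch).
    const int WINDOW = 18;
    float samples[WINDOW];
    int head = 0;           // slot that the next sample will overwrite
    int filled = 0;         // how many samples have been collected so far
    float runningSum = 0;

    float movingAverage(float newReading) {
      if (filled == WINDOW) {
        runningSum -= samples[head];   // drop the oldest sample from the sum
      } else {
        filled++;
      }
      samples[head] = newReading;      // store the newest sample
      runningSum += newReading;
      head = (head + 1) % WINDOW;      // advance the circular buffer
      return runningSum / filled;      // average of the collected samples
    }

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      float raw = analogRead(A0);           // raw IR reading (illustrative pin)
      Serial.println(movingAverage(raw));   // filtered value
      delay(2);                             // illustrative sampling interval
    }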

Testing of sensors before and after the implementation of filters are discussed in Section 4.1.

Figure 4 Servomotor and Sensor 3

3.1.4 Capacitor Filter
A low-pass RC filter was built and connected between the voltage input terminal of the sensor and ground to stabilise the power supply line, as shown in Figure 5. The capacitor used is 100 µF. However, the readings did not improve much compared to those obtained using only the moving average filter, so the capacitor was judged unnecessary.

3.2 MPU
The robot uses an MPU-9150 motion tracking device to control its direction. The device interfaces with the Arduino Mega microcontroller using the I2C protocol, requiring the connections SDA, SCL, GND, and VCC. The I2Cdev library handles communications. The device's address is 0x68. The MPU-9150 contains its own digital motion processor (DMP) to process and combine raw values from its sensors, and a MotionDriver library on the Arduino accesses outputs from the DMP and performs additional operations on the data. A calibration library provides functions to calibrate the accelerometer and magnetometer data (calibration of the gyroscope is not required). The device is set to update at 20 Hz and the MotionDriver library's low pass filter is set to automatic.

The accelerometer’s data is combined with the gyroscope’s to improve accuracy using the Kalman filter. The gyroscope and accelerometer have to rely on integration for measurements so their errors will accumulate over time. Using the magnetometer or combining it with the gyroscope could have been a solution to this since it is used as a compass and measures the absolute value. However it is too easily interfered with by local magnetic fields. Its measurements are noisy and have to be heavily weighted down in comparison to the gyroscope, and most importantly the magnetic fields generated on the robot platform itself will rotate together with the MPU device locking up its readings and making it useless. To attempt to resolve this, the device was raised up and mounted on a pole so that it is as far as possible from other components on the robot, especially the motors, but this did not work as well as needed so in the end the magnetometer was not used. With proper calibration of the accelerometer, the gyroscope and accelerometer combination was accurate enough to keep the robot straight during translational motion. When the robot rotates on-spot through a large angle there is always an added error in the direction of rotation, and this was fixed to an extent by balancing it out with a percentage reduction in the rotated angle.

There is always some significant drift in readings at the beginning, so the robot stays still and waits for the readings to settle (i.e. stay within 0.01 for 2 seconds).
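A minimal sketch of that settling check, using the thresholds quoted above; readYaw() is a hypothetical stand-in for the actual MPU/DMP read used on the robot.

    // Block until the yaw reading has stayed within 0.01 of a reference for 2 s (sketch).
    float readYaw();    // hypothetical placeholder for the MPU/DMP yaw read

    void waitForGyroToSettle() {
      float reference = readYaw();
      unsigned long windowStart = millis();
      while (millis() - windowStart < 2000UL) {   // reading must hold for 2 seconds
        float yaw = readYaw();
        if (fabs(yaw - reference) > 0.01) {       // drifted too far:
          reference = yaw;                        // restart the 2 second window
          windowStart = millis();
        }
      }
    }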

3.3 Motion Control
The robot is kept in the desired orientation using yaw measurements from the MPU-9150 motion tracking device. The difference between the current yaw measurement and the yaw set point is used to adjust the wheel speeds of the robot such that it rotates to align with the yaw set point. The set point is first assigned when the robot finds and aligns to a wall; from then on it is changed whenever the robot needs to rotate and head in a different direction.

Figure 5 Capacitor filter


The yaw readings range between negative and positive π, folding over to the other side at both ends. Normally the direction of the control effort would be decided by whether the difference between the feedback and the set point is positive or negative. However in this case if the difference between the current reading and the set point has an absolute value greater than π, it means that turning in the opposite direction would bring the angle closer to the set point because it crosses the roll-over point at ±π, so the actual difference is taken to have the following value with the inverted sign of the old difference.

2π − |difference|

This would happen when the angle and set point are close to the roll-over point and one of them crosses over to the other side when the robot is disturbed or the set point is rotated.

The control effort is evaluated with a proportional controller. Initially a PI controller was written, but the P controller was very effective and the integral component added no benefits. The steady state error could be brought to insignificant levels just by having a high proportional gain because the motor speeds were uniform enough that there were no large active disturbances. The high gain also did not have its usual drawback which is the amplification of noise because the processed gyroscope readings had no noise at all. The integral term also had to be sign-inverted with the calculated difference in the condition explained above.
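A sketch of the proportional controller with the ±π roll-over handling described above; the gain value is illustrative, not the tuned one used on the robot.

    // Proportional yaw controller with roll-over handling at +/- pi (sketch).
    const float KP = 2.0;    // illustrative gain, not the tuned value

    float yawControlEffort(float setPoint, float yaw) {
      float error = setPoint - yaw;
      if (fabs(error) > PI) {
        // The shorter way round crosses the +/- pi roll-over point, so take
        // 2*pi - |difference| with the sign of the old difference inverted.
        float wrapped = 2 * PI - fabs(error);
        error = (error > 0) ? -wrapped : wrapped;
      }
      return KP * error;   // added to one side's wheel speeds, subtracted from the other
    }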

The angle control effort is added to wheel speeds on one side of the robot and subtracted from wheel speeds on the other which causes the robot to turn - both while stationary and moving in any direction. This is illustrated in figure 6 and figure 7.

Figure 6 Angle control effort on wheel speeds for clockwise difference


Figure 7 Angle control effort on wheel speeds for counter-clockwise difference

The same controller is used to rotate the robot on the spot when changing directions: the angle of rotation is added to the set point and the robot reads the gyroscope and calculates the control effort until the readings come close enough to the set point. There is some overshoot after the turn because of the robot's stopping distance (angle) and possibly delays in the MPU. However these are ignored; the robot does not wait until the angle is finely adjusted but instead saves time by moving on to the next motion while the angle controller continues working and quickly returns the angle to its precise set point. The motors require some minimum level of input to activate, and the P controller output will not reach this level for small angle differences, even ones that are outside the tolerance. This requires that the control effort be padded with constants (i.e. a positive constant added to the control effort when it is positive and subtracted from it when it is negative) when the robot rotates on the spot. This is not a problem when the robot is already in translational motion, as the motors are already powered and so are sensitive to any additional changes to the input.

The motors have a maximum input value, so all requested outputs to the motors are scaled down, if required, by a factor needed to bring the maximum requested value to the input limit. This preserves the ratio of motor speeds and the desired motion.
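The mixing and rescaling could look roughly like the sketch below; setMotorSpeeds() is a hypothetical output helper and the 255 input limit is an illustrative value.

    // Mix the angle control effort into the four wheel speeds, then rescale so
    // no request exceeds the motor input limit (sketch).
    const float MAX_INPUT = 255.0;                                // illustrative limit
    void setMotorSpeeds(float fl, float fr, float rl, float rr);  // hypothetical helper

    void drive(float fl, float fr, float rl, float rr, float effort) {
      // Adding the effort on one side and subtracting it on the other turns the
      // robot, whether it is stationary or translating in any direction.
      float w[4] = { fl + effort, fr - effort, rl + effort, rr - effort };

      float maxRequest = 0;
      for (int i = 0; i < 4; i++) {
        if (fabs(w[i]) > maxRequest) maxRequest = fabs(w[i]);
      }
      if (maxRequest > MAX_INPUT) {
        float scale = MAX_INPUT / maxRequest;         // shrink every request by the
        for (int i = 0; i < 4; i++) w[i] *= scale;    // same factor to keep the ratios
      }
      setMotorSpeeds(w[0], w[1], w[2], w[3]);
    }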


3.4 State Diagram
The overall state diagram is shown below in figure 8. There are four main states in the program: initialisation, field detection, room coverage and obstacle avoidance. During the initialisation state, the MPU is stabilised and the robot is aligned to the wall. Afterwards, the program progresses to the field detection state, which measures the dimensions of the field and places the robot in the correct starting position. After successfully completing the field detection, the robot starts its room coverage and stops once it is completed. In all three states mentioned above, the program enters the obstacle avoidance state if the robot detects a wall or an obstacle. The localisation and mapping is based on the data collected during field detection, room coverage and obstacle avoidance.


Figure 8 Overall State Diagram


3.5 Initialisation
The algorithm for the initialisation state is shown in figure 9. The initialisation state starts with stabilising the MPU, followed by aligning the robot to a wall.

Figure 9 Algorithm for initialisation


To find and align to a wall, the robot moves forward. If one of its front IR sensors’ readings become low then it has come to a wall or an obstacle. If the other front IR sensors give moderate to large readings then it would appear to be an obstacle and the robot turns 90 degrees and continues forward. If the other readings are also reasonably small then it is most likely a wall and the robot will attempt to align.

To align to the wall the robot uses two of its front IR sensors. It slowly rotates in a direction evaluated from their readings. If the left sensor’s readings are larger than the right sensor’s readings the robot rotates clockwise, and otherwise anticlockwise. This continues until the left and right sensor readings have approached each other to within 1cm.

Figure 10 Robot rotating anti-clockwise when right distance further than left

Sometimes the sensor readings are inaccurate in such a way that the left and right distances are matched despite the robot being misaligned, and this generally happens at large normal-angles from the wall when the sensors are measuring surfaces that are further away and at more extreme angles. To overcome this, the robot does multiple alignments. After each alignment (except the last) the robot rotates a little more to get out of the previous orientation in case it was giving misleading sensor readings, and then aligns again. If the robot was already correctly aligned then it would always return to the correct alignment again because of the small normal angle. In this way, the risk of a bad alignment becomes smaller each time. In the end, to balance speed with accuracy, only two alignments are done.
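A sketch of the alignment procedure with the repeated passes described above; the sensor reads and slow-rotation helpers are hypothetical placeholders, and the nudge delay is illustrative.

    // Align the front of the robot to a wall using the two corner IR sensors (sketch).
    float readFrontLeftIR();         // S1 distance in cm (hypothetical helper)
    float readFrontRightIR();        // S2 distance in cm (hypothetical helper)
    void rotateClockwiseSlowly();
    void rotateAntiClockwiseSlowly();
    void stopMotors();

    void alignToWall() {
      const int PASSES = 2;          // two alignments balance speed and accuracy
      for (int pass = 0; pass < PASSES; pass++) {
        while (fabs(readFrontLeftIR() - readFrontRightIR()) > 1.0) {  // match to 1 cm
          if (readFrontLeftIR() > readFrontRightIR()) {
            rotateClockwiseSlowly();       // left side further away: rotate clockwise
          } else {
            rotateAntiClockwiseSlowly();   // right side further away: rotate anti-clockwise
          }
        }
        stopMotors();
        if (pass < PASSES - 1) {
          rotateClockwiseSlowly();         // nudge away from the found orientation,
          delay(300);                      // then align again in case the first match
          stopMotors();                    // came from misleading readings
        }
      }
    }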


3.6 Field Detection
After the initialisation state, field detection starts; its algorithm is shown in figure 11. The main aim of this state is to measure the dimensions of the field using the sonar sensor and find the starting point for room coverage.

When the initialisation state completes, the side of the robot is aligned parallel to an arbitrary wall in the room. The robot moves forward while detecting walls and obstacles. If the robot detects a wall, it always makes a counter-clockwise turn of 90°. This is due to the placement of the sonar sensor on the left-hand side of the robot.

After completing the first turn, the robot moves forward along Wall 1 and an algorithm records the maximum reading from the sonar sensor until the robot detects another wall. The sum of this maximum reading and the width of the robot is taken as the length of the wall adjacent to Wall 1. The same algorithm is applied when the robot moves along Wall 2, and the two maximum readings are compared after the robot detects another wall. The two possible scenarios are as follows:

• If the reading obtained from Wall 1 is larger than that obtained from Wall 2, it shows the length of Wall 2 is larger than the length of Wall 1 and the robot needs to complete another forward movement along the shorter wall so that it can start the room coverage along the longer wall.

• If the reading obtained from Wall 1 is smaller than that obtained from Wall 2, the length of Wall 2 is considered to be smaller than the length of Wall 1 and the robot is already parallel to the longer wall and ready to start the room coverage.
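As a rough sketch of this bookkeeping, the snippet below tracks the maximum sonar reading along each wall and applies the comparison just described; readSonarCM() and the robot width value are illustrative assumptions rather than the project's actual code.

    // Track the maximum sonar reading along each wall to estimate the field size (sketch).
    float readSonarCM();                  // hypothetical pulse-width sonar read
    const float ROBOT_WIDTH_CM = 20.0;    // illustrative robot width

    float maxAlongWall1 = 0;
    float maxAlongWall2 = 0;

    // Called repeatedly while the robot drives along a wall.
    void updateMax(float &maxReading) {
      float d = readSonarCM();
      if (d > maxReading) maxReading = d;
    }

    // After both walls have been traversed:
    //   length of the wall adjacent to Wall 1  ~  maxAlongWall1 + ROBOT_WIDTH_CM
    //   length of the wall adjacent to Wall 2  ~  maxAlongWall2 + ROBOT_WIDTH_CM
    // If maxAlongWall1 > maxAlongWall2, Wall 2 is the longer wall, so one more pass
    // along the shorter wall is needed before room coverage starts; otherwise the
    // robot is already parallel to the longer wall and can begin immediately.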

The obstacle avoidance algorithm in the field detection state is slightly different from that in normal room coverage and is more straightforward. The robot always strafes to the left when an obstacle is detected by any of the three front sensors, and moves forward once the front-right IR sensor (S2) clears the obstacle. It moves forward until the side-right IR sensor (S4) clears the obstacle and then strafes right back to its original path. The distance of the right strafe is determined by the distance counter implemented in the code.


Figure 11 Algorithm for field detection


3.7 Path of Room Coverage

3.7.1 Initial Concepts
Determining the path of movement for the robotic vacuum cleaner is an essential step in achieving optimum room coverage efficiently. Two important criteria for assessing the performance of the robot are the percentage of room coverage and the time taken to complete it. Two possible options, a zigzag path and a spiral path, were analysed and considered carefully during the initial development stage.

3.7.1.1 Zigzag Path
The sequence of the zigzag path is shown in figure 12. It starts with the robot moving forward, followed by a 90° turn when the robot sees a wall, a short forward movement and finally another 90° turn. This sequence is repeated until the robot covers the whole area.

The distance travelled by the robot can be determined using the following formulae:

$$Z_X = \left(\frac{W - b}{b} + 1\right)(L - a) \qquad (1)$$

$$Z_Y = W - b \qquad (2)$$

where Z_X is the distance travelled in the x direction, Z_Y is the distance travelled in the y direction, L is the length of the field, W is the width of the field, a is the length of the robot and b is the width of the robot.

By substituting L = 2000 mm, W = 1200 mm, a = 200 mm and b = 200 mm, Z_X is determined to be 10800 mm and Z_Y is determined to be 1000 mm, so the total length travelled by the robot on the zigzag path is 11800 mm.

Figure 12 Zigzag path

3.7.1.2 Spiral Path
The inward spiral path is shown in figure 13. It starts with the robot moving forward along one side of the walls, followed by three turns, moving forward again each time the robot reaches a corner. The sequence is repeated with decreasing forward distance and the robot finally ends up at the centre of the room.

The distance of the path travelled by the robotic vacuum cleaner can be calculated as follows:

$$S_X = (L - a) + \sum_{i=0}^{i_{max}} (L - a - ai) \qquad (3)$$

$$S_Y = \sum_{j=0}^{j_{max}} (W - b - jb) \qquad (4)$$

where (i + 1) is the number of forward movements the robot has completed in the x direction and j is the number of forward movements the robot has completed in the y direction.

By substituting L = 2000 mm, W = 1200 mm, a = 200 mm and b = 200 mm, imax is determined to be 9 and jmax is determined to be 5. However, the number of turns is governed by the smaller of imax and jmax thus imax = jmax = 5. Therefore, SX is 9600 mm, SY is 3000 mm and the total distance travelled by the robot in spiral path is 12600 mm.

3.7.1.3 Summary
The calculations show that the zigzag path is about 800 mm shorter than the spiral path, so the robotic vacuum cleaner saves roughly 6% of the travel distance (and therefore time) in achieving full room coverage when the zigzag path is implemented. In addition, the zigzag path is easier to design, implement and test than the spiral path.

Figure 13 Spiral path


3.7.2 Further Developments

3.7.2.1 Moving Direction
There are two possible options for the robot to perform a zigzag path in the room: (a) the robot moves forward along the direction parallel to the longer wall and makes turns when the robot sees the shorter wall; (b) the robot moves forward along the direction parallel to the shorter wall and makes turns when the robot sees the longer wall.

Option (a) has the drawback of accumulating gyro drift and motion controller error, which makes the path deviate from the original path parallel to the wall. Option (b) can reduce this error by re-aligning to the wall each time a wall is detected, but it makes the room coverage time longer because making turns slows down the robot.

The decision was made by assessing the performance of the MPU and the motion controller during testing. Initially, the gyro drift was too large for option (a) to be feasible. After adjusting the MPU and the motion controller and testing the robot, the gyro drift was minimised, which made option (a) the better choice.

3.7.2.2 Movement after Detecting a Wall
After determining the moving direction for the path, another decision was made on the type of movement after the shorter wall is detected. There are two options: (i) the robot makes either a clockwise or counter-clockwise turn of 90°, moves forward for a short distance and then makes another 90° turn in the same direction; (ii) the robot strafes to the left or right for a short distance after it detects a wall and then reverses back. The advantages and disadvantages of both options are as follows:

• Option (i) requires higher accuracy from the motion controller when making a 90° turn compared to option (ii).
• Option (i) can perform obstacle avoidance using the three IR sensors placed at the front of the robot, whereas option (ii) struggles to perform obstacle avoidance while strafing, as only one IR sensor is placed on the side.
• Option (ii) is unable to detect obstacles when reversing back, as that would require additional sensors facing towards the back of the robot.

The number of sensors is very limited, but the accuracy of the motion controller improved during testing, which made option (i) the better choice.

3.7.2.3 Room Coverage Algorithm
The room coverage algorithm is shown in figure 14. Room coverage starts from a corner of the room after the field detection stage, with the robot moving forward alongside the longer wall. As the robot makes counter-clockwise turns in the field detection stage, the first turn in room coverage should be counter-clockwise as well. A flag previous_turn is used in the code to record the direction of the previous turn completed by the robot. When the robot detects a wall, it makes a 90° turn in the opposite direction to the previous turn. The distance of the forward movement after the first 90° turn is determined by a counter implemented in the code. When the robot detects a wall again shortly after making the first 90° turn, this indicates that the robot has reached the corner. This makes the next sweep along the longer wall the final sweep, and the robot stops after it detects a wall in its final sweep. The final path of movement of the robot for room coverage is shown in figure 15. Obstacle avoidance during room coverage is discussed in Section 3.8.
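A minimal sketch of the turn-alternation and final-sweep logic described above; the motion helpers, the 2-second corner threshold and the flag names (mirroring previous_turn) are illustrative assumptions.

    // Alternate the 90-degree turn direction between sweeps (sketch).
    void turnClockwise90();
    void turnAntiClockwise90();
    void moveAcrossOneRobotWidth();
    void stopMotors();

    bool previousTurnCCW = false;    // direction of the previous 90-degree turn
    bool finalSweep = false;         // set once the robot has reached the far corner

    void onWallDetected(unsigned long msSinceLastTurn) {
      if (finalSweep) {              // wall reached at the end of the last sweep
        stopMotors();
        return;
      }
      // A wall detected shortly after the previous turn means the robot is in a
      // corner, so the next sweep along the longer wall will be the final one.
      if (msSinceLastTurn < 2000UL) finalSweep = true;   // illustrative threshold

      // Turn opposite to the previous turn, step across, then turn the same way again.
      if (previousTurnCCW) {
        turnClockwise90();
        moveAcrossOneRobotWidth();
        turnClockwise90();
      } else {
        turnAntiClockwise90();
        moveAcrossOneRobotWidth();
        turnAntiClockwise90();
      }
      previousTurnCCW = !previousTurnCCW;
    }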

Figure 14 Algorithm for room coverage


3.8 Obstacle Avoidance
The Obstacle Avoidance subsystem has the main objective of avoiding any obstacle or boundary (wall) without collision at any time during the run. Within each stage of the run, Obstacle Avoidance was modified to suit each case and produce the most efficient path to evade the obstacle detected.

Figure 16 is a block diagram summarising the Obstacle Avoidance subsystem.

Figure 15 Path of movement for room coverage


Figure 16 Obstacle avoidance algorithm


3.8.1 Initial Concepts
Initially, many approaches were considered and discussed as solutions to obstacle avoidance.

One concept was to curve the robot's movement around the obstacle; however, this would make the calculations and detection difficult to implement.

Another concept was to implement the use of fuzzy logic. Although highly effective if done correctly, fuzzy logic adds an unnecessary complexity to the system.

The final decision was to implement a system that was simple yet robust, using only forward, backward and horizontal (strafing) movements. The system developed was then simpler to modify and adapt to specific situations.

3.8.2 General Algorithms
To depict and explain how the robot handles each case of Obstacle Avoidance, figures of the algorithms are shown below with the movement of the robot presented in black.

The detection of an obstacle or a boundary is detailed in the following section in the report.

General Obstacle Avoidance
Assuming there is ample space around an obstacle, there is a general algorithm the robot follows to avoid an obstacle without collision.

In this case (see figure 17) the obstacle is detected in front of the robot. Once the obstacle has been seen, the robot traverses horizontally (left or right depending on which sensor detected it, explained later in the report) until the obstacle is cleared. The robot then moves forward until the obstacle is cleared in this direction as well. When the obstacle has been cleared in both directions, the robot returns to the horizontal position it occupied before the obstacle was detected. An example of this algorithm can be seen in the figure below.

Figure 17 General obstacle avoidance

Obstacle with Boundary Avoidance
When the general case of Obstacle Avoidance is combined with a boundary on either side of the robot, the algorithm has a slight modification.

Regardless of where on the front the obstacle is detected, the robot traverses horizontally away from the boundary until the obstacle is cleared. The robot then follows the general algorithm stated previously: clearing the obstacle by moving forward, before returning to the position it occupied before the obstacle was detected.

This case is most likely to occur in the Field Detection stage and at the beginning and end of Room Coverage. However, it can also occur within the general case of obstacle avoidance: while the robot traverses horizontally to clear the obstacle, a boundary may be detected, and this algorithm then overrides the general case.

Figure 18 and Figure 19 demonstrate this algorithm with a boundary on the right and then with a boundary on the left of the robot respectively. Once again, detection of a boundary is explained in the next section of the report.

Figure 18 Obstacle avoidance with boundary on the right

Figure 19 Obstacle avoidance with boundary on the left

Boundary Avoidance
When a boundary is detected by the robot, there is a sequence of actions taken to avoid collision with the boundary. This case of Obstacle Avoidance occurs in the Field Detection stage and in the general case of Room Coverage.

Firstly, the robot reverses far enough to allow rotation without impacting the boundary. Once enough distance has been cleared, the robot rotates 90 degrees and moves forward along the boundary a sufficient distance to cover a new section of the room. Finally, to continue the Room Coverage of the run, the robot rotates 90 degrees once more before continuing forward over the new section of the room. An example of this sequence can be seen below in Figure 20.


Figure 20 Boundary avoidance with anti-clockwise rotation

The direction of rotation is determined by the previous rotation of the robot: the current rotation is always in the opposite direction to the previous one. Further details of this algorithm are explained in the Room Coverage section of the report.

Figure 20 and Figure 21 depict this algorithm with an anti-clockwise rotation and then with a clockwise rotation respectively. Details of how a boundary is detected are mentioned in the next section of the report.

Figure 21 Boundary avoidance with clockwise rotation

Boundary with Obstacle Avoidance
Cases where boundary avoidance is interrupted by the detection of an obstacle were considered, and algorithms were developed to handle them.

Figure 22 below displays the situation where the robot has detected an obstacle whilst moving forward along the boundary. Here the robot is programmed to traverse horizontally away from the boundary until the obstacle is cleared. Once cleared the robot will resume the algorithm for boundary avoidance and move forward to the new area of coverage. Again the robot will then rotate to continue forward across the room.


In this case of Obstacle Avoidance a small section of area behind the obstacle may not be covered. However, an unnecessary amount of time would be taken to command the robot to navigate to this minor area and return to the desired path.

Figure 22 Boundary avoidance followed by obstacle avoidance

Figure 23 below displays the situation where the robot has detected an obstacle directly in front of a boundary. Here the robot will initially only detect that an obstacle is in the desired path and will perform the general case of Obstacle Avoidance. The robot is programmed to traverse horizontally until the obstacle is cleared, however a boundary will then be detected immediately after. To now avoid the boundary the robot will rotate 90 degrees twice before continuing forward across the room.

Again here, a small section of area near the obstacle may not be covered. With the same logic an unnecessary amount of time would have to be sacrificed to cover this minor area of the room.

Figure 23 Obstacle avoidance followed by boundary avoidance


3.8.3 Implementation of Sensors
Detection of an obstacle or boundary is the initial step and an integral part of the Obstacle Avoidance subsystem. Using the sensors placed on the robot, obstacles or boundaries are detected to trigger the Obstacle Avoidance algorithms developed. The various cases considered are described below.

Obstacle Detection
Obstacles are detected from the differences in readings between the robot's three front IR sensors. An obstacle is considered "seen" if one IR sensor's reading falls below a certain tolerance while the other sensors read well above it.

With three IR sensors placed along the front of the robot an obstacle can be detected in one of three ways. In Figure 24 below, all three cases can be seen.

Firstly, an obstacle can be "seen" by only one of the corner IR sensors: S1 (front-left) or S2 (front-right). Here the robot traverses horizontally in the direction with the shortest path to clear the obstacle, i.e. when S1 is triggered the robot strafes to the right, and when S2 is triggered it strafes to the left.

In the same manner, an obstacle can be detected by only S3, the front-middle IR sensor. Here, the direction the robot traverses is arbitrary. To simplify the algorithms, the robot always strafes to the left, using IR sensor S4 (side-right) to clear the obstacle (explained in further detail in a later section of the report).

The last case of obstacle detection is when the obstacle is "seen" by a combination of two sensors: S3 and one of the other IR sensors at the front of the robot. As in the previous case, the robot always strafes to the left regardless of the combination of sensors triggered.
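The detection cases above can be summarised in code roughly as follows; the tolerance value and names are illustrative rather than the project's actual figures.

    // Classify an obstacle from the three front IR readings and pick a strafe
    // direction (sketch). Distances in cm; the tolerance is illustrative.
    const float OBSTACLE_TOL = 15.0;

    enum StrafeDirection { NO_OBSTACLE, STRAFE_LEFT, STRAFE_RIGHT };

    StrafeDirection classifyObstacle(float s1, float s2, float s3) {
      bool leftSeen   = (s1 < OBSTACLE_TOL);   // S1: front-left
      bool rightSeen  = (s2 < OBSTACLE_TOL);   // S2: front-right
      bool middleSeen = (s3 < OBSTACLE_TOL);   // S3: front-middle

      if (leftSeen && rightSeen && middleSeen) {
        return NO_OBSTACLE;          // all three low: handled as a boundary instead
      }
      if (middleSeen) return STRAFE_LEFT;    // S3 alone or with S1/S2: always strafe left
      if (leftSeen)   return STRAFE_RIGHT;   // obstacle at the left edge: shortest path is right
      if (rightSeen)  return STRAFE_LEFT;    // obstacle at the right edge: shortest path is left
      return NO_OBSTACLE;                    // nothing seen
    }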

Figure 24 Obstacle detection

Boundary Detection
There are three cases of boundary detection that need to be considered in the developed system. Figure 25 shows all three of these cases.

A boundary on the right of the robot is detected if the IR sensor S4 – side-right, reads below a certain tolerance.


Boundaries in the front of the robot are detected by comparing the readings from all three of the front facing IR sensors. A boundary is determined as detected when all three IR sensors are read to be below a certain tolerance.

Similarly, a boundary on the left of the robot is detected if the sonar sensor reads below a certain tolerance.

Figure 25 Boundary detection

Clearing an Obstacle
To determine whether an obstacle has been cleared, the IR sensors S3 (front-middle) and S4 (side-right) are used. Figure 26 shows the two cases of clearing an obstacle.

With every obstacle detected, S3 is rotated 90 degrees anti-clockwise by a servo motor, switching its role from detecting obstacles ahead to checking that the obstacle has been cleared. Once the obstacle is cleared, or if the obstacle detection algorithm has been interrupted, the servo motor rotates S3 back to its original position to scan for obstacles.

Initially the obstacle is detected on the left when the reading is below a given tolerance; once the reading rises well above this tolerance again, the obstacle is considered cleared. Because S3 is placed near the front of the robot, there is still some time before the robot actually clears the obstacle. As a solution, a counter has been implemented to compensate for this lag, explained further in the next section of the report.

When the obstacle is detected on the right of the robot the same logic applies; the only difference is the use of S4 instead of S3. Also, with S4 placed near the back of the robot, there is no need for a counter to delay the clearing of the obstacle.
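A sketch of how S3 might be swung to the side with the Arduino Servo library and used to check clearance; the servo pin, angles, tolerance and the readS3CM() helper are assumptions for illustration only.

    // Rotate S3 to face the side while clearing an obstacle on the left (sketch).
    #include <Servo.h>

    Servo s3Servo;
    const int S3_FORWARD_ANGLE = 0;     // scanning for obstacles ahead
    const int S3_SIDE_ANGLE    = 90;    // rotated 90 degrees to face the obstacle
    const float CLEAR_TOL_CM   = 20.0;  // illustrative clearance tolerance
    float readS3CM();                   // hypothetical S3 distance read

    void initS3Servo() {
      s3Servo.attach(9);                // illustrative servo signal pin
      s3Servo.write(S3_FORWARD_ANGLE);
    }

    // Called while the robot moves forward past an obstacle on its left.
    bool obstacleClearedOnLeft() {
      s3Servo.write(S3_SIDE_ANGLE);
      if (readS3CM() > CLEAR_TOL_CM) {        // reading jumps back up once past it
        delay(400);                           // S3 sits near the front, so allow some
        s3Servo.write(S3_FORWARD_ANGLE);      // extra travel before the body is clear,
        return true;                          // then return S3 to forward scanning
      }
      return false;
    }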


Figure 26 Clearing obstacle on the side

3.8.4 Implementation of Code
To implement the developed system in code, a combination of conditions, flags, counters and tolerances was used. The main technique used was digital logic.

To code the detection of an obstacle or boundary, conditions within if statements were used to distinguish the different cases of obstacle detection.

Along with these conditions, flags were used to indicate which point of a specific Obstacle Avoidance algorithm the robot was at and to trigger the next stage in the sequence. These flags also added safety and robustness, ensuring stages were not interrupted by unwanted signals.

Counters were used in parts of the Obstacle Avoidance algorithms to delay certain actions for a predetermined amount of time. The main examples are whenever the robot has to reverse, and when it clears an obstacle horizontally and later returns to its previous position. The reverse counter is fixed, whereas the obstacle-clearing counter depends on how long the robot moved to clear the obstacle. This variable counter is implemented with the millis() function, which gives the time in milliseconds since the program started.
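A minimal sketch of such a millis()-based variable counter, assuming simple strafeLeft(), strafeRight() and stopMotors() motion helpers (hypothetical names):

    // Time how long the robot strafes to clear an obstacle, then strafe back for
    // the same duration to rejoin the original path (sketch).
    void strafeLeft();     // hypothetical motion helpers
    void strafeRight();
    void stopMotors();

    unsigned long strafeStart = 0;       // timestamp when the sideways motion began
    unsigned long strafeDuration = 0;    // how long it took to clear the obstacle

    void beginSidewaysAvoidance() {
      strafeStart = millis();
      strafeLeft();
    }

    void onObstacleClearedSideways() {
      strafeDuration = millis() - strafeStart;   // the variable "counter" in milliseconds
    }

    void returnToOriginalPath() {
      strafeRight();
      delay(strafeDuration);                     // strafe back for the same amount of time
      stopMotors();
    }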

Tolerances were a simple but crucial part of the code affecting the performance of the Obstacle Avoidance subsystem, determining whether an obstacle or boundary was actually avoided or whether the robot would collide. Extensive testing throughout the development of this subsystem was undertaken to tune these tolerances appropriately.


3.9 Localisation and Mapping
The robot estimates its position using dead reckoning and its gyroscope readings. Its position is updated at intervals of 100 ms. However, this has to be polled because the interrupt timers are occupied by PWM outputs, so the elapsed intervals vary with the time it takes for the main loop to execute. The actual elapsed time is measured and used to obtain the estimated distance. The mapping function uses the current motion state (one of forward, reverse, left, and right) to obtain the estimated travelled distance in its own coordinate frame. It then multiplies this by a rotation matrix to obtain the actual displacement with respect to the map frame and adds it to the position estimate. The angle of the rotation matrix is the difference of the gyroscope reading from the zero angle of the map frame, which is assigned when the robot has aligned itself to a wall.

$$R_{map}^{robot} = \begin{bmatrix} \cos\theta & -\sin\theta & x \\ \sin\theta & \cos\theta & y \\ 0 & 0 & 1 \end{bmatrix} \qquad (5)$$

This is the homogeneous matrix describing the complete transformation, including both translation and rotation, from the robot’s coordinate frame to the map’s coordinate frame. Here x and y are the robot’s coordinates on the map and theta is calculated as follows:

$$\theta = \text{current angle reading} - \text{zero angle of map frame} \qquad (6)$$

The robot sends its coordinates at 500ms intervals to its Bluetooth serial port, encoding them with the prefix “pos” followed by space-separated values of the coordinates. When the robot finishes the initial-mapping process it sends the estimated dimensions of the room to the serial port, encoding them with the prefix “map” followed by space-separated values of the length and width. Upon detecting an obstacle, the robot estimates the obstacle’s displacement relative to its own coordinate frame, taking into consideration whether the obstacle was detected by its left, right, or centre IR sensor, multiplies this by the same rotation matrix as used in position estimation, and adds it to the robot’s position to obtain the estimated position of the obstacle. It then sends this to the serial port, encoding it with the prefix “obs” followed by space-separated coordinates.
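A rough sketch of the dead-reckoning update and the prefixed messages described above; the speed constant, state names and the use of Serial for the Bluetooth port are illustrative assumptions.

    // Dead-reckoning position update and prefixed Bluetooth messages (sketch).
    enum MotionState { ST_FORWARD, ST_REVERSE, ST_LEFT, ST_RIGHT };

    const float SPEED_MM_PER_MS = 0.3;   // illustrative straight-line speed
    float posX = 0, posY = 0;            // robot position in the map frame (mm)
    float zeroAngle = 0;                 // map-frame zero angle, set after wall alignment

    void updatePosition(MotionState state, unsigned long elapsedMs, float yaw) {
      float d = SPEED_MM_PER_MS * elapsedMs;     // distance travelled in the robot frame
      float dx = 0, dy = 0;
      if      (state == ST_FORWARD) dx =  d;
      else if (state == ST_REVERSE) dx = -d;
      else if (state == ST_LEFT)    dy =  d;
      else if (state == ST_RIGHT)   dy = -d;

      float theta = yaw - zeroAngle;               // angle of the rotation matrix
      posX += dx * cos(theta) - dy * sin(theta);   // rotate the step into the map frame
      posY += dx * sin(theta) + dy * cos(theta);
    }

    void sendPosition() {          // "map" and "obs" messages use the same prefix scheme
      Serial.print("pos ");
      Serial.print(posX);
      Serial.print(" ");
      Serial.println(posY);
    }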

An app was written in C# and Windows Presentation Foundation to monitor the serial port and graphically represent mapping information sent by the robot. It uses the open source WPF graphing package Dynamic Data Display to draw the map with line graphs. It also displays the serial port and allows the user to obtain unrelated debugging information at the same time that it extracts the encoded mapping information.

The app does not plot anything until the robot sends the “map” command with the estimated dimensions of the room, signalling that it has finished initial-mapping and is ready to start proper room coverage. The app then draws the room with several points forming a rectangle with the specified dimensions.

Upon receiving a robot position update it adds the new coordinates to a line graph that contains all the robot’s previous positions, showing the robot’s path. To highlight the latest position it plots a triangle around the new coordinate and removes the previous highlighter plot. When it receives coordinates for an obstacle it plots a circle around those coordinates.


Figure 27 Map produced from a full run

Figure 27 shows a demonstration run with the robot's data mapped by the app. The robot avoided two of the three obstacles but ran into the third; it then detected that obstacle again, producing an error that left it moving off to one side in an unresponsive state. This explains the vertical line extending beyond the bottom edge, as the robot continually tried to move while pressed against the wall. The short lines sticking out from either side are from the robot reversing slightly each time after reaching a wall. Figure 28 shows the path of the robot as tracked by the camera. Note that it is vertically inverted and includes the robot's movements during field detection (finding a wall and travelling along the perimeter), which the app omits. Apart from these differences the two are reasonably close. The camera-tracked path suffers from fish-eye distortion, and the obstacle positions relative to the robot's path may be slightly inaccurate because the robot's height differs from the height at which the camera was calibrated.

Figure 28 Demonstration robot path recorded by camera

The position marker was out of view in the screenshot of the app. Figure 29 below shows the marker as it normally appears.

Figure 29 Green triangle marking the robot's current position


4.0 Testing and Results

4.1 Sensor Testing

To assess their accuracy, sensor readings were taken at a range of known distances from a stationary wall.

4.1.1 Initial Sensor Testing

4.1.1.1 IR Sensor Testing

The relationship between the readings of medium-range IR sensors S1 and S2 and the actual measured distance is shown in Figure 30.

Figure 30 Initial IR sensor performance

The plots on this graph are roughly straight lines, which indicates that the sensor readings are relatively linear. However, the deviation between the sensor readings and the actual values is significant, which suggests that the readings should be filtered to reduce the effect of high-frequency noise.

The long-range IR sensor was also tested in the same way. Its readings were very unstable, fluctuating around the true value by approximately 8 cm.

4.1.1.2 Sonar Sensor Testing

The sonar sensor has three different output methods:

• Analog
• Pulse width
• Digital

Based on the test results, the pulse-width output was chosen, as it produced the most accurate readings of the three methods.



Figure 31 Sonar sensor measurements

As shown in Figure 31, the sonar readings form an almost straight line that closely follows the actual distance line, indicating that the sonar sensor has high accuracy and linearity, and that filtering may not be necessary. However, it was also noticed that the sonar gave erratic readings at the corners of the field due to signal reflection.

The sonar signal is emitted in a cone with an approximately 100° spread. This makes the sonar potentially unreliable: in the scenario shown in Figure 32, even with an object inside the detection area, part of the signal can still reach the wall behind it. In addition, because the sonar signal is a cone rather than a beam, a reading indicates only distance and not direction, so further calculation is required to determine a detected obstacle's actual position relative to the robot.

The sensors were compared on range, linearity, and stability to decide which would be used; the results are summarised in Table 1.

Table 1 Analysis on sensor performance

Sensor            Range (cm)   Linearity    Stability
Long-range IR     20-150       Non-linear   Low
Medium-range IR   10-80        Linear       High
Short-range IR    4-30         Linear       Medium
Sonar             16-645       Linear       Low

Figure 32 Range of sonar sensor


• The medium-range and short-range IR sensors are linear and stable, which makes them more predictable and easier to calibrate.

• The long-range IR sensor has a 20 cm dead band, which could be problematic for obstacle detection.

The medium-range and short-range IR sensors were therefore chosen for obstacle detection.

4.1.1.3 Sensor Performance with Filtering

To reduce high-frequency noise, the sensor readings had to be filtered in software or hardware. Several software filters, including a simple moving average, exponential averaging, and the Savitzky-Golay filter, were applied to the IR sensors and tested again using the method described above to determine whether filtering improved performance. These filters are provided by Microsmooth, a smoothing library designed for the Arduino platform [4].

Figure 33 IR sensor performance after the implementation of filter

The simple moving average filter generated the most accurate readings. Figure 33 shows a clear reduction in error after the filter was incorporated; the readings are highly accurate when the sensor is less than 30 cm from the obstacle. Refer to Appendix C for the test results using the other two filters.
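For reference, a simple moving average of this kind can be implemented in a few lines. The sketch below is a generic illustration only; it does not reproduce the Microsmooth library's API, and the window size and pin are arbitrary.

```cpp
// Generic simple moving average over the last WINDOW raw ADC readings.

const int WINDOW = 8;                   // arbitrary window size
int samples[WINDOW] = {0};
int sampleIndex = 0;
long runningSum = 0;

int smoothedRead(int pin) {
  int raw = analogRead(pin);
  runningSum -= samples[sampleIndex];   // drop the oldest sample
  samples[sampleIndex] = raw;           // store the newest one
  runningSum += raw;
  sampleIndex = (sampleIndex + 1) % WINDOW;
  return runningSum / WINDOW;           // average of the last WINDOW samples
}
```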

4.2 Field Detection Testing

During initial testing of the field detection algorithm, the main issue was found to be drift of the robot as it moved forward along the wall. This was caused by the slightly unstable and inaccurate MPU readings as well as a poorly tuned motion controller. After calibrating the MPU and tuning the motion controller, the drift was largely eliminated. The other issue was collisions between the robot and the walls, which was solved by implementing a reverse function after a wall was detected.

4.3 Room Coverage Testing

During testing of the room coverage algorithm, the main issue was frequent collisions between the robot and the walls. This was because the robot's speed, combined with the relatively slow update rate of the sensor readings, left too little reaction time. A reverse function was implemented after each wall detection, and this proved to eliminate the collisions.

4.4 Obstacle Avoidance Testing

4.4.1 General Obstacle Avoidance

Once the sensors were in place, obstacle avoidance testing began with simply detecting an obstacle. Initially the two front-facing sensors on either side (S1 and S2) were used to test the tolerance and triggering of obstacle detection and the horizontal movement used to avoid a collision. Before counters were implemented, this movement ran for a set amount of time. When detection and triggering performed satisfactorily, S3 was included and tested. Tests were carried out to achieve the correct directional response and to tune the detection tolerance based on the robot's behaviour.

Once detection was satisfactory, the fixed-duration horizontal movement was replaced with movement that continued until an IR sensor reading indicated clearance. This also tested the flag that triggers the robot's forward movement after clearance.

Similarly, the forward movement after clearing the obstacle horizontally initially ran for a fixed amount of time, after which a fixed return movement back towards the original path was triggered. These constant delays were used to test the robot's movement and its response to the commands given.

The next stage of testing was to replace these constant delays with appropriate processing of the sensor readings, so that the system could handle obstacles of any size.

A counter was implemented alongside the sensor readings to determine how far the robot needed to return to its original path, and sensor readings were also used to confirm that an obstacle had been cleared while the robot was moving forward. At this point a servo motor was added to rotate one IR sensor for checking obstacle clearance. Testing focused on refining the sensor tolerances so that each stage was triggered appropriately. A sketch of this staged structure is given below.
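The staged behaviour described above amounts to a small state machine. The following is a simplified sketch of that structure; the state names, thresholds, and movement helpers are hypothetical, and the real code handles more cases (left/right strafing, boundary checks) than shown here.

```cpp
// Simplified sketch of staged obstacle avoidance (hypothetical names/values).
// Each stage runs until its sensor condition or counter is satisfied.

enum AvoidStage { WATCH, STRAFE_ASIDE, PASS_OBSTACLE, RETURN_TO_PATH };

const float OBSTACLE_TOLERANCE = 15.0;  // cm, assumed trigger distance
const float CLEAR_TOLERANCE    = 25.0;  // cm, assumed clearance distance

extern void strafeRight();
extern void strafeLeft();
extern void moveForward();

AvoidStage stage = WATCH;
int strafeCounter = 0;   // counts strafe steps so the same distance can be returned

void avoidanceStep(float frontDist, float servoDist) {
  switch (stage) {
    case WATCH:                                   // obstacle appears ahead
      if (frontDist < OBSTACLE_TOLERANCE) { stage = STRAFE_ASIDE; strafeCounter = 0; }
      break;
    case STRAFE_ASIDE:                            // slide sideways until the front is clear
      strafeRight();
      strafeCounter++;
      if (frontDist > CLEAR_TOLERANCE) stage = PASS_OBSTACLE;
      break;
    case PASS_OBSTACLE:                           // drive past; servo-mounted IR checks clearance
      moveForward();
      if (servoDist > CLEAR_TOLERANCE) stage = RETURN_TO_PATH;
      break;
    case RETURN_TO_PATH:                          // strafe back by the counted amount
      strafeLeft();
      if (--strafeCounter <= 0) stage = WATCH;
      break;
  }
}
```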

Common issues were related to tolerances and triggers. Tolerances that were either too large or too small caused obstacle detection to fail, and other tests exposed errors in logic or conditional statements where the robot did not trigger the next stage of the algorithm correctly.


4.4.2 Boundary Avoidance

Again, the first thing tested was the tolerance applied to detecting a boundary in front of the robot, to ensure the robot would not collide with it and would be ready to take the appropriate sequence of actions to avoid it.

Once detection was achieved, the correct sequence of actions was developed: a 90-degree rotation, a short movement forward, another 90-degree rotation, and then continued forward movement across the room. Testing was done to adjust the forward distance so that a new strip of the room was covered, and to tune the rotations. The rotation in particular was tested thoroughly to reduce overshoot and achieve the commanded heading.

When the sequence of actions was working satisfactorily, there were still problems with boundary collisions while the robot performed the rotations. The solution was to reverse the robot just far enough to clear the rotation once a boundary was detected. Testing here involved adjusting the reverse distance and ensuring the robot triggered the appropriate actions after reversing. The full turn sequence is sketched below.
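Putting the above together, the boundary turn can be sketched as a short sequence of closed-loop moves. The helper functions and the reverse, step, and rotation amounts below are placeholders, not the values used on the robot.

```cpp
// Simplified sketch of the boundary turn (hypothetical helpers and placeholder
// values). turnDirection alternates so successive lengths sweep the room in
// opposite directions.

extern void rotateDegrees(float degrees);   // closed-loop rotation using the MPU yaw
extern void driveForwardCm(float cm);
extern void reverseCm(float cm);

int turnDirection = 1;    // +1 = turn one way, -1 = turn the other

void boundaryTurn() {
  reverseCm(5.0);                        // back away so the rotation cannot clip the wall
  rotateDegrees(90.0 * turnDirection);   // first 90-degree rotation
  driveForwardCm(20.0);                  // move onto the next strip of the room
  rotateDegrees(90.0 * turnDirection);   // second 90-degree rotation
  turnDirection = -turnDirection;        // alternate for the next boundary
  // the main loop then resumes forward movement across the room
}
```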

4.4.3 Distinctive Cases

Once general obstacle and boundary avoidance had been developed, cases where the two interact had to be considered. Many alternative methods were trialled and improved, evolving into the algorithms explained previously. Testing here determined which method was the most efficient and, again, ensured that the robot did not get caught in any stage of the algorithm or cause any false triggers.

5.0 Discussion and Future Improvements

The trial run showed satisfactory performance in room coverage, obstacle avoidance, and localisation and mapping. The robot aligned to the wall properly without collision or angular deviation, and mapped the field with accurate dimensions. During room coverage it travelled along the planned path and avoided most obstacles with very little irregular or unexpected behaviour, achieving 80% room coverage within 1 minute and 30 seconds. The generated path map matched the robot's real path during the demonstration to better than 90% similarity.

The one major problem in the system was avoiding obstacles near a boundary. On more than one occasion during testing of this situation, the robot detected the obstacle and executed obstacle avoidance, but then immediately detected the obstacle again and became stuck moving horizontally away from the boundary. Further testing and modification are needed to correct this error.

It was also noticed during the trial run that the robot crashed into the wall every time it executed the stop function. To prevent the robot from colliding with the wall when executing 'stop', the trigger distance should be increased to a safer value.

The existing IR sensors could be replaced by more accurate laser sensors, for example a laser rangefinder with a mirror that rotates through 180° and reads at every angle. This would ease programming and installation, increase the reliability of the sensor readings, and reduce the number of sensors required.

These changes could substantially improve the robot's performance.

6.0 Conclusions

• A SLAM robotic vacuum cleaner was designed and built, incorporating several sensors, an MPU, a chassis, and an Arduino microcontroller.
• Sensors were placed, calibrated, and tested to achieve the best performance.
• The MPU was installed and calibrated for the purpose of motion control.
• Initialisation, field detection, room coverage, and obstacle avoidance are the main states of the program.
• Data collected during field detection, room coverage, and obstacle avoidance are used to build the map of the field and simultaneously localise the robot.
• Testing was done on each state and on the overall system to improve performance.
• The final run was completed within 1 minute and 30 seconds and achieved around 80% room coverage, with the robot colliding with two obstacles and a wall.
• Future improvements, such as modifying the stop function and replacing the IR sensors with more accurate laser sensors, could be made to achieve better performance.


7.0 References

[1] C. Wang and C. Thorpe, "Simultaneous Localisation and Mapping with Detection and Tracking of Moving Objects," IEEE Int. Conf. on Robotics and Automation, May 2003.

[2] Y. W. Bai and M. F. Hsueh, "Using an Adaptive Iterative Learning Algorithm for Planning of the Path of an Autonomous Robotic Vacuum Cleaner," IEEE Int. Conf. on Consumer Electronics, 2012.

[3] K. M. Hasan, A. Al-Nahid and K. J. Reza, "Path Planning Algorithm Development for Autonomous Vacuum Cleaner Robots," IEEE Int. Conf. on Informatics, Electronics & Vision, 2014.

[4] Microsmooth smoothing library for Arduino. Retrieved June 2015 from https://github.com/asheeshr/Microsmooth.


Appendix A


Appendix B


Appendix C

(Chart: IR sensor performance with the Savitzky-Golay filter, showing sensor readings (cm) against actual distance (cm) for medium-range S1, medium-range S2, and short-range S3.)

(Chart: IR sensor performance with the exponential averaging filter, showing sensor readings (cm) against actual distance (cm).)
