
SLAM Robot Project

Tobias Looker, Keegan Brown, Kade Turner and Mark Walbran
University of Auckland
Faculty of Engineering
[email protected]


Executive Summary

This report details the design, development and testing of a Mecanum-wheeled simultaneous localization and mapping (SLAM) robot. The robot was developed for the purpose of autonomously vacuum cleaning a room while avoiding obstacles. Highlights of our design include its simplicity, minimal hardware usage, modular software design and efficiency.

Unfortunately, on the morning of assessment there were hardware issues with the robot and it did not perform as desired during the demonstration. However, videos and camera mapping of the robot have been provided, demonstrating satisfactory performance on the day before the demonstration and the night after it.


Contents

Executive Summary
Contents
Figures
Introduction
Robot Control
Motor Control
Trajectory Control
Wall Follow Control
Yaw Control
Control System Integration
Position Control
Odometry
Collision Detection
Blockage Identification
Obstacle Detection
Scanning
Geometry calculations
Obstacle Avoidance
X Movement
Y Movement
Wall obstacle case
Movement arrays
Finite State Machine
Initializing
Start
Running
Collision Detected
Avoid Collision
Stop
Code Structure
Sensors
MPU
Infrared Proximity Sensor
Sonar
Scanning
Sensor Class
3D Design
Mapping
Overview
Algorithm
Conclusions
References


Figures

Figure 1 - All motors RPM to PWM relationship
Figure 2 - Linear speed to RPM relationship
Figure 3 - Full forward and reverse PWM to RPM relationship
Figure 4 - Spiral algorithm path
Figure 5 - Wall following control system diagram
Figure 6 - Yaw control system diagram
Figure 7 - Control system integration depiction
Figure 8 - Position control system diagram
Figure 9 - Global vs local co-ordinate system diagram
Figure 10 - Odometry without complementary filter
Figure 11 - Odometry without corner creep adjustment
Figure 12 - Odometry with complementary filter and creep adjustment
Figure 13 - Obstacle Scan
Figure 14 - Obstacle diameter diagram
Figure 15 - Local X and Y position offset diagram
Figure 16 - Local X and Y obstacle avoidance path
Figure 17 - FSM Diagram
Figure 18 - 4 stages of initial positioning in the start state
Figure 19 - Class Diagram
Figure 20 - SMD Decoupling Capacitor
Figure 21 - Correct Sensor Orientation
Figure 22 - Long Range IR Calibration
Figure 23 - Short Range IR Calibration
Figure 24 - Scanning Head Photo Interrupter
Figure 25 - Photo Interrupter Circuit
Figure 26 - Stepper Mount Design Revisions
Figure 27 - Stepper Mount CAD
Figure 28 - Scanning Head Design Iterations
Figure 29 - Final CAD Design
Figure 30 - Final Robot Hardware Design
Figure 31 - Final Robot Path When Operational


Introduction

This project aimed to develop an autonomous robot capable of mapping its environment and plotting its path within it, commonly known as a SLAM robot. This problem is a difficult one to solve, since reliable position information requires good map data, and vice versa. Since we begin with neither, any measurement errors can easily compound and a positive feedback situation can occur, producing increasingly poor data.

Our robot uses two IR sensors (one mounted facing to the right and one mounted on a stepper motor shaft) to gather information about its surroundings. Position is calculated from wheel odometry, and the orientation of the robot is measured by a 9-axis MPU. This information is processed by the Arduino sketch, which instructs the robot to explore its surroundings, covering all open ground and avoiding obstacles.

The critical tasks were identified (motor control and movement, sensor management and scanning, map generation and presentation), split between the members of the group, and developed separately during the early stages. The biggest problems that arose came from code execution timing, keeping data current across all modules, and dealing with the noise associated with sensor readings.


Robot Control

Motor Control

During the initial stages of testing the provided robot, it became clear that there were significant differences between the speeds of the motors when provided with the same PWM signal. Furthermore, each motor's speed changed non-linearly with increases in the PWM signal, which caused further complications with the initial motion of the device. To account for these discrepancies and investigate the actual relationship between PWM and motor speed, a series of tests was conducted using a hand-held tachometer. From the results, an equation for the relationship between PWM and motor speed for each motor in both directions was obtained; as suspected, the relationships came out as exponential, as shown below.

Figure 1 - All motors RPM to PWM relationship

To form a more precise representation of the motor characteristics, a 4th-order polynomial was used instead of the exponential. After obtaining a relationship for the forward and backward motion of each motor, the equations were implemented in the code. Instead of directly writing PWM values to each of the motors, speed values were used; these were passed through the obtained equations to yield the PWM values needed to achieve the desired speed. This resulted in more predictable motion of the robot and the ability to vary the robot's speed linearly.

To prove that the motor equations resulted in a linear speed change of the robot, another series of tests was conducted. The robot was driven in all 4 main orientations for 1.5 metres at a series of speeds ranging from 35 to 115 RPM. The time to complete these paths was recorded and used to obtain the robot's average velocity over the trajectory. The resulting data showed a linear relationship for each direction of travel, verifying that the 4th-order motor equations work as required.


Page 9: SLAM Robot Project - Aucklandhomepages.engineering.auckland.ac.nz/~pxu012/mechatronics2015/group8.pdf · Title 1 Date Executive Summary This report details the design, development

Title 8

Date

Figure 2 - Linear speed to RPM relationship (linear fit: y = 0.0026x + 0.017)

As shown above, this testing verified that the initial linearization from PWM to RPM was correct. The equation obtained through this testing was essentially the same as would be obtained through a theoretical RPM to linear velocity conversion. However, creating the relationship theoretically would leave out slip and other correction factors, so it was determined that a more direct means of calculation would yield more accurate results.

Once the motor speeds of the robot were converted to the required linear speeds, inverse kinematics were applied to the overall X and Y velocity vectors to determine the components required from each motor. The resulting equations are shown below.

V_MotorTopLeft = Y_speed + X_speed − Yaw_speed
V_MotorTopRight = −Y_speed + X_speed − Yaw_speed
V_MotorBottomRight = Y_speed − X_speed − Yaw_speed
V_MotorBottomLeft = −Y_speed − X_speed − Yaw_speed

The resulting motor control system consisted of two main classes, Motor and MotorController. A Motor instance was created for each of the robot's 4 motors; this class contained the equation representing the motion of the motor in the forward and reverse directions. A desired speed is sent to this class and converted by the equation into the required PWM signal. The MotorController class received commands from the higher-level control in the form of X, Y and Yaw speeds, converted them into the required speed for each motor, and sent the desired speed commands to the 4 Motor instances.
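The split between the two classes can be sketched as follows. This is an illustrative reconstruction rather than the project's code: the polynomial coefficients, the stand-in writeMotorPwm() pin handling, and the 100 RPM cap (taken from the odometry section later) are placeholders.

    #include <Arduino.h>
    #include <math.h>

    // Stand-in for the per-motor PWM/direction pin handling (hardware specific).
    void writeMotorPwm(int motor, int pwm, bool forward) { /* set pins here */ }

    // Placeholder 4th-order fit from the tachometer tests; the real project
    // used separate coefficients per motor and per direction.
    float rpmToPwm(float rpm) {
      const float c[5] = {12.0f, 1.1f, 0.02f, 1.0e-4f, 2.0e-7f};  // illustrative
      float pwm = 0.0f, p = 1.0f;
      for (int i = 0; i < 5; i++) { pwm += c[i] * p; p *= fabsf(rpm); }
      return pwm;
    }

    // Mecanum inverse kinematics with the saturation scale factor described
    // above; the factor is returned so the layer above can keep its position
    // estimate consistent with the speeds actually achieved.
    float setRobotSpeed(float xSpeed, float ySpeed, float yawSpeed) {
      float v[4] = {
         ySpeed + xSpeed - yawSpeed,   // top left
        -ySpeed + xSpeed - yawSpeed,   // top right
         ySpeed - xSpeed - yawSpeed,   // bottom right
        -ySpeed - xSpeed - yawSpeed }; // bottom left
      const float maxRpm = 100.0f;
      float factor = 1.0f;
      for (int i = 0; i < 4; i++)
        if (fabsf(v[i]) > maxRpm) factor = fminf(factor, maxRpm / fabsf(v[i]));
      for (int i = 0; i < 4; i++) {   // the same factor on all wheels keeps
        float rpm = v[i] * factor;    // the commanded motion direction intact
        writeMotorPwm(i, (int)rpmToPwm(rpm), rpm >= 0.0f);
      }
      return factor;
    }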



Figure 3 - Full forward and reverse PWM to RPM relationship

Trajectory Control

This was the first controller implemented above the motor controller; its purpose was to instruct the robot on how to move. The inputs to this controller were X and Y speeds, which were handed down to the MotorController and converted into the required speed for each motor. A speed factor was handed back from that level; this factor scaled the speeds to ensure that none of the motors became over-saturated. This controller was integrated with the wall follow and yaw controllers, explained below, through a prioritization system.

Wall Follow Control

The chosen approach for the robot to cover the map was a spiral pattern, starting from a corner and spiraling inward. To accomplish this, a wall following control system using the side-mounted medium range IR sensor was combined with a yaw controller using the MPU. Once the robot had re-initialized after orienting itself in a corner, the yaw and wall follow control systems were enabled. The original yaw set point was 0, which held the robot straight; the wall follow set point was 50, which held the robot far enough from the wall that it was at no risk of hitting it. Collision detection used the permanently front-mounted long range IR sensor and the rotatable long range IR sensor; when these sensors detected something in front, the system could distinguish whether it was looking at a wall or an obstacle (explained later). With this ability, when the robot reached what it identified as a wall, the yaw set point would decrease by 90 degrees. After rotating a total of 270 degrees clockwise, the next time the robot identified a wall not only would the yaw set point adjust but the wall follow set point would also increase by one robot width.

Figure 4 - Spiral algorithm path

Figure 5 - Wall following control system diagram

Yaw Control

To ensure the robot moved in straight lines, a yaw control system was implemented. This proportional controller received a set point and a reference signal in the form of the MPU's current yaw value; together these provided the error signal, which was processed through a proportional gain that output the required yaw speed of the robot. Problems initially arose when the robot was travelling and correcting dynamically, because the motors at the lowest control level were sometimes sent a combination of yaw speed and x or y speed that, when added, was unachievable by the motor. This over-saturated the motor and therefore caused it to stop. To correct this, a speed factor was introduced at this level: when a motor was sent an invalid speed, it returned a scale factor that reduced the speed to an obtainable level. This factor was then applied to the other 3 motors to ensure the instructed overall motion was still achieved.

Figure 6 - Yaw control system diagram

Control System Integration

During the main execution of the robot through the map, a combination of the yaw, wall follow and trajectory controllers was required, as in the instance shown below.

Figure 7 - Control system integration depiction



When these controllers were first combined, problems arose because the control systems competed with each other. To prevent this, the following priority order was established:

1. Yaw Controller
2. Wall Follow Controller
3. Trajectory Controller

When the error of the yaw controller was greater than 10 degrees, it disabled the trajectory and wall follow control systems until the error was reduced sufficiently. When the yaw error was within a tolerable range but the wall follow controller had an error greater than 10, the trajectory controller was disabled until that error was reduced. The result of this integration was that the robot moved in a much more controlled fashion.
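As a sketch of this arbitration (the gains, the 10-unit tolerances and the helper names are assumptions, not the project's actual values):

    #include <math.h>

    // From the MotorController sketch above and a stand-in trajectory command.
    float setRobotSpeed(float xSpeed, float ySpeed, float yawSpeed);
    float forwardTrajectorySpeed();

    // Higher-priority controllers suppress lower ones until their error is small.
    void runControllers(float yawErrorDeg, float wallError) {
      const float kpYaw = 2.0f, kpWall = 0.5f;   // illustrative gains
      float yawSpeed = kpYaw * yawErrorDeg;
      float xSpeed = 0.0f, ySpeed = 0.0f;
      if (fabsf(yawErrorDeg) <= 10.0f) {         // 1. yaw within tolerance
        xSpeed = kpWall * wallError;             // 2. wall follow may act
        if (fabsf(wallError) <= 10.0f)
          ySpeed = forwardTrajectorySpeed();     // 3. trajectory lowest priority
      }
      setRobotSpeed(xSpeed, ySpeed, yawSpeed);
    }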

Position Control

Once the three controllers explained above were combined, another level of control was added to enable the robot to move to instructed positions. This functionality was achieved through an additional method in the direction controller which called the existing method combining the three other controllers. As shown in the control system diagram below, the method reported back to the state that called it either "achieving set point" or "set point achieved". This distinction let the calling logic know when to send the robot a new positional set point.

Figure 8 - Position control system diagram

Odometry

Above the motor control logic is the DirectionController class. This object takes in the desired X and Y speeds and the desired direction, and instructs MotorController on how to achieve the required trajectory, using the MPU as a sensor. In addition, the DirectionController class returns the current position each time it is called. The class estimates the current position by first using the equations obtained in the second part of testing to convert the motor RPM speeds into the current x and y velocity of the robot; this figure is then integrated over time to obtain the required positional information. Through tuning the equations with a correction factor, odometry within 2 cm was achieved. For this information to be useful to the map function, the position information needed to be converted to a global co-ordinate system rather than a set of axes fixed to the robot.

Figure 9 - Global Vs Local Co-ordinate system diagram

To achieve this, the direction control class was expanded to take in the current MPU reading; with this information the function could determine how the local x and y position changes mapped onto the global position vectors. To ensure the global co-ordinate axes were orientated with the map correctly, the MPU was re-initiated after the robot found a corner of the map, setting the heading to zero.
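This is the standard 2-D rotation; a minimal sketch of how the local position change could be mapped into the global frame, assuming the heading is supplied in radians:

    #include <math.h>

    // Rotate a robot-frame position change (dxLocal, dyLocal) into the global
    // frame using the MPU heading, and accumulate it into the global estimate.
    void localToGlobal(float dxLocal, float dyLocal, float headingRad,
                       float &xGlobal, float &yGlobal) {
      xGlobal += dxLocal * cosf(headingRad) - dyLocal * sinf(headingRad);
      yGlobal += dxLocal * sinf(headingRad) + dyLocal * cosf(headingRad);
    }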

To avoid limiting the possible x and y speeds of the robot unnecessarily, a speed factor was used to ensure none of the motors saturated (and therefore stopped) while still allowing an action to be performed at the maximum possible speed if desired. When the x, y and yaw speeds are passed to the MotorController class, they are converted into the required speed for each motor. Before converting an RPM value to PWM, the value is checked against the maximum RPM allowed by the motor (100 RPM). If the required speed of a motor is above this, a factor is calculated to scale that motor's value to 100. This factor is then applied to the other 3 motors to ensure the resulting motion is what was instructed, and is passed back out of the MotorController class so the DirectionController class knows the actual x, y and yaw speeds achieved, keeping its position estimate accurate.

As shown in the motor characteristic equations above, each motor featured a dead band of around 25 RPM. To prevent creep from this source of error, the position estimating module ignored any speeds below 25 RPM, treating them as zero.



After the raw tuning of the odometry, results such as that shown below were achieved, where a moderate amount of creep occurred and accumulated as the robot ran. To cure this problem and improve the odometry, a complementary filter using the IR sensors was added to the position estimation. Every time the direction control class updated the robot's position in a particular direction, the robot checked the relevant IR sensor. If this sensor was in range and sampled a valid position for both the current and previous iteration, the difference in the sensor reading was combined via the complementary filter with the integrated linear speed.

Figure 10 - Odometry without complementary filter

The addition of the complementary filter eliminated the creep in the odometry as the robot ran in straight lines along a wall. Below are the results the robot provided from a full run of the map without obstacles. The final problem identified was creep occurring at the corners as the robot rotated to re-align with a wall and continue wall following. This error is shown in the graph below as a shift in y at the corners, when in reality the robot was rotating on the spot. To eliminate the error, the controller prioritization discussed in the control system integration section was linked to the position estimation module. This gave the position estimation method definitive knowledge of the robot's current motion and consequently eliminated the problem. The filter equation is:

PosChange = ( ∫ from t−1 to t of V(t) dt ) × VelocityWeighting + ( IR(t) − IR(t−1) ) × ( 1 − VelocityWeighting )

where VelocityWeighting < 1.
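In discrete form, one axis of the update could look like the following sketch; the 0.8 weighting and the validity handling are assumptions:

    // Blend integrated wheel speed with the change in the relevant IR reading
    // when both the current and previous IR samples were valid.
    float updatePosition(float pos, float velocity, float dtSeconds,
                         float irNow, float irPrev, bool irValid) {
      const float velocityWeighting = 0.8f;    // assumed; must be < 1
      float odoChange = velocity * dtSeconds;  // discrete form of the integral
      if (!irValid)
        return pos + odoChange;                // fall back to pure odometry
      return pos + velocityWeighting * odoChange
                 + (1.0f - velocityWeighting) * (irNow - irPrev);
    }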



Figure 11 - Odometry without corner creep adjustment

The final odometry is shown below, with the robot completing the map including the avoidance of an obstacle. The small amount of creep visible on the final loop was due to the MPU's gyro drifting; this accumulated, and by the time the robot was in the centre of the map its motion was slightly offset.

Figure 12 - Odometry with complementary filter and creep adjustment

Collision Detection

The approach to collision detection used the stepper motor located at the centre of the robot, which had a sonar and an IR sensor mounted on it. During execution of the main running state, a function inside the CollisionDetection class called handleCollision was called each cycle to check for anything in the robot's immediate path. If something was detected, the robot's FSM transitioned into a state called collision detected. Once in this state, another function inside the CollisionDetection class was called to determine whether the blockage in the robot's path was a wall or an obstacle. To make this distinction, the function reviewed the data from the last sonar and IR scan with the stepper motor: if the distances in the array showed uniform change, i.e. there were no large jumps in distance values, the function deduced that it was looking at a wall. If the blockage was instead deemed to be an obstacle and the robot was in the mapping/running mode (as opposed to initialising), the robot would call a function from the CollisionDetection class to determine how to avoid the object. The strategy to make this determination required 3 key pieces of information, extracted from the data array received from the sweep of the sonar and IR sensor.

Blockage Identification

Once a blockage had been detected through the constant scanning of the stepper-mounted infrared sensor, it needed to be determined whether that blockage was an obstacle or a wall. To accomplish this, a function called IdentifyBlockage was called whenever a blockage was detected. This function performed a scan facing directly forward and at ±15 steps (±27°) from forward.

If the difference between either the +27° and 0° distances or the −27° and 0° distances was greater than the obstacle tolerance (defined as 100 mm), then the scanned blockage must be an obstacle. If any of the distances was out of range of the infrared sensor (>1000 mm), then the blockage must also be an obstacle, as a blockage would not have been detected more than 1000 mm away from a wall.
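As a sketch of that test (distances in mm; names are illustrative, and a real implementation would also handle the sensor class reporting out-of-range readings as 0 mm):

    #include <math.h>

    // Three-point test from IdentifyBlockage: large disagreement between the
    // ±27° readings and the centre reading, or any out-of-range reading,
    // means the blockage is an obstacle rather than a wall.
    bool isObstacle(float dMinus27, float dCentre, float dPlus27) {
      const float obstacleTolerance = 100.0f;  // mm
      const float maxRange = 1000.0f;          // mm, IR sensor limit
      if (dMinus27 > maxRange || dCentre > maxRange || dPlus27 > maxRange)
        return true;   // a wall this far away would not have been flagged
      return fabsf(dPlus27 - dCentre) > obstacleTolerance ||
             fabsf(dMinus27 - dCentre) > obstacleTolerance;
    }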

Obstacle Detection

Scanning

Once the three-point scan has determined that the blockage is indeed an obstacle, the sensor class is called and the stepper motor performs a ±22 step (±40°) scan with the infrared sensor mounted on top. An example output from such a scan is shown as a radar plot in Figure 13 below. Distances are represented on the radial axis (in mm) and scan angle on the angular axis.

Figure 13 - Obstacle Scan


The start and end positions of the obstacle (StartStep and EndStep) were determined by searching the data for the array indexes where the positive and negative jumps in distance occur.

There were initially issues when the walls behind the obstacle were out of range of the IR sensors. This was fixed by counting a start point as either a negative jump in distance or a jump from zero to a positive value; the last start point detected was then taken as the start point of the obstacle. End points were counted as either positive jumps in distance or a jump from a positive value to zero; the first end point detected was used, to ensure it was the actual end point of the obstacle. These indexes were then saved and used later in the calculations along with the distance values obtained.
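A sketch of that search; the jump threshold and the array handling are assumptions:

    // Find obstacle edges in the scan array: the LAST start candidate
    // (negative jump, or 0 -> positive) and the FIRST end candidate
    // (positive jump, or positive -> 0). Returns -1 where nothing is found.
    void findObstacleEdges(const int dist[], int n,
                           int &startStep, int &endStep) {
      const int jump = 100;  // mm, assumed threshold for a "large" change
      startStep = -1;
      endStep = -1;
      for (int i = 1; i < n; i++) {
        bool negJump  = dist[i] + jump < dist[i - 1];
        bool fromZero = (dist[i - 1] == 0) && (dist[i] > 0);
        if (negJump || fromZero) startStep = i;              // keep the last
        bool posJump = dist[i] > dist[i - 1] + jump;
        bool toZero  = (dist[i - 1] > 0) && (dist[i] == 0);
        if ((posJump || toZero) && endStep < 0) endStep = i; // keep the first
      }
    }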

Geometry calculations

With the array of scan data available and the start and end positions of the obstacle recorded, the next step was to calculate the diameter of the obstacle. StartDist was obtained as the distance value at array index [StartStep], and with this information the obstacle diameter was calculated, as demonstrated in Figure 14 and the equations below.

Figure 14: Obstacle diameter diagram

θ = (StartStep − EndStep) × π/100

D = 2 × StartDist × sin(θ/2)


Once the diameter (D) of the obstacle had been calculated, it could be used along with the offset angle (Φ) to calculate the X and Y offsets of the obstacle in the local co-ordinate system (centred on the robot). This is represented graphically in Figure 15 below.

Figure 15: Local X and Y position offset diagram

An array of distance values was passed back by the sensor class from the stepper-mounted IR sensor. These values represented the distance from the centre of the robot to the nearest blockage in millimetres, with readings taken at intervals of 1.8° (1 step of the stepper motor). The distance to the closest edge of the obstacle was found by iteratively searching the array of distance values and recording the minimum value. Once this was determined, the distance to the centre of the obstacle could be calculated as shown below.

CentreDist = minValue + D/2

The angular offset of the obstacle relative to the robot (Φ) was required in radians. It was calculated from the start/end positions of the obstacle and the size of the scan array through the following equations.

OffsetStep = (StartStep + EndStep)/2 − ArraySize/2

Φ (rad) = OffsetStep × π/100


Once the angle of offset and the distance to the centre of the obstacle were determined, they could be converted from polar to Cartesian co-ordinates using basic trigonometry, as shown in Figure 15.

LocalXOffset = CentreDist × sin(Φ)
LocalYOffset = CentreDist × cos(Φ)
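Pulled together, the geometry amounts to the following (π/100 rad is one 1.8° step; the struct and names are illustrative, and the absolute value on the angular width is added for robustness):

    #include <math.h>

    struct ObstacleInfo { float diameter, centreDist, phi, xOffset, yOffset; };

    // Solve the obstacle geometry from the scan: angular width -> diameter,
    // nearest edge + radius -> centre distance, mean step -> offset angle,
    // then polar -> Cartesian for the local offsets.
    ObstacleInfo solveObstacle(int startStep, int endStep, float startDist,
                               float minDist, int arraySize) {
      ObstacleInfo o;
      float theta = fabsf((float)(startStep - endStep)) * (float)M_PI / 100.0f;
      o.diameter   = 2.0f * startDist * sinf(theta / 2.0f);
      o.centreDist = minDist + o.diameter / 2.0f;
      float offsetStep = (startStep + endStep) / 2.0f - arraySize / 2.0f;
      o.phi     = offsetStep * (float)M_PI / 100.0f;
      o.xOffset = o.centreDist * sinf(o.phi);
      o.yOffset = o.centreDist * cosf(o.phi);
      return o;
    }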

Obstacle Avoidance

X Movement

If the robot was to the right of the obstacle and there was sufficient distance between the wall and the obstacle for the robot to fit through, the robot would move right around the obstacle. These two conditions are represented by the equations below.

LocalXOffset < 0

WallDist > LocalXOffset + W + D + 2 × Clearance

Movement right around the obstacle was accomplished by setting the local X movement as below.

LocalXMove = LocalXOffset + W/2 + D/2 + Clearance

If the robot was to the left of the obstacle, or the obstacle was too close to the wall for the robot to fit, the robot would move left. This was accomplished as follows.

LocalXMove = LocalXOffset − W/2 − D/2 − Clearance



Y Movement

The Y movement was broken into two stages. First, the robot had to move until it was close to the obstacle (as scanning did not take place right in front of the obstacle). This was done as follows.

LocalYMove = LocalYOffset − L/2 − D/2 − Clearance

Then, once the robot had moved to the side of the obstacle, it had to move forward far enough that the rear of the robot cleared the obstacle before re-initiating wall-following. This was done as follows.

LocalYMoveTwo = LocalYOffset + L/2 + D/2 + Clearance
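Combined, the X and Y movement rules look like this sketch; the robot width W, length L and the clearance value are assumptions:

    // Decide the sidestep and the two forward moves; all distances in mm.
    void planAvoidance(float xOff, float yOff, float D, float wallDist,
                       float &localXMove, float &localYMove,
                       float &localYMoveTwo) {
      const float W = 300.0f, L = 300.0f, clearance = 50.0f;  // assumed sizes
      bool fitsRight = (xOff < 0.0f) &&
                       (wallDist > xOff + W + D + 2.0f * clearance);
      if (fitsRight)
        localXMove = xOff + W / 2.0f + D / 2.0f + clearance;  // go right
      else
        localXMove = xOff - W / 2.0f - D / 2.0f - clearance;  // go left
      localYMove    = yOff - L / 2.0f - D / 2.0f - clearance; // approach
      localYMoveTwo = yOff + L / 2.0f + D / 2.0f + clearance; // clear the rear
    }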

Wall obstacle case

If a start position is detected for the obstacle but no end position, the obstacle must be against a wall. In this case a slightly different approach is taken for robot movement.

The obstacle is assumed to be directly in front of the robot and, since no end position can be obtained, all calculations are based on the centre position and the start position. An offset angle of zero is assumed and everything is calculated as such. The other difference is that the robot always moves left around the obstacle, and its LocalXOffset is assumed to be zero.

Movement arrays

Once the x and y movements had been determined, the final step was to place them in arrays defining the exact robot path. Two arrays were created, with 3 cells each, representing the changes in local X and local Y co-ordinates respectively. The path of the robot and the population of the arrays are best represented graphically in Figure 16 below.


Figure 16: Local X and Y obstacle avoidance path
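Based on the segment labels in Figure 16, the two arrays could be populated like this sketch:

    // Three-segment avoidance path: drive up to the obstacle, sidestep,
    // then drive the remaining distance so the rear clears the obstacle.
    void fillMovementArrays(float localXMove, float localYMove,
                            float localYMoveTwo,
                            float xMoves[3], float yMoves[3]) {
      xMoves[0] = 0.0f;        yMoves[0] = localYMove;                 // approach
      xMoves[1] = localXMove;  yMoves[1] = 0.0f;                       // sidestep
      xMoves[2] = 0.0f;        yMoves[2] = localYMoveTwo - localYMove; // pass
    }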



Finite State Machine

The highest level of intelligence for the robot is the FSM, which determines the robot's different modes of operation. The FSM is defined in the main sketch of the program, and each state calls functions from the set of classes required to run the robot.

Figure 17 - FSM Diagram

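A skeleton of the FSM loop could look like the following; the handler names are placeholders for the class calls described in the states below, and setup() is omitted:

    enum State { INITIALIZING, START, RUNNING, COLLISION_DETECTED,
                 AVOID_COLLISION, STOP };
    State state = INITIALIZING;

    // Placeholder handlers; each returns the next state for the FSM.
    State runInitializing();
    State runStart();
    State runRunning();
    State runCollisionDetected();
    State runAvoidCollision();
    State runStop();

    void loop() {
      switch (state) {
        case INITIALIZING:       state = runInitializing();      break;
        case START:              state = runStart();             break;
        case RUNNING:            state = runRunning();           break;
        case COLLISION_DETECTED: state = runCollisionDetected(); break;
        case AVOID_COLLISION:    state = runAvoidCollision();    break;
        case STOP:               state = runStop();              break;
      }
    }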


Initializing

This state runs the initialization code for the robot on first run; this is where the setup functions for the lower level classes are called.

Start

This is the first state called after Initializing that instructs the robot to move. Its purpose is to home the robot to a corner by finding a wall, becoming parallel to it, and then moving along the wall until a corner is found. Because there are two phases to the start state in terms of the robot's orientation (finding a wall and finding a corner), if the state exits to handle collision detection it must know how far it got when it re-enters. For example, if the robot became parallel to a wall, detected an obstacle and consequently transitioned to the collision detection state, then once the collision had been avoided and the robot transitioned back to the start state, the state would need to skip halfway through, i.e. not instruct the robot to find a wall but instead find a corner.

When the robot first starts up there are 3 basic steps it completes, with obstacle detection and avoidance running throughout. During the start state, when a blockage is identified as an obstacle, the robot rotates 20° and rescans; this is done continuously until an orientation is found in which no obstacles block the wall.

First, it drives straight to the closest wall at a moderate speed. It continues doing this until it is within a set distance of that wall, deemed a suitable distance for scanning and judging perpendicular angles.

The final stage of movement is to drive forward until a corner is reached. Once the robot is within an acceptable distance of the corner it rotates 90° counter-clockwise once more, placing it in the back right corner of the room. The robot can now begin coverage of the room. The 4 stages of initial positioning are represented graphically in Figure 18 below.


Figure 18: 4 stages of initial positioning in the start state



Running

This state is the main portion of the code; it runs the robot through the spiral wall following algorithm. Each run of this state updates the robot's position, checks for a collision, and handles any changes in direction based on the collision information. If the collision detection algorithm identifyBlockage reports that it has found a wall, the running state runs the code that handles the wall following behavior. If identifyBlockage reports that it has found an obstacle, the running state exits and returns to the FSM to transition to the collision detected state.

Collision Detected

This state is called when identifyBlockage, called from either the start state or the running state, has indicated that an obstacle has been detected. The purpose of this state is to run the code needed to calculate how to avoid the object. Its output is an array of absolute positions the robot must travel through to avoid the obstacle. This state returns to the FSM to transition to the avoid collision state.

Avoid Collision

This state is called from collision detected; it runs the robot through the array of positions that avoid the obstacle. Once finished, the FSM transitions back to the state where the collision was originally detected, i.e. either the running or start state.

Stop

Once the main running state determines that the map is complete, it transitions the FSM to this state. The purpose of the stop state is to disable the control systems and therefore stop the robot.

Code Structure

The robot is run through multiple classes, each designed for a specific task. All the classes belong to a hierarchy, with the top level controllers declared in the main sketch. Each top level controller may contain other minor classes to assist with more specific tasks. A brief rundown of these controllers is given below, along with how they operate in conjunction with each other.

DirectionController

  o This controller is used to move the robot to set points, control the speed of the robot, correct the yaw orientation, and control the wall following algorithm. This class receives instructions based on the sensor data, and contains the MotorController class.

SENSORS

  o This class handles taking measurements from the two IR sensors and the MPU, and moving the stepper motor to perform sweeping scans. It populates a global array with scan data to be used by other classes.

MapMaker

  o This class handles the construction of, and addition of data to, a map of the robot's surroundings in memory. It reads scan data from the SENSORS class and position information from the DirectionController class.

CollisionDetection

  o This class handles all the logic surrounding the identification of any blockage in the robot's path (wall or obstacle), and calculates the path required to avoid obstacles. It uses the SENSORS scan functions.

Motor

  o There is an instance of this class for each of the 4 motors on the robot. Each instance contains an equation that represents the motor's PWM to RPM relationship. When a motor is instructed to move, it is sent an RPM speed which is then converted to the required PWM.

MotorController

  o This class takes X, Y and Yaw speeds from the DirectionController class and converts them into the speeds required from each motor to achieve the desired motion. The class also instantiates the four instances of the Motor class, and therefore knows the characteristics of each motor.

Figure 19 - Class Diagram


Sensors

MPU

The 9-axis motion processing unit (MPU) provides accurate information about the yaw heading of the robot, enabling the direction controller to control the heading using a closed loop controller. The MPU used is the InvenSense MPU-9150, which communicates with the Arduino MEGA over the I2C bus.

The MPU contains a 3-axis gyro for sensing the robot's angular rate of change, a 3-axis accelerometer which detects acceleration, and a 3-axis magnetometer which detects the strength of the surrounding magnetic field. The MPU also contains an on-board digital motion processor (DMP) capable of performing advanced data fusion that would usually be completed on the microprocessor connected to the MPU. Offloading the data fusion to the DMP increases the operating speed of the Arduino MEGA, as it is not slowed down by complex algorithms.

To program the MPU, a set of internal registers must be set on each start-up of the robot, and a further set of data has to be written to the MPU to enable the DMP functionality. To help with this setup routine, a library created by Jeff Rowberg [1] was used; this library uses data from the InvenSense MPU-6050 register map [2], an almost identical MPU that lacks the magnetometer.

The DMP data is accessed over the I2C bus through a set of registers forming a First-In-First-Out (FIFO) queue, which contains individual sensor readings and DMP quaternion data. The MPU has an interrupt pin which can inform the Arduino MEGA that the FIFO queue has been filled, i.e. new data is available. When this interrupt was used, problems were encountered with it adversely affecting portions of code that should not be interrupted. To get around this, the interrupt pin is not utilised; instead the FIFO queue is constantly polled to check whether new data is available. A problem with polling the FIFO queue is that if it is not polled at a high enough rate the queue can overflow, which causes the MPU to output errors and stop running. To ensure the queue is read often enough, the update rate of the MPU's DMP was decreased to 50 Hz and the queue is read every loop of the main FSM loop.

The DMP's quaternion data requires re-orientating to a more usable format for the robot; this was achieved by converting the quaternion to roll-pitch-yaw rotations using functions built into the MPU library.
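A minimal polling sketch using Jeff Rowberg's I2Cdevlib; the overflow handling follows the library's examples, and initialisation is omitted:

    #include "I2Cdev.h"
    #include "MPU6050_6Axis_MotionApps20.h"

    MPU6050 mpu;
    uint8_t fifoBuffer[64];

    // Poll the DMP FIFO instead of using the interrupt pin; returns true and
    // writes the yaw (degrees) when a new packet was available.
    bool readYaw(float &yawDeg) {
      uint16_t packetSize = mpu.dmpGetPacketSize();
      uint16_t fifoCount  = mpu.getFIFOCount();
      if (fifoCount == 1024) { mpu.resetFIFO(); return false; }  // overflow
      if (fifoCount < packetSize) return false;                  // nothing new
      mpu.getFIFOBytes(fifoBuffer, packetSize);
      Quaternion q;
      VectorFloat gravity;
      float ypr[3];
      mpu.dmpGetQuaternion(&q, fifoBuffer);       // raw DMP quaternion
      mpu.dmpGetGravity(&gravity, &q);            // gravity from quaternion
      mpu.dmpGetYawPitchRoll(ypr, &q, &gravity);  // re-orient to roll-pitch-yaw
      yawDeg = ypr[0] * 180.0f / M_PI;
      return true;
    }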

The re-orientated DMP data only fuses the gyro and accelerometer data and does not include the magnetometer data. On the Arduino MEGA, the yaw orientation alone was therefore fused with the magnetometer heading using a complementary filter, which through experimentation was found to require a 0.7 weighting on the magnetometer heading and thus a 0.3 weighting on the DMP data. The equation used is shown below:

YawFiltered = 0.3 × YawDMP + 0.7 × HeadingMag


The filtered yaw now showed absolute heading relative to north, but it was noted that when motors operated near the MPU the heading became unreadable; this was believed to be due to electromagnetic interference (EMI). Mounting the MPU at different locations was investigated to minimise the EMI, but it was decided that the risk associated with fusing the magnetometer data was not worth the slight increase in performance.

The MPU's DMP requires axis offsets to be calculated for the gyro and accelerometer; these offsets allow a much faster initialisation of the MPU, as it has a starting point for the calibration procedure. The offsets were calculated using Jeff Rowberg's MPU6050_calibration.ino sketch, which waits for the MPU's values to stabilise and returns the required offsets. It was found that every time the MPU was remounted these offsets required recalculating, as the MPU's orientation changed very slightly. The final offsets used are shown below:

XGyroOffset = 109
YGyroOffset = −9
ZGyroOffset = −8
XAccelOffset = −5363
YAccelOffset = −788
ZAccelOffset = 1058

These offsets improved the DMP's initialisation time from approximately 8 seconds to less than 1 second and completely eliminated yaw drift.

To get the most accurate data, the MPU was located as close as possible to the centre of rotation; this minimises rotation-induced accelerations (centripetal and tangential) that are perceived by the accelerometer, as described by Wei, Khosla & Riviere, 2003 [3]. To further reduce any drift in the yaw output from the DMP, the MPU was soft-mounted to the robot using double sided foam tape, which acts as a high frequency vibration damper (a low pass filter for the accelerometer) while still mounting the MPU securely.


Infrared Proximity Sensor

The supplied Sharp infrared proximity sensors, the GP2Y0A21 and GP2Y0A02, both take approximately 40 ms to measure distance and require a 5 V supply with a 10 µF bypass capacitor across Vcc and GND to ensure sufficient supply current [4].

Using a power supply and oscilloscope, the Vcc line was probed to check the noise before and after the addition of a bypass capacitor. It was found that the 5 V supply dropped by as much as 30%, which would adversely affect the analog output; hence a 10 µF ceramic surface mount capacitor was soldered directly between Vcc and GND on the sensor. Mounting the capacitor directly on the sensor helps to remove losses in the wires from the power supply and creates a smaller package that does not require an external interface board before connecting to the Arduino MEGA.

The supplied datasheet [4] describes that the sensor performs best, for movement in the plane perpendicular to the sensing direction, in the vertical orientation depicted in Figure 21.

To make the best use of the analog output of the IR sensors, the analog reference of the Arduino MEGA was set to 3.3 V, since the output from the IR sensors ranged from 0 V to 3 V; this meant the full resolution of the analog to digital converter (ADC) was used.

To linearise the IR sensors, a plot of 1/analog voltage vs. distance was created over each sensor's range, because the relationship described in the sensor datasheet is an inverse one [4]. To gather this data, an average of 100 readings taken every 50 ms at each distance was recorded with an Arduino UNO communicating over the serial port; using an Arduino UNO allowed uninterrupted use of the robot while others focused on sensor calibration.

The plots in Figure 22 and Figure 23 have had a linear curve fit applied to find the equation of the line for the long range (200-1500 mm) sensor and the short range (100-800 mm) sensor respectively. Both best fit lines have very good R² values of around 0.992 and 0.998 respectively.

Figure 20 - SMD Decoupling Capacitor

Figure 21 - Correct Sensor Orientation


The output from the IR sensors was jittery and required filtering to smooth it. This was accomplished with a rolling average of 15 consecutive values; the average was only taken if the 15 values fell within 98% of each other. This helped create distance outputs accurate to within 5 mm.

Due to the highly non-linear output of the IR sensors, when the distance to be measured was outside the range of the sensors they would report a range just inside either end of the measurable distances. To mitigate this, the effective output of the sensors was restricted to 100-600 mm for the short range sensor and 200-1000 mm for the long range sensor; any distance outside these ranges is reported as 0 mm. The sensors were also positioned such that the minimum range of each sensor falls within the robot's footprint, so no wall or obstacle can get within this minimum range.
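The read pipeline could look like the following sketch for the long range sensor, using the Figure 22 fit; treating the fit's x-axis as the reciprocal of the raw ADC reading is an assumption, as are the pin and buffer details:

    #include <Arduino.h>

    // Linearise, roll a 15-sample average accepted only when the samples
    // agree within 2%, and clamp to the trusted 200-1000 mm window.
    float readLongRangeIr(int pin) {
      static float buf[15];          // starts zeroed, so early reads reject
      static int idx = 0;
      int raw = analogRead(pin);     // 3.3 V external analog reference assumed
      if (raw <= 0) return 0.0f;
      float mm = 113462.0f / raw + 34.226f;  // y = 113462x + 34.226, x = 1/raw
      buf[idx] = mm;
      idx = (idx + 1) % 15;
      float lo = buf[0], hi = buf[0], sum = 0.0f;
      for (int i = 0; i < 15; i++) {
        lo = min(lo, buf[i]);
        hi = max(hi, buf[i]);
        sum += buf[i];
      }
      if (lo < 0.98f * hi) return 0.0f;      // samples disagree: reject reading
      float avg = sum / 15.0f;
      return (avg >= 200.0f && avg <= 1000.0f) ? avg : 0.0f;  // out of range
    }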

Sonar

A large portion of time was spent trying to get consistent and repeatable distance values from the sonar range finder. The analog, pulse width and serial outputs were all tried, but even with filtering methods like those used for the IR sensors a reliable output could not be achieved; considering there were four IR sensors available on the robot, the sonar range finder was abandoned in favor of sweeping an IR sensor back and forth to cover the same area as the sonar does.

Figure 22 - Long Range IR Calibration (GP2Y0A02): linear fit y = 113462x + 34.226, R² = 0.9924

Figure 23 - Short Range IR Calibration (GP2Y0A21): linear fit y = 324.31x − 40.404, R² = 0.9981



Scanning

To rotate the forward-facing IR sensor, a stepper motor was used. A stepper motor has several advantages over the supplied RC servo motor, including precise and repeatable steps of 1.8° and the ability to rotate a full 360°. To initialise the stepper motor to a known position, a photo interrupter is used as a switch to detect when the stepper faces forward. The circuit diagram for the photo interrupter is shown in Figure 25 [5].

To drive the stepper motor, a simple dual H-bridge bi-polar stepper driver was used. The two H-bridges are controlled using four of the Arduino MEGA's digital pins and the AccelStepper library provided by Mike McCauley under a GPL V2 license agreement. This library made it simple to command the stepper to move to a set position.
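Homing against the photo interrupter and commanding a scan position with AccelStepper could look like this; the pin numbers and the interrupter polarity are assumptions:

    #include <AccelStepper.h>

    AccelStepper stepper(AccelStepper::FULL4WIRE, 8, 9, 10, 11);
    const int interrupterPin = 2;

    // Creep one step at a time until the photo interrupter fires, then
    // define that forward-facing position as step 0.
    void homeStepper() {
      stepper.setMaxSpeed(200.0f);
      stepper.setAcceleration(400.0f);
      pinMode(interrupterPin, INPUT);
      while (digitalRead(interrupterPin) == HIGH) {  // assumed active-low
        stepper.move(1);
        stepper.runToPosition();  // blocks until the single step completes
      }
      stepper.setCurrentPosition(0);
    }

    // Move the scanning head to an absolute step, e.g. ±22 for the ±40° sweep.
    void scanTo(long step) {
      stepper.moveTo(step);
      stepper.runToPosition();
    }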

Sensor Class

A sensor class was created that contains functions for accessing each sensor's data via pointers. It has the following functions: start(), which sets up all the sensors; irRead(), which updates an array with IR range finder readings; frontScan(), which scans between set angles in front of the robot and populates a scan array with an IR distance value every 1.8°; checkFront(), which sweeps back and forth while driving to detect any obstacles in front of the robot as it moves forward; and scanThreePoints(), which returns the distance at three equally spaced angles to distinguish between a wall and an obstacle.

Figure 24 - Scanning Head Photo Interrupter

Figure 25 - Photo Interrupter Circuit


3D Design

The mechanical design of the robot went through an iterative process in which parts were incrementally improved until the desired functionality was reached. The main manufacturing method used was 3D printing, for mounting all the sensors to the supplied robot chassis.

To mount the stepper motor to the robot, a mount was designed that could also hold an IR range sensor on each side. This part went through three major revisions, depicted in Figure 26 and Figure 27 below.

The scanning range finder mount went through four major revisions and ended up having a shroud to improve the beam size of the sensor. The sensor ended up mounted horizontally so that it would not overshoot the wall of the arena; although this is not the ideal orientation, it did not appear to affect the performance of the sensor.

The wheel hubs that were provided had excessive play, meaning that during direction changes the robot could jerk slightly in the wrong direction, leading to errors in the odometry. To remedy this, new hubs were printed that were a press fit on the robot.

Shown below in Figure 30 are the finalised mountings attached to the robot; only one long range and one short range IR range finder, for scanning and wall following respectively, were required along with the MPU.

Figure 26 - Stepper Mount Design Revisions

Figure 27 - Stepper Mount CAD

Figure 28 - Scanning Head Design Iterations


Mapping

Overview

The mapping algorithm is controlled by a class object called MapMaker. This object refers to the global data containers for the current robot position, the current IR readings and the current yaw heading, and uses this data to construct a map of the robot's surroundings. Since the odometry of the motor control has been well tuned, we found the position data sufficiently accurate to give an accurate map of the surroundings.

The map is able to distinguish between walls, empty space and obstacles, and displays them as an ASCII-art drawing in the serial console when the 'p' character is entered.

The map is drawn by building an intensity grid of measured points. If a point on the map is recorded more than once, its intensity is increased, and it is therefore considered more reliable and more likely to be a true reading. This approach allows us to eliminate false readings due to sensor noise or other sporadic measurements.

Most of the mapping testing was done in Visual Studio, since this class only outputs to the console and the IR and position inputs are simple enough to simulate. When the code was ported to the Arduino, a few small changes were required, such as the console print commands and adjustments to threshold values. For the most part, however, the simulation in Visual Studio provided an accurate prediction of the mapping behavior on the robot.

Algorithm

The mapping class is initialized in the setup() function in the main Arduino sketch. This is done by calling the initialise() function and providing references to the current position coordinates, current IR readings and current heading.
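
A minimal sketch of this wiring-up follows; the exact signature of initialise() and the names of the global containers are assumed here.

// Globals shared with the control code (names assumed for this sketch).
float posX, posY;     // current odometry position
int irReadings[2];    // latest IR distances
float yawHeading;     // current yaw from the MPU

MapMaker mapper;

void setup() {
  // The mapper stores these references and reads the live values each time
  // addLaserReadings() is called.
  mapper.initialise(&posX, &posY, irReadings, &yawHeading);
}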

Figure 29 Final CAD Design
Figure 30 Final Robot Hardware Design


While the program is running, the function addLaserReadings() is called at regular intervals. When this is called, the mapper copies the data from the global arrays so that the originals are not affected. These values are then filtered using getFilteredData(). This function applies a number of filters to the copied data, including applying the origin offset due to the robot's start position, coercing data that noise has pushed outside the map's allowable range, and properly aligning the yaw reading.
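
The fragment below sketches these three filtering steps under an assumed map size and origin offset; constrain() is the standard Arduino helper.

const int MAP_SIZE = 100;                 // assumed grid dimension (cells)
const float ORIGIN_X = 10, ORIGIN_Y = 10; // assumed start-position offset

void getFilteredData(float &x, float &y, float &yaw) {
  // Shift from robot-start coordinates into map coordinates.
  x += ORIGIN_X;
  y += ORIGIN_Y;
  // Coerce points that noise has pushed outside the allowable map range.
  x = constrain(x, 0, MAP_SIZE - 1);
  y = constrain(y, 0, MAP_SIZE - 1);
  // Wrap the yaw reading into [0, 360) so headings compare cleanly.
  while (yaw < 0)    yaw += 360;
  while (yaw >= 360) yaw -= 360;
}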

After the data has been filtered, the IR distances are added to the current position values to give an obstacle point in the map coordinate system. The yaw orientation is used to determine whether the IR distances should be added to or subtracted from the position coordinates. These points are stored in a 2D intensity array, and the value at the corresponding cell is incremented, up to a maximum of 254.
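
The following sketch shows how one reading could become a map cell under these rules. Since the robot stays orthogonal while driving, yaw is used here only to choose the axis and sign of the IR offset; the grid size, quadrant boundaries and function name are assumptions.

const int GRID = 100;                 // assumed map dimension (cells)
uint8_t intensity[GRID][GRID];        // 255 is reserved for the robot's path

void addReading(float x, float y, float irDist, float yaw) {
  float px = x, py = y;
  // Yaw picks the axis and sign of the IR offset (boundaries assumed).
  if      (yaw < 45 || yaw >= 315) py += irDist;   // facing +y
  else if (yaw < 135)              px += irDist;   // facing +x
  else if (yaw < 225)              py -= irDist;   // facing -y
  else                             px -= irDist;   // facing -x

  int cx = (int)px, cy = (int)py;
  if (cx < 0 || cx >= GRID || cy < 0 || cy >= GRID) return;   // discard out-of-map points
  if (intensity[cy][cx] < 254) intensity[cy][cx]++;           // cap below the path marker
}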

The map uses the value 255 as a special case to indicate the path that the robot has traversed. This way the map can show the positions of walls and obstacles while also showing the path the robot has taken around them.

The map can be printed at any time by pressing the 'p' key. When this key is pressed, the mapper runs the printDisplayMap() function. This function cycles through each map cell and compares its intensity to a display threshold, calculated as a function of the highest intensity recorded. If the intensity is higher than the threshold, the map prints a wall character in that cell; otherwise the point is not considered reliable enough to be a wall. The print function also checks for the values indicating the robot's path and prints these where appropriate.
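
As a sketch of this routine, reusing the intensity grid from the previous sketch (the threshold fraction and display characters are assumptions):

void printDisplayMap() {
  // Find the highest intensity recorded, ignoring the 255 path marker.
  uint8_t maxI = 0;
  for (int y = 0; y < GRID; y++)
    for (int x = 0; x < GRID; x++)
      if (intensity[y][x] != 255 && intensity[y][x] > maxI) maxI = intensity[y][x];

  uint8_t threshold = maxI / 4;   // display threshold as a fraction of the peak

  for (int y = GRID - 1; y >= 0; y--) {      // print the top row of the map first
    for (int x = 0; x < GRID; x++) {
      if      (intensity[y][x] == 255)      Serial.print('*');  // robot path
      else if (intensity[y][x] > threshold) Serial.print('#');  // wall or obstacle
      else                                  Serial.print(' ');  // not reliable enough
    }
    Serial.println();
  }
}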


Conclusions

This project has been very successful and a great learning experience. Unfortunately, on the day of assessment there were issues with the stepper-motor-mounted long-range infrared sensor, which caused non-ideal performance in our final demonstration.

However, this problem was fixed when the IR sensor was replaced on the afternoon after the demonstration, and proof of satisfactory performance was obtained both before and after the demonstration. A video from the morning of Thursday 4th July (the day before the demonstration) shows the robot traversing the entire course with only one minor collision, and after some minor code fixes even this issue was resolved. A computer vision map obtained on the evening of Friday 5th July (after the demonstration) then shows successful navigation of the room with no obstacle collisions. This computer vision map can be seen in Figure 31, proving successful operation of the SLAM robot.

One thing that could have been improved in this project is the verification of correctly functioning hardware, which would have prevented the poor performance in the demonstration and ensured more reliable operation of the robot. Another aspect we would do differently next time is to enforce a consistent coding standard between group members: one difficult thing about working in a medium-to-large group is understanding each other's code and ensuring that the inputs and outputs of the various modules match.

Aspects of this project that worked well include the minimization of sensor usage: our robot required only two IR sensors and an MPU, keeping production cheap and leaving fewer hardware components to rely on. The robot's hardware was also well contained, giving a sleek outward appearance and protecting important wires and sensors. Another strength was the modular style of the code, which allows it to be easily modified for future robots when requirements change, and which allowed easy debugging in Microsoft Visual Studio during development. The final part of the project that worked particularly well was orthogonality: the robot could easily align itself parallel to a wall and stay orthogonal at all times, ensuring accurate movement and complete coverage of the map.

Overall, this project has gone very well, and the final performance of our robot was strong once the hardware issues were solved. There are many potential applications for this robot, and it could easily be transferred to other tasks with small changes.



Figure 31 Final Robot Path When Operational


References

[1] Jeff Rowberg, i2cdevlib, http://www.i2cdevlib.com/, 2010.

[2] InvenSense, MPU-6000 Datasheet, http://www.invensense.com/wp-content/uploads/2015/02/MPU-6000-Datasheet1.pdf.

[3] Wei, Khosla & Riviere, 2003, http://ieeexplore.ieee.org/xpls/icp.jsp?arnumber=1241852.

[4] Sharp Electronics, GP2Y0A21YK IR Datasheet, https://www.sparkfun.com/datasheets/Components/GP2Y0A21YK.pdf.

[5] Martyn Currey, Connecting a Photo Interrupter to an Arduino, http://www.martyncurrey.com/connecting-an-photo-interrupter-to-an-arduino/, 2008.