
MECHENG 706: MECHATRONICS DESIGN AND INTEGRATION

Autonomous SLAM Robot

Raniera Paniora-Hepi (6851420), David Robinson (6161032), Matthew Young (9650691), Rachel Mosen (6457765)


Executive Summary

A simultaneous localisation and mapping (SLAM) algorithm was implemented in the development of an autonomous vacuum cleaning robot. A hardware unit comprising servo motors, a VEX robotic platform, and sensors was provided. Thorough testing of the sensors validated their ranges and output characteristics, which were then used in the control of the robot.

To maximise the covered area in minimal time, the robot drove in an inward spiral path. An obstacle avoidance algorithm was integrated into the system to prevent any collisions between the robot and its surroundings during a run. For complex control, a detailed finite state machine was also developed.

Localisation and mapping were achieved by linking the sensor data with the state and control inputs of the robot. The localisation used two stages: dead reckoning, followed by localisation using the mapped walls as a reference. The map model closely matched the actual traversed path. The SLAM computation was performed on an external computer by streaming the necessary data over Bluetooth. Unity was used to display a 2.5D map computed from a live serial stream.

The robot successfully traversed the testing area, containing three obstacles, in 1 minute and 25 seconds, achieving 80-90% coverage of the room. The robot did not have any collisions with the walls or obstacles during the entire run, demonstrating its overall efficacy at completing the task.


Table of Contents

Executive Summary
1.0 Introduction
2.0 Problem Description and Specifications
3.0 System Design and Integration
  3.1 Sensors
    3.1.0 Overview
    3.1.1 Characterizing the sensors
    3.1.2 Placement
    3.1.3 Filtering
    3.1.4 Calibration
  3.2 System Architecture
  3.3 Control
    3.3.0 Overview
    3.3.1 Initialisation State
    3.3.2 Driving Forward State
    3.3.3 Corner State
    3.3.4 Check if Clear State
    3.3.5 Turn Right State
    3.3.6 Strafe Left State
    3.3.7 Drive Past Obstacle State
    3.3.8 Square and Strafe State
  3.5 SLAM
    3.5.0 Approach
    3.5.1 Implementation
6.0 Testing, Results and Discussion
  6.1 Testing
    6.1.0 Robot Performance Testing
    6.1.1 Sensor Testing
  6.2 Results
  6.3 Discussion
9.0 Conclusions
References
Appendices
  Appendix A - Parts list
  Appendix B - Circuitry/Wiring
  Appendix C – Robot Control Code
  Appendix D – Dead Reckoning Code
  Appendix E – Wall Mapping Code
  Appendix F – Localisation Code
  Appendix G – Obstacle Mapping Code
  Appendix H – Occupancy Grid Mapping Code

Table of Figures

Figure 1 - Testing area with three obstacles
Figure 2 - Robot showing axes
Figure 3 - Infrared characterisation
Figure 4 - Sensor placement
Figure 5 - Comparison of filtered and unfiltered IR sensor readings
Figure 6 - System Architecture
Figure 7 - Spiral robot path
Figure 8 - FSM
Figure 9 - Robot rotation during initialisation
Figure 10 - Corner detected
Figure 11 - Obstacle detected
Figure 12 - Check if clear
Figure 13 - Strafe left
Figure 14 - Drive past obstacle
Figure 15 - SLAM diagram
Figure 16 - Direction of movement relative to robot
Figure 17 - Incorrect path due to error accumulation
Figure 18 - Orientation of robot using MPU
Figure 19 - Regression fit equation
Figure 20 - Regression line fit on map
Figure 21 - Mapping of robot in forwards driving state
Figure 22 - Mapping of robot in strafing state
Figure 23 - Robot pose from dead reckoning when an obstacle is detected
Figure 24 - Obstacle sample points from side IR sensors
Figure 25 - Obstacle positioned on map
Figure 26 - Position of obstacle calculated
Figure 27 - Feature based map
Figure 28 - Discretized occupancy grid map
Figure 29 - Wheel collision with wall during cornering
Figure 30 - Mapping of robot and room from demonstration


Table of Tables

Table 1 - Sensor Properties
Table 2 - Performance during demonstration


1.0 Introduction

The aim of this project was to design and develop a robot that would autonomously cover the entire area of any four-walled room without any collisions with obstacles or walls. Simultaneous localisation and mapping (SLAM) was to be implemented to determine the position of the robot and to build a map of the room as the robot navigated around it.

A VEX robotic platform was provided, along with a variety of sensors that were to be mounted in positions deemed optimal by the team. These sensors included long and medium range infrared (IR) sensors, a sonar sensor and a nine degree of freedom motion processing unit (MPU). The sensors were to be used for obstacle and wall detection, and to determine the position of the robot. A Bluetooth module was also provided, which was to be used to transmit data to a computer for mapping of the room.

The robot design detailed in this report uses two side facing and two forward facing IR sensors, along with a forward facing sonar sensor. This arrangement allows the robot to keep its driving path straight, while also detecting walls and obstacles in front of it. Sensor data is filtered, and then used to control the robot movements through a finite state machine (FSM). Localisation and mapping uses a program developed with Unity to display the path of the robot and the room, using the data obtained from the sensors via Bluetooth.

2.0 Problem Description and Specifications

A robot was to be designed, based on an autonomous vacuum cleaning robot, such that the entire surface of any room can be covered without any collisions with obstacles or walls. No vacuuming apparatus was to be investigated.

The robot was to be autonomous and decide its own path in order to achieve complete coverage of the room. The room will be 2 x 1.2 metres, in which three cylindrical obstacles will be randomly placed, as shown in Figure 1. The robot will also be placed in a random position with a random orientation at the start of the run. The robot must cover the entire room in the smallest amount of time, and must not collide with any of the three obstacles or walls. The path of the robot must also be recorded and displayed on a map to prove that it has achieved complete room coverage. The three determining factors of the quality of the robot's functionality are area covered, obstacle avoidance, and the time taken for full coverage of the room.

Figure 1 - Testing area with three obstacles


The robot is to use a VEX robotic platform, which includes motors and drivers, four Mecanum wheels, a battery pack and an Arduino Mega. Teams were to program the microcontroller on the Arduino Mega in order to achieve this functionality. The provided sensors, including four IR sensors, a sonar sensor and an MPU, can be mounted onto the robot in any position. The mountings were to be designed by the team, and then manufactured using methods such as 3D printing.

3.0 System Design and Integration

The VEX robotic platform has four servo motors, controlled through pulse width modulation, that rotate the Mecanum wheels in order to move the robot. The Mecanum wheels allow the robot to manoeuvre around confined spaces, and were controlled so that the robot could move backwards and forwards, side to side, and rotate in place in order to navigate around the room without colliding with any obstacles or walls. The completed robot is shown in Figure 2 with its axes.

Figure 2 - Robot showing axes
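The report's drive(), strafeLeft() and corner() helpers are not reproduced in the appendices, but the idea behind Mecanum control can be illustrated with the standard wheel-mixing equations. The sketch below is a minimal, hypothetical example: it assumes the 1500 us neutral servo pulse used in Appendix C, and its sign conventions, gains and clamping limits are illustrative rather than the robot's tuned implementation.

#include <cstdio>
#include <algorithm>

// Standard Mecanum wheel mixing: each wheel speed is a signed combination of the
// forward, strafe and rotation commands, offset about the 1500 us servo-pulse
// neutral used by the VEX motor drivers (as in Appendix C). The sign convention
// and scaling here are illustrative, not the report's tuned values.
struct WheelSpeeds { int LF, RF, LR, RR; };

WheelSpeeds mixMecanum(int forward, int strafeRight, int rotateCW)
{
    auto clamp = [](int v) { return std::min(2000, std::max(1000, v)); };
    return WheelSpeeds{
        clamp(1500 + forward + strafeRight + rotateCW),   // left front
        clamp(1500 + forward - strafeRight - rotateCW),   // right front
        clamp(1500 + forward - strafeRight + rotateCW),   // left rear
        clamp(1500 + forward + strafeRight - rotateCW)    // right rear
    };
}

int main()
{
    WheelSpeeds strafe = mixMecanum(0, 150, 0);   // pure strafe to the right
    std::printf("LF %d RF %d LR %d RR %d\n", strafe.LF, strafe.RF, strafe.LR, strafe.RR);
}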

3.1 Sensors

3.1.0 Overview

The six sensors that were provided for this project were all used on the robot, allowing the largest amount of information to be obtained. This data was used for the purposes of driving the robot, obstacle avoidance, path tracking, and mapping. Table 1, shown below, details the important properties of the IR and sonar sensors.


Properties                 | Medium Range IR Sensors | Long Range IR Sensors | Sonar Sensor
Useful Minimum Range (cm)  | 5                       | 10                    | 2
Useful Maximum Range (cm)  | 100                     | 200                   | 400
Speed (Hz)                 | As fast as ADC          | As fast as ADC        | 50
Accuracy                   | Medium                  | Medium                | High
Detection Width            | Narrow                  | Very Narrow           | Wide

Table 1 - Sensor Properties

The sonar sensor was very accurate over a large range of distances, and had a wide detection area, making it useful for precise obstacle and wall readings. The IR sensors had a narrower detection width and were less accurate. Through filtering and calibration, the IR sensors provided usable measurement readings for walls and obstacles. The MPU was used to obtain the heading of the robot, which aided in mapping the orientation of the robot, and also determined when the robot had completed its run.

The wiring of the sensors on the Arduino Mega shield can be seen in Appendix B.

3.1.1 Characterizing the sensors

IR Sensors

For the IR sensors, the relationship between the analogue voltage reading from the sensor and the distance it represents can be approximated by an inverse power function. This relationship is computationally intensive to evaluate, and slowed down the Arduino's processor. A simpler relationship was trialled, where a single curve was fitted through the middle of the data, however it was found to be too inaccurate. A lookup table was then determined to be the best option, as it allows the inverse power relationship to be implemented without slowing down the processor. MATLAB's interpolation functionality was utilised to generate a table with distance measurements in millimetres for every voltage reading that the sensor is expected to output. This method uses more memory on the Arduino, however the significant reduction in computation time is a worthwhile trade-off.

As illustrated in Figure 3, the interpolation between points was linear, as this was accurate enough for the performance of the robot.


Figure 3 - Infrared characterisation
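As an illustration of the trade-off described above, the sketch below contrasts evaluating an inverse power fit directly with reading a pre-computed lookup table from flash, in the style of the pgm_read_word() calls in Appendix C. The fit constants, table contents and ADC bounds are placeholders, not the values generated from the MATLAB interpolation.

#include <Arduino.h>
#include <math.h>

// Hypothetical calibration constants for one IR sensor (placeholders only).
const float FIT_A = 270000.0f;
const float FIT_B = -1.2f;

// Direct evaluation of the inverse power fit: accurate but slow on an 8-bit AVR.
unsigned int irDistancePow(int adc)
{
  return (unsigned int)(FIT_A * pow((float)adc, FIT_B));  // distance in mm
}

// Lookup table approach: one pre-computed distance (mm) per expected ADC value.
// In the report the table contents were generated offline with MATLAB
// interpolation; the values and bounds below are illustrative only.
const uint16_t irLUT[] PROGMEM = { 2000, 1205, 868, 679, 558, 474, 412, 365 };
const int LUT_MIN_ADC = 34;                                // ADC value of the first table entry
const int LUT_SIZE = sizeof(irLUT) / sizeof(irLUT[0]);

unsigned int irDistanceLUT(int adc)
{
  if (adc < LUT_MIN_ADC) return 2000;                      // beyond the useful maximum range
  if (adc >= LUT_MIN_ADC + LUT_SIZE) return 0;             // closer than the useful minimum range
  return pgm_read_word(&irLUT[adc - LUT_MIN_ADC]);         // single flash read, no floating point
}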

Sonar Sensor

The digital output from the sonar sensor has a linear relationship between the echo pulse length and the distance measurement. Only a constant gain was needed to convert the pulse length returned by the sensor into a distance, so no lookup table was required.
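A minimal sketch of this conversion for the HC-SR04 listed in Appendix A is shown below. The pin numbers are hypothetical (the actual wiring is given in Appendix B), and the gain used is the standard speed-of-sound constant rather than the exact value calibrated on the robot.

#include <Arduino.h>

// Hypothetical pin assignments; the report's wiring is given in Appendix B.
const int TRIG_PIN = 48;
const int ECHO_PIN = 49;

// Read the HC-SR04 and convert the echo pulse length to a distance in mm.
// The gain is the standard speed-of-sound conversion (0.343 mm/us, halved for
// the round trip); the exact gain used on the robot may differ slightly.
long readSonarMM()
{
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);                    // 10 us trigger pulse starts a measurement
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  long echoUs = pulseIn(ECHO_PIN, HIGH, 30000UL);  // echo pulse length, 30 ms timeout
  if (echoUs == 0) return -1;                      // no echo received
  return (echoUs * 343L) / 2000L;                  // one-way distance in mm
}

void setup()
{
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(115200);
}

void loop()
{
  Serial.println(readSonarMM());
  delay(20);   // roughly the sensor's 50 Hz maximum rate
}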

3.1.2 Placement

The sensors are situated on the outer edges of the robot, facing outwards, as shown in Figure 4. This means that any obstacles or walls will be detected over the whole width of the robot, and prevents blind spots which could result in a collision. The sensors were found to be the most accurate at smaller distances. Having them along the outer edges of the robot positions them as close to the walls and obstacles as possible, resulting in more accurate readings, without increasing the size of the robot and the potential for a collision.


Forward Facing Sensors

Two long range IR sensors are located at the front of the robot facing forwards, one on either side of the sonar sensor. Due to the sensor placement and the width of the obstacles, only one of the IR sensors will pick up an obstacle as the robot approaches it. This allows obstacle detection to be performed. The sonar sensor is the most accurate, and does not vary as much as the IR sensors, so only the reading from the sonar sensor is used for mapping.

The option of mounting the sonar sensor onto a servo to cover more forward area was rejected due to the complexity of polar-to-Cartesian conversions on the 8-bit Arduino Mega processor. Furthermore, it was anticipated that the sweeping view of the sonar would be of little benefit compared to a static mount.

Side Facing Sensors

Medium range IR sensors are located on the right side of the robot. These sensors are used by the robot to get square with the wall and drive straight. The difference between the two readings indicates whether the robot is driving at an angle towards or away from the wall, and is used to make slight corrections to ensure that the robot is moving in a path parallel to the wall. In this position the sensor readings can also be averaged, as the sensors are in line with each other, giving a more accurate position of the robot for mapping.
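A minimal sketch of this squaring idea is shown below. It is not the square() and wall() routines used on the robot (those appear in Appendix C); the gains, target offset and returned command structure are hypothetical, and only the use of the reading difference and average follows the description above.

#include <cstdio>

// Filtered side IR readings in mm (names follow Appendix C); the gains and
// target offset here are illustrative, not the tuned values from the report.
struct Correction {
    int yaw;     // positive = rotate to bring the two readings together
    int strafe;  // positive = move towards the target offset
};

const float KP_YAW        = 1.5f;  // correction per mm of front/back disagreement
const float KP_LATERAL    = 0.8f;  // correction per mm of offset error
const int   TARGET_OFFSET = 40;    // desired distance from the wall in mm

Correction keepSquareAndOffset(int frontIRdis, int backIRdis)
{
    // Non-zero when the robot is angled relative to the wall.
    int angleError  = backIRdis - frontIRdis;
    // The average of the in-line sensors gives the perpendicular distance to the wall.
    int offsetError = (frontIRdis + backIRdis) / 2 - TARGET_OFFSET;

    return Correction{ (int)(KP_YAW * angleError), (int)(KP_LATERAL * offsetError) };
}

int main()
{
    Correction c = keepSquareAndOffset(52, 46);   // nose angled slightly, a little too far out
    std::printf("yaw %d  strafe %d\n", c.yaw, c.strafe);
}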

MPU

Readings from the MPU about the z-axis are used to get the heading of the robot. This required the MPU to be placed with the z-direction pointing upwards. The MPU was mounted on a flat surface at the back of the robot, where it would not be in the way of the other sensors.

Figure 4 - Sensor placement


3.1.3 Filtering

IR Sensors

After initial testing it was found that the readings from the IR sensors often had large spikes. A mean based filter was initially trialled, however the spikes shifted the output too much and made the filter ineffective. A rolling median filter was implemented instead. The filter uses a number of samples when calculating the median. During testing it was found that using too few samples made the filter ineffective, while too many samples introduced lag into the system. A window of 10 samples was found to provide sufficient filtering, as shown in Figure 5, without creating any significant lag. Implementation of the filters can be seen in Appendix C.
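The robot uses a median filter library through the add()/getMedian() calls visible in Appendix C. The sketch below is an illustrative stand-alone equivalent of a 10-sample rolling median, showing how a single spike is rejected while the output still tracks the underlying reading.

#include <algorithm>
#include <cstdio>

// Rolling median over the last N samples, mirroring the add()/getMedian()
// usage in Appendix C. N = 10 matches the window size chosen in testing.
template <int N = 10>
class RollingMedian {
public:
    void add(int value) {
        buffer[head] = value;
        head = (head + 1) % N;
        if (count < N) ++count;
    }
    int getMedian() const {
        if (count == 0) return 0;
        int sorted[N];
        std::copy(buffer, buffer + count, sorted);
        std::sort(sorted, sorted + count);
        return sorted[count / 2];      // middle element of the sorted window
    }
private:
    int buffer[N] = {0};
    int head = 0;
    int count = 0;
};

int main() {
    RollingMedian<10> filter;
    int raw[] = { 412, 408, 415, 1020, 410, 409, 1150, 411, 407, 413 };  // spiky IR readings
    for (int r : raw) {
        filter.add(r);
        std::printf("raw %4d  filtered %4d\n", r, filter.getMedian());
    }
}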

Sonar Sensor

The sonar sensor performed well, with the exception of when it was facing a corner of the room, which caused reflections. These reflections made the readings from the sonar sensor higher than they should have been. However, due to the motion of the robot, readings taken while it is facing a corner are not used, so this was not an issue and the sonar did not require any filtering.

3.1.4 Calibration

MPU

Gyroscopes tend to drift over time, which resulted in varying readings for the heading of the robot during testing of the MPU. To correct this issue, the magnetometer data was fused with the gyroscope data using the on-board Digital Motion Processor to obtain Euler angles [1]. However, due to magnetic interference from electrical equipment both on the robot and in the room that the robot would be running in, the magnetometer was not giving correct readings. The magnetometer was calibrated by moving the MPU in all directions until minimum and maximum values in all three axes were obtained. Once these values were known, they were used to re-centre the magnetometer readings and calibrate the sensor. This calibration was carried out while the MPU was mounted on the robot and in the conditions it was required to operate in. This made the magnetometer readings much more accurate and usable as a correction tool for the gyroscope.

Figure 5 - Comparison of filtered and unfiltered IR sensor readings
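A minimal sketch of this min/max re-centring (hard-iron offset removal) is shown below. The data structure, sample values and units are hypothetical; on the robot the corrected magnetometer data was then fused with the gyroscope by the Digital Motion Processor.

#include <cstdio>
#include <algorithm>

// Track the minimum and maximum magnetometer value seen on each axis while the
// MPU is moved through all orientations, then re-centre readings about zero.
// The raw units and sample values are hypothetical placeholders.
struct MagCal {
    float minV[3] = {  1e9f,  1e9f,  1e9f };
    float maxV[3] = { -1e9f, -1e9f, -1e9f };

    void update(const float raw[3]) {                     // call repeatedly during calibration
        for (int i = 0; i < 3; ++i) {
            minV[i] = std::min(minV[i], raw[i]);
            maxV[i] = std::max(maxV[i], raw[i]);
        }
    }
    void apply(const float raw[3], float out[3]) const {  // remove the hard-iron offset
        for (int i = 0; i < 3; ++i)
            out[i] = raw[i] - 0.5f * (maxV[i] + minV[i]);
    }
};

int main() {
    MagCal cal;
    // Fake samples standing in for data gathered while rotating the MPU.
    float samples[][3] = { {120, -40, 300}, {-80, 200, 260}, {30, 90, 340}, {-60, -150, 280} };
    for (auto& s : samples) cal.update(s);

    float corrected[3];
    cal.apply(samples[0], corrected);
    std::printf("corrected: %.1f %.1f %.1f\n", corrected[0], corrected[1], corrected[2]);
}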

3.2 System Architecture

Figure 6 - System Architecture


Figure 6 details the system architecture of the design. The sensor readings are all sent through the Bluetooth module to the computer, which implements the SLAM algorithm and produces a map displaying the position of the robot, the obstacles, and the path that the robot has taken. The sensor readings are also fed into the FSM, where the current state decides what the motors need to do in order to perform the required movement, and the corresponding PWM commands are sent to the motor drivers.
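A minimal sketch of the kind of telemetry packet this architecture implies is shown below. The field order, packet format and the use of Serial1 for the Bluetooth module are assumptions for illustration; the Unity receiver in Appendix D reads the state, angle, IMU, front, back and sonar values.

#include <Arduino.h>

// Stream one comma-separated telemetry packet per control loop. The field
// order and the use of Serial1 for the Bluetooth module are assumptions;
// the Unity side (Appendix D) reads state, angle, IMU, front, back and sonar.
void sendTelemetry(int state, float stateAngle, float imuYaw,
                   int frontIRdis, int backIRdis, int sonardis)
{
  Serial1.print(state);       Serial1.print(',');
  Serial1.print(stateAngle);  Serial1.print(',');
  Serial1.print(imuYaw);      Serial1.print(',');
  Serial1.print(frontIRdis);  Serial1.print(',');
  Serial1.print(backIRdis);   Serial1.print(',');
  Serial1.println(sonardis);  // newline terminates the packet
}

void setup()
{
  Serial1.begin(9600);        // HC-05/HC-06 style modules commonly default to 9600 baud
}

void loop()
{
  // Placeholder values; on the robot these come from the FSM and the filtered sensors.
  sendTelemetry(1, 0.0f, 12.5f, 46, 52, 830);
  delay(50);
}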

3.3 Control

3.3.0 Overview

The robot navigates around the room in a spiral path moving inwards, shown in Figure 7, as this allows fast coverage and mapping of the whole area. A zigzag shaped path was considered, however for mapping purposes the robot needed to drive around the room in a loop to make an initial scan of the walls, which would result in the same area being covered twice. Another advantage of the spiral is that it permits a design in which sensors are only required on two sides of the robot: the front and the wall side.

The control of the robot is carried out using an FSM, shown in Figure 8 and detailed below. See Appendix C for the relevant code.

Figure 7 - Spiral robot path


Figure 8 - FSM


3.3.1 Initialisation State

Initialisation is the state that the robot starts in. The robot will rotate until the side mounted sensors are reading values within 20% of each other, as shown in Figure 9, which indicates that a wall has been found. Due to the sensor placement, this condition is not met if they are sensing an obstacle rather than a wall. This ensures that an obstacle will not be accidentally classified as a wall, and prevents problems with the initialisation. Once the wall has been found, the robot strafes toward the wall while staying square with it. The exit condition is met when the robot is within 40 mm of the wall, and the state changes to Driving Forward.

3.3.2 Driving Forward State

The robot drives forward while keeping square with the wall using the side facing sensors, remaining in this state until it detects either a wall or an obstacle in front of it. If the robot detects a wall, as shown in Figure 10, which occurs when the three forward facing sensors are all reading below 130 mm and are within 50% of each other, it will exit to the Corner state. If the centred sonar sensor and only one of the side sensors are reading a similar value, as shown in Figure 11, then an obstacle is in front of the robot, and it will exit into the Check if Clear state.

When the robot is driving the inner loops of the spiral, obstacles that were avoided on the outer loops sit between the robot and the wall. This created issues with squaring to the wall, as the IR sensors would be reading off the obstacle instead. The issue was minimised by adding a check for a side obstacle, which determines whether one of the IR sensors is reading a much smaller value than the other. If this check shows a side obstacle, the robot uses only the larger sensor value rather than the average of the two sensors.

3.3.3 Corner State

Because the robot drives close to the wall in order to maximise coverage, it needs to move away from the wall before turning to prevent a collision. The robot reverses and strafes left for 100 ms; once it is at a safe distance from the wall it begins to turn left around the corner. After turning 90 degrees, the robot exits to Square and Strafe.

Figure 9 - Robot rotation during initialisation

Figure 10 - Corner detected

Figure 11 - Obstacle detected


3.3.4 Check if Clear State

The robot has detected that there is an obstacle in front of it and needs to strafe sideways to get around it, however due to the mounting positions of the sensors the robot cannot see whether the path is clear. To begin, the robot reverses for 200 ms in order to get a safe distance back from the object it has detected before attempting to drive around it. The robot then turns left 90 degrees to check that the path is clear, as shown in Figure 12. In the case that another obstacle is boxing the robot in and the path is not clear, it will strafe sideways until the forward facing sensors no longer detect the obstacle. The robot will then exit to the Turn Right state.

3.3.5 Turn Right State

The robot will turn 90 degrees to the right so that it is back in the orientation it was in before the Check if Clear state. It uses the side facing sensors to get square with the wall again, to ensure that it is correctly orientated for the rest of the obstacle avoidance manoeuvre, and then exits into the Strafe Left state.

3.3.6 Strafe Left State

The robot strafes left, using information from the forward facing sensors to guide it clear of the obstacle, until there is a clear path to move past it, as shown in Figure 13. Once this is complete and the robot is square with the wall, it exits into the Drive Past Obstacle state.

3.3.7 Drive Past Obstacle State

In this state, the robot drives forward until the sonar reading shows that the robot has moved forward by 300 mm and the side facing sensors show that the robot is clear of the obstacle, as shown in Figure 14. The robot will then exit to the Square and Strafe state. However, if the robot detects a wall in front of it, the Corner state will be entered.

If the side facing sensors produce noisy and incorrect readings, the state machine may never escape this state. To prevent this from interfering with the robot's operation, a timeout was added that exits to the Square and Strafe state.

3.3.8 Square and Strafe State

The sensors were found to be more accurate at close distances; for this reason the robot goes right up to the wall after turning a corner or avoiding an obstacle, then strafes back out to the correct offset from the wall. This ensures that the robot is correctly squared to the wall, and will move in a straight line at the correct distance from the wall. The robot uses the side facing sensors to get square with the wall, by determining whether the two readings are within a certain range of each other. It also strafes until it is at the correct offset from the wall, after which it exits to the Driving Forward state.

Figure 12 - Check if clear

Figure 13 - Strafe left

Figure 14 - Drive past obstacle

3.5 SLAM

3.5.0 Approach

Carrying out SLAM on board the Arduino Mega would have allowed better navigation decisions to be made by the robot, however due to the low amount of memory (8kB) available and the low clock speed of the chip, it was determined to be impractical within the project constraints. Instead, a passive form of localisation and mapping was implemented on an off-board computer using C# and Unity to perform computational tasks and produce 3D visualisations of the robot pose and map respectively.

In order to further simplify the SLAM process, an online SLAM approach was selected, as in Figure 15 [2]. This method only seeks to recover the most recent pose of the robot based on current observations, as opposed to recovering the entire pose history of the robot and updating the map accordingly. The map building component of SLAM involves the superposition of sensor readings onto the robot's estimated position, therefore the accuracy of the map is intrinsically linked to the accuracy of the robot's pose estimate.

3.5.1 Implementation

Due to the restrictions and challenges induced by low quality sensors (compared with laser scanning) and their sparse sampling of the 2D space, the localisation and mapping algorithm relies heavily on the states of the robot to accurately determine the robot position and the features in the map. The overall approach for SLAM is to use dead reckoning to estimate the robot pose during the outer loop circuit and to create a map of the walls. All pose estimation from that point onwards is made with reference to the walls mapped on the first loop, in order to achieve global consistency within the map. A geometric map is produced, represented by an occupancy grid and features.

Figure 15 - SLAM diagram

Dead Reckoning

As there is no recorded map, and therefore no reference points, for localisation when the robot is initialised, the first pose estimate is determined through dead reckoning. To determine the change in the robot's local forwards and lateral position at any given time, the changes in the forwards and side facing sensor readings are applied as a local transform with respect to the robot's orientation at that instant, as in Figure 16. The sonar sensor provides the most stable estimate of the changing environment in front of the robot (relative to the two forwards facing IR sensors), therefore it is used to determine the change in forwards position. An average of the two side facing IR sensors provides information about the robot's lateral translation.

Robot heading information is provided by both the MPU and the known state and geometry of the map. The angular position data captured by the MPU is affected by drift, making it too inaccurate to use with dead reckoning. This inaccuracy is amplified by the fact that error accumulates throughout the dead reckoning localisation process, as depicted in Figure 17. An approach was therefore adopted where the robot orientation was fixed to be parallel with the wall next to it, meaning that the robot pose estimate could only be in one of four discrete orientations (0, 90, 180 and 270 degrees). The exception is when the robot is in the Corner state, which has no translational motion; the MPU data was used during this state to estimate the robot orientation, as shown in Figure 18. See Appendix D for code relevant to this process.
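The sketch below illustrates the dead reckoning step as stand-alone C++ (the robot's version is the C# code in Appendix D). The pose structure, axis directions and sign conventions are illustrative assumptions; only the use of a forward delta from the sonar, a lateral delta from the averaged side IRs, and a discretised heading follows the description above.

#include <cmath>
#include <cstdio>

// Robot pose in the map frame: x/y in mm, heading in degrees.
struct Pose { float x, y, headingDeg; };

const float DEG_TO_RAD = 3.1415926f / 180.0f;

// One dead reckoning step. deltaForward comes from the change in the sonar reading
// and deltaLateral from the change in the averaged side IR readings, both in the
// robot's local frame. The heading is one of the discrete orientations (0, 90, 180,
// 270 degrees) except while the Corner state uses the MPU angle instead.
void deadReckonStep(Pose& pose, float deltaForward, float deltaLateral)
{
    float h = pose.headingDeg * DEG_TO_RAD;
    // Rotate the local delta into the map frame (axis convention is illustrative).
    pose.x += deltaForward * std::sin(h) + deltaLateral * std::cos(h);
    pose.y += deltaForward * std::cos(h) - deltaLateral * std::sin(h);
}

int main()
{
    Pose pose{ 200.0f, 300.0f, 90.0f };   // driving along a wall, heading 90 degrees
    deadReckonStep(pose, 25.0f, -3.0f);   // 25 mm forward, 3 mm drift towards the wall
    std::printf("x %.1f  y %.1f\n", pose.x, pose.y);
}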

Wall Mapping

In order to provide a solid map framework from which to determine the position of the robot relative to the map, all of the walls are mapped in 2D space during the outer loop of the room. For each discrete wall in the rectangular map, a log is taken of all of the data from the side facing IR sensors on the robot. Following the completion of driving along that wall, a regression line fit with a fixed gradient is undertaken, from which the appropriate axis intercepts are calculated, as in Figures 19 and 20. Upon completion of the full outer lap, the corner points of the map are all determined by solving each set of intersecting lines as a pair of simultaneous equations. See Appendix E for code relevant to this process.
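A minimal sketch of the wall fit and corner solution is shown below. It assumes, as the discretised headings imply, that each wall is axis-aligned in the map frame, so the fixed-gradient regression reduces to averaging the coordinate perpendicular to the wall; the data and structures are illustrative, not the Unity implementation in Appendix E.

#include <cstdio>
#include <vector>

struct Point { float x, y; };

// With the robot heading locked to 0/90/180/270 degrees, each wall is treated as
// axis-aligned in the map frame, so a regression fit with a fixed (zero) gradient
// reduces to averaging the coordinate perpendicular to the wall.
float fitWallIntercept(const std::vector<Point>& samples, bool wallRunsAlongX)
{
    float sum = 0.0f;
    for (const Point& p : samples) sum += wallRunsAlongX ? p.y : p.x;
    return sum / (float)samples.size();
}

int main()
{
    // Illustrative side-IR wall samples (mm) logged while driving two adjacent walls.
    std::vector<Point> bottomWall = { {150, 4},    {600, -6},   {1400, 3}    };
    std::vector<Point> rightWall  = { {1996, 120}, {2005, 640}, {2002, 1050} };

    float yBottom = fitWallIntercept(bottomWall, true);    // wall line y = yBottom
    float xRight  = fitWallIntercept(rightWall,  false);   // wall line x = xRight

    // Corner point: the simultaneous solution of the two wall equations.
    std::printf("corner at (%.1f, %.1f)\n", xRight, yBottom);
}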

Figure 16 - Direction of movement relative to robot

Figure 17 - Incorrect path due to error accumulation

Figure 18 - Orientation of robot using MPU


Localisation

Now with a reference frame for the outside of the rectangular map, an estimate of the robot’s pose can be made by associating the current sensor observations with the map. This has the added advantage of producing a more globally consistent pose estimate than with dead reckoning. While the robot is in its uninterrupted forwards driving state, the sonar sensor readings are subtracted from the known position of the wall to determine the forwards position of the robot, as in Figure 21. Similarly, in the strafing states, an average of the two side facing IR sensors is used to determine the lateral distance of the robot from the wall, as in Figure 22. See Appendix F for code relevant to this process.

Figure 19 - Regression fit equation

Figure 20 - Regression line fit on map

Figure 21 - Mapping of robot in forwards driving state


Figure 22 - Mapping of robot in strafing state

This strategy for localisation is robust in the case where there are no obstacles present, but when an obstacle is placed in front of the robot, the algorithm effectively treats the obstacle as a wall, and the estimate of the robot's pose will be much closer to the wall than it actually is. To deal with this case, a check is implemented where the pose of the robot is considered against the state that it is in. For example, if the robot is in its forward driving state, the limit of how far forward the robot can realistically travel is known. In cases where the pose estimate exceeds these limits, the localisation method switches to dead reckoning to estimate the change in pose, as shown in Figure 23. The localisation method is reset to localise from the map when the state change indicates that the robot has finished driving along a straight.

Figure 23 - Robot pose from dead reckoning when an obstacle is detected
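A minimal sketch of this wall-referenced estimate and its plausibility check is shown below. The function names, the bounds of the straight and the sample numbers are hypothetical; the actual implementation is the Unity code in Appendix F.

#include <cstdio>

// Forward position from the mapped wall ahead: the sonar reading is subtracted
// from that wall's known coordinate. If an obstacle (rather than the wall) is in
// view, the estimate lands implausibly far along the straight, and the caller
// falls back to dead reckoning. Names and the bounds are illustrative.
struct ForwardEstimate { float position; bool usable; };

ForwardEstimate localiseForward(float wallAheadCoord, float sonarReading,
                                float straightStart, float straightEnd)
{
    float estimate = wallAheadCoord - sonarReading;
    // Plausible only if it lies within the limits of the current straight.
    bool ok = (estimate >= straightStart) && (estimate <= straightEnd);
    return ForwardEstimate{ estimate, ok };
}

int main()
{
    // Wall ahead mapped at y = 2000 mm, robot somewhere on a straight from 100 to 1900 mm.
    ForwardEstimate clear   = localiseForward(2000.0f, 850.0f, 100.0f, 1900.0f);
    ForwardEstimate blocked = localiseForward(2000.0f, 120.0f, 100.0f, 1500.0f);  // obstacle in view

    std::printf("clear:   y=%.0f usable=%d\n", clear.position, clear.usable);
    std::printf("blocked: y=%.0f usable=%d  -> fall back to dead reckoning\n",
                blocked.position, blocked.usable);
}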

Obstacle Mapping

There are three obstacles to be mapped within the environment, and with this known constraint, a systematic method of determining their locations was developed. Each time the robot enters the Drive Past Obstacle state, the side IR sensors can be used to determine the most probable location of the obstacle. While in this state, a set of 2D points is logged from the IR sensors whenever they are deemed to be perceiving the obstacle, as shown in Figure 24. On exit from this state, the average of these 2D points is computed, and the obstacle is placed at this point, as shown in Figures 25 and 26. In the case that all three obstacles have already been mapped and the robot enters the Drive Past Obstacle state again, the distance between the position of the new obstacle and each of the three existing obstacles is computed. The shortest distance indicates which original obstacle the robot is remapping, and on exit from the Drive Past Obstacle state, the average of the original and new obstacle positions is computed, and the position of the original obstacle is updated to this location. See Appendix G for code relevant to this process.
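The sketch below illustrates the point-averaging and nearest-obstacle association described above, as stand-alone C++ rather than the Unity code in Appendix G. The sample coordinates are invented for illustration.

#include <cstdio>
#include <vector>

struct Point { float x, y; };

// Average of the side-IR points logged during one Drive Past Obstacle pass.
Point averagePoint(const std::vector<Point>& samples)
{
    Point sum{0, 0};
    for (const Point& p : samples) { sum.x += p.x; sum.y += p.y; }
    return Point{ sum.x / samples.size(), sum.y / samples.size() };
}

// Once three obstacles exist, a new detection is merged with whichever existing
// obstacle is closest (nearest-neighbour association), as described above.
void placeOrMergeObstacle(std::vector<Point>& obstacles, const Point& detection)
{
    if (obstacles.size() < 3) { obstacles.push_back(detection); return; }

    int nearest = 0;
    float best = 1e30f;
    for (int i = 0; i < (int)obstacles.size(); ++i) {
        float dx = obstacles[i].x - detection.x, dy = obstacles[i].y - detection.y;
        float d2 = dx * dx + dy * dy;
        if (d2 < best) { best = d2; nearest = i; }
    }
    obstacles[nearest].x = 0.5f * (obstacles[nearest].x + detection.x);   // update with the average
    obstacles[nearest].y = 0.5f * (obstacles[nearest].y + detection.y);
}

int main()
{
    std::vector<Point> obstacles = { {400, 600}, {1200, 300}, {1600, 900} };
    std::vector<Point> samples   = { {395, 585}, {410, 602}, {402, 610} };   // illustrative IR hits

    placeOrMergeObstacle(obstacles, averagePoint(samples));
    std::printf("first obstacle now at (%.1f, %.1f)\n", obstacles[0].x, obstacles[0].y);
}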

Geometric Map

The geometric map of the robot coverage area and the obstacles can be represented in two different ways. The first is a feature based map which contains the obstacles, the walls of the map, and the robot trajectory, as shown in Figure 27. This represents the features of the map in a continuous, millimetre equivalent domain. The second representation is a discretized occupancy grid, with a resolution of 50 mm square for each discrete cell of the map, as shown in Figure 28. This map contains information about the map's walls, the area the robot has covered with its sensors, and the obstacle observations. Both maps are combined to produce a map which contains the most relevant information and features, as shown in Figure 30. See Appendix H for code relevant to these processes.
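A minimal sketch of the occupancy grid representation is shown below, using the 50 mm cell size and the 2 x 1.2 m arena stated earlier. The cell categories and marking function are illustrative; the actual implementation is the Unity code in Appendix H.

#include <cstdio>

// Discretised occupancy grid with 50 mm cells covering the 2 x 1.2 m arena.
// Cell states follow the categories the map distinguishes: covered area,
// walls and obstacles. Dimensions and the marking API are illustrative.
const int CELL_MM = 50;
const int GRID_W  = 2000 / CELL_MM;   // 40 cells
const int GRID_H  = 1200 / CELL_MM;   // 24 cells

enum Cell : unsigned char { UNKNOWN, COVERED, WALL, OBSTACLE };
Cell grid[GRID_H][GRID_W];            // zero-initialised to UNKNOWN

// Convert a continuous map coordinate in mm to a grid cell and mark it.
void markCell(float xMM, float yMM, Cell value)
{
    int col = (int)(xMM / CELL_MM);
    int row = (int)(yMM / CELL_MM);
    if (col < 0 || col >= GRID_W || row < 0 || row >= GRID_H) return;  // outside the mapped area
    grid[row][col] = value;
}

int main()
{
    markCell(820.0f, 415.0f, COVERED);    // area swept under the robot
    markCell(1203.0f, 305.0f, OBSTACLE);  // averaged obstacle position
    for (int c = 0; c < GRID_W; ++c) markCell(c * CELL_MM + 1.0f, 0.0f, WALL);  // bottom wall row

    int covered = 0;
    for (int r = 0; r < GRID_H; ++r)
        for (int c = 0; c < GRID_W; ++c)
            if (grid[r][c] == COVERED) ++covered;
    std::printf("covered cells: %d of %d\n", covered, GRID_W * GRID_H);
}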

Figure 24 - Obstacle sample points from side IR sensors

Figure 25 - Obstacle positioned on map

Figure 26 - Position of obstacle calculated

Figure 27 - Feature based map


6.0 Testing, Results and Discussion

6.1 Testing

Testing was a critical design step. There were two stages of testing: sensor testing and robot performance testing. The sensors required testing for accurate characterisation and also for performance. This involved isolating the sensors in controlled environments, for example with low noise and low light.

6.1.0 Robot Performance Testing

Robot performance testing allowed the robot movements to be tuned and the sensors to be calibrated, resulting in better readings and manoeuvring by the robot. This was achieved by placing the robot and obstacles in random positions and monitoring the robot's performance. The robot's interpretation of its surroundings was evaluated using the serial monitor in the Arduino IDE. Small changes to how the robot reacts to different situations improved its performance, resulting in no collisions and faster run times.

The right rear wheel of the robot had a slight collision with the wall when turning at a corner of the room, as shown in Figure 29. To fix this issue, the robot was programmed to perform a small strafe sideways, so that it is away from the wall before the turn is made.

Figure 29 - Wheel collision with wall during cornering

Figure 28 - Discretized occupancy grid map


When passing an obstacle on its side while performing an inner loop, the robot would sometimes classify the obstacle as a wall. Due to the noise in the IR sensor readings, there was a limit to how far the threshold separating walls from obstacles could be tightened. This issue was minimised as much as possible, but obstacles are still occasionally classed as walls.

The robot sometimes did not drive far enough past an obstacle, and once it exited the drive forward stage of obstacle avoidance, it would square with the obstacle rather than the wall and jitter back and forth. This was fixed by making the robot drive forward for a longer amount of time.

The robot would also continue driving after it had already covered the whole area. The robot stops once it has completed a certain number of corners, however some corners were not being counted. This was caused by obstacles being placed so close to a corner that the robot would not register the corner when it moved onto the next wall, as it would have cornered during obstacle avoidance. To fix this, the absolute angle from the MPU was used as a second check for whether the robot had covered the full area.

6.1.1 Sensor Testing

The IR sensors were tested in isolation in a darkened room. A tape measure was laid on the ground from a reference (zero) mark against a wall out to the approximate maximum range of 2000 mm. The robot was placed at the reference mark, and measurements from the sensors were read from the Arduino serial monitor.

At small distances from the reference, the sensor output was expected to change rapidly with distance, as per the inverse power characterisation of the IR sensors. This led to small distance increments being used at shorter distances and longer increments at larger distances. The sensor data was then interpolated in MATLAB, and the interpolation was used to determine accurate values for the lookup table used during live operation.

6.2 Results

During the demonstration the robot was placed at an angle of around 30° to the wall. The robot successfully aligned itself against the first wall during the initialisation stage of the run, and remained square to the wall as it navigated around the room in the Driving Forward state. The robot went up close to the corners, then moved away before making each turn, avoiding collisions with the walls. The robot successfully avoided all obstacles, going up very close and then strafing around them. It made no collisions with any of the walls or obstacles during the entire run, and achieved a high level of area coverage in a small amount of time. The performance of the robot was the same for both runs, and was consistent with its performance during testing, demonstrating the robustness of the design and implementation.

Below is a brief summary of the performance:


Run | Time (min:sec) | Obstacles | Collisions | Coverage
1   | 1:25           | 3         | 0          | 80-90%
2   | 1:32           | 3         | 0          | 80-90%

Table 2 - Performance during demonstration

The recorded robot trajectory, mapping of the walls, obstacles and robot position from the demonstration is shown in Figure 30.

Mapping can also be done in real time. The video at the following link shows a comparison of the robot during the demonstration and the map being built in real time: https://youtu.be/QTZiG2FM4KM

6.3 Discussion

The initial design of the Drive Past Obstacle state used a snapshot of the sonar reading once the robot had successfully strafed. This snapshot was used as a reference against the decrementing sonar value as the robot drove forward past the object. However, this was found to be unreliable: the robot would intermittently escape the pass-object condition too early and then interpret the object next to it as a wall. The robot would then attempt to square against an irregular object (as opposed to a wall) and would become stuck. This was resolved by adding a timeout escape condition to the Drive Past Obstacle state, which ensured that the robot could not remain stuck in that state.

The robot had a slight issue when passing the one obstacle that was positioned close against the wall. This caused the robot to classify the obstacle as a wall, and made the run slightly slower as the robot did not drive completely straight during that time. There was no risk of a collision, however this behaviour is not ideal. It could be fixed by decreasing the tolerance between what is classed as a wall and what is classed as a side object, however the sensor data would need to be filtered better for this not to cause other issues with the robot's performance.

Figure 30 - Mapping of robot and room from demonstration


During testing, the serial monitor in the Arduino IDE would intermittently crash. The cause was discovered to be a faulty MPU, and replacing this unit immediately resolved the issue.

The benefit of using statically mounted sensors was speed of operation. Very near objects therefore introduced a chance of collision, as there were parts of the robot which would traverse paths that the sensors had not covered. This was acceptable because an object needed to be within a few millimetres to pose a risk of obstruction. During testing this was occasionally an issue, however no major collisions occurred and the robot merely swept past obstacles, so it was considered a non-issue.

Many assumptions were made to simplify the SLAM problem and improve the overall performance of the robot. As the room was expected to be four-walled, the spiral pattern allowed a fixed maximum turn angle of 1080 degrees (three full rotations). Taking the robot out of this application and into more complex environments would mean that this assumption would need to be reassessed. Performing online or full SLAM on board was unfeasible due to the computational complexity of these problems. It was therefore decided to build the map remotely and passively (i.e. passive SLAM) and implement a reactive control system for the robot. This allowed the computational intensity to be offloaded onto a PC, leaving the microprocessor entirely for control. Future work may involve the use of a more powerful microprocessor to allow SLAM-based decision making.

9.0 Conclusions

● A SLAM robotic vacuum cleaner was designed and implemented using a VEX robotic platform, IR and sonar sensors and an MPU.

● The sensor readings were filtered and calibrated to obtain accurate measurements.

● Testing and tuning of the robot's performance for driving and avoiding collisions resulted in a successful design that achieved the aims of this project.

● Data collected by the robot, along with the state information and dead reckoning, allowed accurate tracking of the robot's path, and a precise map to be made of the room and obstacles.

● The overall performance of the robot during the demonstration was a success. It made no collisions, and achieved close to full coverage in only 1 minute and 25 seconds.

● Future improvements could be made to make the robot more robust and able to handle a wider variety of rooms and obstacles. Changes include removing assumptions about the room, such as it having four walls, and testing more obscure cases.


References

1. Pansenti. MPU-9150 9-Axis Data Fusion. 2013.

2. Stachniss, C. Robot Mapping. Available: http://ais.informatik.uni-freiburg.de/teaching/ws12/mapping/pdf/slam01-intro-4.pdf. Accessed: 5/6/16.

Appendices

Appendix A - Parts list

● 1 x Mecanum wheel robot kit

○ 1 x VEX robot chassis

○ 1 x Arduino Mega

○ 4 x Motors

○ 4 x Motor Drivers

○ 4 x Mecanum wheels

● 1 x MPU 9150 Breakout Board

● 2 x GP2Y0A02YK Long range IR sensors

● 2 x GP2Y0A21YK Medium range IR sensors

● 1 x HC-SR04 Sonar Sensor

● 1 x Bluetooth module

● 1 x Arduino Shield

Appendix B - Circuitry/Wiring


Appendix C – Robot Control Code // Overall state machine for control of the robot.

switch (state) {

// Initialisation state.

case 0:

if (averageWall < ((wall_offset + ((wall_offset / wall_correction)))) and

(wall_offset

< (averageWall + ((wall_offset / wall_correction)))) and (frontIRdis <

((backIRdis +

(wall_offset / square_correction)))) and (backIRdis < (frontIRdis +

((wall_offset /

square_correction)))))

{

state = 1;

pos_millis = millis();

constant_angle = 0;

side_object = 0;

turn_count = 0;

wall_offset = 40;

speed_square = 100;

speed_wall = 100;

offset_angle = absolute;

}

square();

wall();

Break;

// Drive forward and maintain squareness with wall.

case 1:

if (side_object)

{

drive();

}

else if (averageWall < ((wall_offset + ((wall_offset / wall_correction)) *

1.5)) and

(wall_offset < (averageWall + ((wall_offset / wall_correction)) * 1.5)) and

(frontIRdis

< ((backIRdis + (wall_offset / square_correction)) * 1.5)) and (backIRdis <

(frontIRdis

+ ((wall_offset / square_correction)) * 1.5)))

{

drive();

wall();

square();

}

if (((rightIRdis < (end_offset-50)) and millis()>(exit_millis+1000)) or

(leftIRdis <

(end_offset-40) or (sonardis < (end_offset))))

{

if (rightIRdis >= (sonardis*2))

{

state = 3;

strafe_dis = 250;

exit_millis = millis();

}

else if (leftIRdis >= (sonardis*2))

{

Page 28: MECHENG 706: MECHATRONICS DESIGN AND INTEGRATION …homepages.engineering.auckland.ac.nz/~pxu012/mechatronics2016/... · MECHENG 706: MECHATRONICS DESIGN AND INTEGRATION ... a VEX

III

state = 3;

strafe_dis = 180;

exit_millis = millis();

}

else if (sonardis >= (leftIRdis*2))

{

state = 3;

strafe_dis = 310;

exit_millis = millis();

}

else if (sonardis >= (rightIRdis*2))

{

state = 3;

strafe_dis = 100;

exit_millis = millis();

}

else

{

state = 2;

print_sensor();

exit_millis = millis();

}

}

else {

square();

wall();

}

break;

// 90 degree corner turn.

case 2:

if (millis() < exit_millis+(reverse_time-100))

{

reverse();

strafeLeft();

}

else if (millis() < exit_millis+(reverse_time-100)+corner_turn)

{

corner();

}

else if (millis() > exit_millis+(reverse_time-100)+corner_turn)

{

state = 7;

wall_capture = averageWall;

exit_millis = millis();

turn_count = turn_count+1;

if (turn_count > 12)

{

kill = true;

}

else if (turn_count > 8)

{

wall_offset = 425;

}

else if (turn_count > 4)

{

wall_offset = 230;

speed_square = 150;

Page 29: MECHENG 706: MECHATRONICS DESIGN AND INTEGRATION …homepages.engineering.auckland.ac.nz/~pxu012/mechatronics2016/... · MECHENG 706: MECHATRONICS DESIGN AND INTEGRATION ... a VEX

IV

wall_correction = 10;

square_correction = 10;

speed_wall = 250;

}

if (constant_angle<269)

{

constant_angle = constant_angle + 90;

}

else{

constant_angle = 0;

}

}

break;

// Move past obstacle.

case 3:

if (millis() < exit_millis+reverse_time)

reverse();

else if (millis() < exit_millis+left_turn+reverse_time)

corner();

else if ((millis() > exit_millis+left_turn+reverse_time) and ((rightIRdis >

(safe_strafe)) or (leftIRdis > (safe_strafe)) or (sonardis > (safe_strafe))))

{

state = 4;

exit_millis = millis();

}

else if ((millis() > exit_millis+left_turn+reverse_time) and !((rightIRdis >

(safe_strafe)) or (leftIRdis > (safe_strafe)) or (sonardis > (safe_strafe))))

{

strafeLeft();

}

break;

//

case 4:

if (millis() < exit_millis+right_turn)

{

turnRight();

}

if (millis() > exit_millis+right_turn)

{

square();

wall();

}

if ((millis() > exit_millis+right_turn) and (averageWall < ((wall_offset +

((wall_offset / wall_correction)))) and (wall_offset < (averageWall +

((wall_offset /

wall_correction)))) and (frontIRdis < ((backIRdis + (wall_offset /

square_correction)))) and (backIRdis < (frontIRdis + ((wall_offset /

square_correction))))))

{

state = 5;

wall_capture = averageWall;

exit_millis = millis();

}

break;

// Strafe left away from wall.

case 5:

if (wall_offset+strafe_dis > (averageWall + (wall_offset / wall_correction)))

Page 30: MECHENG 706: MECHATRONICS DESIGN AND INTEGRATION …homepages.engineering.auckland.ac.nz/~pxu012/mechatronics2016/... · MECHENG 706: MECHATRONICS DESIGN AND INTEGRATION ... a VEX

V

{

strafeLeft();

square();

}

else

{

exit_millis = millis();

exit_sonar = sonardis;

exit_leftIR = leftIRdis;

state = 6;

}

break;

// Drive past obstacle.

case 6:

drive();

if (sonardis<120)

{

exit_millis = millis();

state = 2;

}

if ((millis() > exit_millis+1600))

{

state = 7;

wall_capture = averageWall;

exit_millis = millis();

}

break;

case 7:

square();

if ((frontIRdis < ((backIRdis + (wall_offset / square_correction)) * 1.5))

and

(backIRdis < (frontIRdis + ((wall_offset / square_correction)) * 1.5)))

{

wall();

}

if (averageWall < ((wall_offset + ((wall_offset / wall_correction)))) and

(wall_offset

< (averageWall + ((wall_offset / wall_correction)))) and (frontIRdis <

((backIRdis +

(wall_offset / square_correction)))) and (backIRdis < (frontIRdis +

((wall_offset /

square_correction)))))

{

pos_millis = millis();

state = 1;

exit_millis = millis();

}

break;

default:

LFspeed = 1500;

LRspeed = 1500;

RRspeed = 1500;

RFspeed = 1500;

break;

}

if (kill)

{


LFspeed = 1500;

LRspeed = 1500;

RRspeed = 1500;

RFspeed = 1500;

}

drive_motors();

}

// Filter all of the infrared sensor readings.

void medianfilter ()

{

// Read infrared values.

leftRaw = analogRead(A1);

rightRaw = analogRead(A3);

frontRaw = analogRead(A4);

backRaw = analogRead(A2);

// Left IR.

if (leftRaw > 633)

leftmm = 0;

else if (leftRaw < 34)

leftmm = 2000;

else

leftmm = pgm_read_word(&(leftLUT[leftRaw - 34]));

// Right IR.

if (rightRaw > 640)

rightmm = 0;

else if (rightRaw < 36)

rightmm = 2000;

else

rightmm = pgm_read_word(&(rightLUT[rightRaw - 36]));

// Front IR.

if (frontRaw > 632)

frontmm = 0;

else if (frontRaw < 67)

frontmm = 1000;

else

frontmm = pgm_read_word(&(frontLUT[frontRaw - 67]));

// Back IR.

if (backRaw > 630)

backmm = 0;

else if (backRaw < 67)

backmm = 1000;

else

backmm = pgm_read_word(&(backLUT[backRaw - 67]));

// Put sensor values through the median filter.

leftMedian.add(leftmm);

rightMedian.add(rightmm);

frontMedian.add(frontmm);

backMedian.add(backmm-5);

// Get output values from filter.

leftMedian.getMedian(leftIRdis);

rightMedian.getMedian(rightIRdis);

frontMedian.getMedian(frontIRdis);

backMedian.getMedian(backIRdis);

}
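
For reference, the lookup tables indexed above (leftLUT, rightLUT, frontLUT, backLUT) map raw ADC counts directly to millimetre distances obtained during the sensor characterisation. The following is a minimal sketch of how one such table could be declared and used on the Arduino; the array length follows the index range used in medianfilter() (counts 34 to 633 for the left sensor), but the calibration values shown are placeholders only, not the actual characterisation data.

// Illustrative only: placeholder values, the real table is generated from the sensor characterisation.
#include <avr/pgmspace.h>

const uint16_t leftLUT[600] PROGMEM = {
  2000, 1985, 1970, /* ... one entry per ADC count from 34 to 633 ... */ 42, 41, 40
};

// Same conversion as performed inline in medianfilter() for the left IR sensor.
uint16_t leftAdcToMillimetres(int raw)
{
  if (raw > 633) return 0;     // closer than the calibrated range
  if (raw < 34)  return 2000;  // beyond the calibrated range
  return pgm_read_word(&leftLUT[raw - 34]);
}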


Appendix D – Dead Reckoning Code

void determineRobotAngle()

{

if (mappingState == 3)

{

/* Use a windowed detector to catch a robot turn that is not reflected in the reported robot state */

classifyTurn();

}

int state = SerialObject.GetComponent<Serial>().state;

float deltaW;

/* When robot is in turning state, use IMU data for robot angle */

if (state == 2)

{

if (firstAngleLoop_flag)

{

angleFactor = SerialObject.GetComponent<Serial>().IMU - previousW;

firstAngleLoop_flag = false;

}

currentW = SerialObject.GetComponent<Serial>().IMU - angleFactor;

deltaW = previousW - currentW;

previousW = currentW;

}

else /* Otherwise use angle based on robot state */

{

firstAngleLoop_flag = true;

currentW = SerialObject.GetComponent<Serial>().angle + angleCorrectionFactor;

deltaW = previousW - currentW;

previousW = currentW;

}

Vector3 tempRotation = robotRotation.eulerAngles;

tempRotation.y += deltaW;

robotRotation.eulerAngles = tempRotation;

}

float GetSensorReadingX()

{

return ((SerialObject.GetComponent<Serial>().front + SerialObject.GetComponent<Serial>().back) / 2) + 105;

}

float GetSensorReadingY()

{

return (SerialObject.GetComponent<Serial>().sonar) + 100;

}

void deadReckoning()

{

float deltaX = 0;

float deltaY = 0;

if (state == 1 || state == 6)

{

/* if robot is driving forward, only consider change in local y direction */

currentY = GetSensorReadingY();

deltaY = previousY - currentY;

previousY = currentY;

previousX = GetSensorReadingX();

}

else if (state == 7 || state == 5)

{

/* if robot is strafing, only consider change in local x direction */

previousY = GetSensorReadingY();


currentX = GetSensorReadingX();

deltaX = previousX - currentX;

previousX = currentX;

}

/* Account for process noise by limiting the detected change in the forwards direction */

if (deltaY < 0 || deltaY > processThreshold)

{

deltaY = processVariable;

}

/* Apply local transform with respect to robot orientation */

Vector3 deltaPosition = new Vector3(deltaX, 0, deltaY);

robotPosition += robotRotation * deltaPosition;

}
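
The final two statements above restate, in equation form, the standard dead-reckoning update: the displacement measured in the robot's local frame is rotated into the world frame by the current yaw estimate and accumulated onto the pose,

p_global <- p_global + R(psi) * [deltaX, 0, deltaY]^T

where R(psi) is the rotation held in robotRotation (maintained by determineRobotAngle()) and [deltaX, 0, deltaY] is the local displacement computed from the sensor readings above.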

Appendix E – Wall Mapping Code

void createWallXAxis(List<Vector2> wall, out double slope, out double yintercept)

{

yintercept = 0; slope = 0;

try

{

double rsquared;

LinearRegression(wall, true, out rsquared, out yintercept, out slope); // y = mx + c

print("Wall Fitted along X Axis, y = " + slope + "*x + " + yintercept + ", with r^2 = " + rsquared);

}

catch { print("LinearRegression Failed..."); }

}

void createWallZAxis(List<Vector2> wall, out double slope, out double xintercept)

{

xintercept = 0; slope = 0;

try

{

double rsquared;

LinearRegression(wall, false, out rsquared, out xintercept, out slope); // x = my + c

print("Wall Fitted along Y(Z) Axis, x = " + slope + "*y + " + xintercept + ", with r^2 = " + rsquared);

}

catch { print("LinearRegression Failed..."); }

}

static void LinearRegression(List<Vector2> wall, bool direction, out double rsquared, out double yintercept, out double slope)

{

double[] xVals = new double[wall.Count];

double[] yVals = new double[wall.Count];

for (int i = 0; i < wall.Count; i++)

{

if (direction == true) // X

{

xVals[i] = wall[i].x;

yVals[i] = wall[i].y;

}

if (direction == false) // Y

{

xVals[i] = wall[i].y;

yVals[i] = wall[i].x;

}

}

Debug.Assert(xVals.Length == yVals.Length);

double sumOfX = 0;

double sumOfY = 0;


double sumOfXSq = 0;

double sumOfYSq = 0;

double ssX = 0;

double sumCodeviates = 0;

double sCo = 0;

double count = wall.Count;

for (int ctr = 0; ctr < count; ctr++)

{

double x = xVals[ctr];

double y = yVals[ctr];

sumCodeviates += x * y;

sumOfX += x;

sumOfY += y;

sumOfXSq += x * x;

sumOfYSq += y * y;

}

ssX = sumOfXSq - ((sumOfX * sumOfX) / count);

double RNumerator = (count * sumCodeviates) - (sumOfX * sumOfY);

double RDenom = (count * sumOfXSq - (sumOfX * sumOfX))

* (count * sumOfYSq - (sumOfY * sumOfY));

sCo = sumCodeviates - ((sumOfX * sumOfY) / count);

double meanX = sumOfX / count;

double meanY = sumOfY / count;

double dblR = RNumerator / Mathf.Sqrt((float)RDenom);

rsquared = dblR * dblR;

yintercept = meanY - ((sCo / ssX) * meanX);

slope = 0; /* Force gradient to be zero */

}
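
Written out, LinearRegression() computes the standard least-squares quantities for the n recorded wall points (x_i, y_i):

S_xy = sum(x_i*y_i) - (sum(x_i)*sum(y_i))/n
S_xx = sum(x_i^2) - (sum(x_i))^2/n
c = mean(y) - (S_xy/S_xx)*mean(x)
r^2 = [n*sum(x_i*y_i) - sum(x_i)*sum(y_i)]^2 / ([n*sum(x_i^2) - (sum(x_i))^2]*[n*sum(y_i^2) - (sum(y_i))^2])

The returned gradient is then forced to zero, so each wall is treated as axis-aligned; only the intercept c (the wall's offset, computed before the gradient is zeroed) and r^2 (the goodness of fit reported in the print statements) are used downstream.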

void determineCornerPoint(out Vector2 XY, double mx, double my, double cx, double cy, bool AorC)

{

/* solve to find all four corner points as a set of X,Y Vectors */

if (AorC)

{

XY.x = (float)((cy * mx + cx) / (1 - mx * my)); // Simultaneous Math

XY.y = (float)(my * XY.x + cy);

}

else // BorD

{

XY.x = (float)((cx * my + cy) / (1 - my * mx)); // Simultaneous Math

XY.y = (float)(mx * XY.x + cx);

}

}
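
The "Simultaneous Math" above follows directly from the two wall fits: one wall is fitted as y = m_y*x + c_y (createWallXAxis) and the adjacent wall as x = m_x*y + c_x (createWallZAxis). Substituting one into the other gives the corner

x = (m_x*c_y + c_x) / (1 - m_x*m_y),    y = m_y*x + c_y

with the roles of the two fitted lines exchanged in the else branch for the opposite pair of corners. Because LinearRegression() forces both gradients to zero, this reduces in practice to x = c_x and y = c_y, i.e. each corner is simply the pair of fitted wall offsets.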

Appendix F – Localisation Code

void localiseRobotFromMap()

{

/* We now have information about the walls in the form of the X,Y coordinates of the corners (A,B,C,D). Using the forwards-facing and side-facing sensors, determine robotPosition and robotRotation. Also deal with the cases in which an obstacle appears in front of, or to the side of, the robot, as this would cause the robot to jump back in the map when it thinks the wall is closer than expected. */

if (state != 2 && state != 4 && !obstacleDeadReckon_flag) /* while not turning */

{

if (state == 1 || state == 6) /* moving forwards */

{

if ((int)robotRotation.eulerAngles.y > 315 || (int)robotRotation.eulerAngles.y <= 45) // i.e. along first wall

{

yPos = yMax - GetSensorReadingY();

}

else if ((int)robotRotation.eulerAngles.y > 45 && (int)robotRotation.eulerAngles.y <= 135) // i.e. along fourth wall

{

xPos = xMax - GetSensorReadingY();


}

else if ((int)robotRotation.eulerAngles.y > 135 && (int)robotRotation.eulerAngles.y <= 225) // i.e. along third wall

{

yPos = yMin + GetSensorReadingY();

}

else if ((int)robotRotation.eulerAngles.y > 225 && (int)robotRotation.eulerAngles.y <= 315) // i.e. along second wall

{

xPos = xMin + GetSensorReadingY();

}

}

else if (state == 5 || state == 7) /* moving laterally */

{

if ((int)robotRotation.eulerAngles.y > 315 || (int)robotRotation.eulerAngles.y <= 45) // i.e. along first wall

{

xPos = xMax - GetSensorReadingX();

}

else if ((int)robotRotation.eulerAngles.y > 45 && (int)robotRotation.eulerAngles.y <= 135) // i.e. along fourth wall

{

yPos = yMin + GetSensorReadingX();

}

else if ((int)robotRotation.eulerAngles.y > 135 && (int)robotRotation.eulerAngles.y <= 225) // i.e. along third wall

{

xPos = xMin + GetSensorReadingX();

}

else if ((int)robotRotation.eulerAngles.y > 225 && (int)robotRotation.eulerAngles.y <= 315) // i.e. along second wall

{

yPos = yMax - GetSensorReadingX();

}

}

/* Find overall change in movement */

float deltaX = 0, deltaY = 0;

if ((int)robotRotation.eulerAngles.y == 0) // i.e. along first wall

{

deltaX = xPos - prevxPos;

deltaY = yPos - prevyPos;

}

else if ((int)robotRotation.eulerAngles.y == 90) // i.e. along fourth wall

{

deltaY = xPos - prevxPos;

deltaX = -(yPos - prevyPos);

}

else if ((int)robotRotation.eulerAngles.y == 180) // i.e. along third wall

{

deltaX = -(xPos - prevxPos);

deltaY = -(yPos - prevyPos);

}

else if ((int)robotRotation.eulerAngles.y == 270) // i.e. along second wall

{

deltaY = -(xPos - prevxPos);

deltaX = yPos - prevyPos;

}

/* Check reasonable limits of movement while in forwards driving state */

if (ObstacleCheck(deltaY) && state == 1)

{

obstacleDeadReckon_flag = true;

}

else

{

/* Apply Global Transform */

robotPosition = new Vector3(xPos, 0, yPos);

/* Assign previous values */


prevxPos = xPos;

prevyPos = yPos;

}

}

else if (obstacleDeadReckon_flag)

{

deadReckoning();

if (state == 2) /* reset the turn */

{

obstacleDeadReckon_flag = false;

xPos = robotPosition.x;

yPos = robotPosition.z;

}

}

}

bool ObstacleCheck(float deltaY)

{

// We generally want to perform this check just after entering state 1.

if (deltaY > processFilter)

{

return true;

}

else

{

return false;

}

}

Appendix G – Obstacle Mapping Code

private void detectObstacle()

{

/* Convert Cartesian Position of Sensor Reading of an obstacle to array coordinates, and store as an obstacle! NB: Occurs every update */

if (state == 6 && !obstacleMapping_flag)

{

obstacleMapping_flag = true;

obstacle.Clear();

}

else if (state != 6 && obstacleMapping_flag)

{

generateObstacle();

obstacleMapping_flag = false;

}

if (obstacleMapping_flag) /* When driving past an obstacle and mapping */

{

recordObstacle();

}

}

private void recordObstacle()

{

/* only add obstacles if the distance between wall and scan point is greater than a certain threshold; do this for both front and back IR sensors */

if (SerialObject.GetComponent<Serial>().front < 0.5 * SerialObject.GetComponent<Serial>().back)

obstacle.Add(new Vector2(frontAbsScan.x, frontAbsScan.z));

if (SerialObject.GetComponent<Serial>().back < 0.5 * SerialObject.GetComponent<Serial>().front)

obstacle.Add(new Vector2(backAbsScan.x, backAbsScan.z));

}

private void generateObstacle()

{

float sumX = 0, sumY = 0;

for (int i = 0; i < obstacle.Count; i++)


{

sumX += obstacle[i].x;

sumY += obstacle[i].y;

}

float obstacleX = sumX / obstacle.Count;

float obstacleY = sumY / obstacle.Count;

if (obstacleCount == 0)

{

obstacle_1_position = new Vector2(obstacleX, obstacleY);

obstacle_1 = Instantiate(cylindricalObject, new Vector3(obstacleX, 25, obstacleY), Quaternion.identity) as GameObject;

print("Created 1st Obstacle... " + "x: " + obstacle_1.transform.position.x + "\ty: " + obstacle_1.transform.position.z);

obstacleCount++;

}

else if (obstacleCount == 1)

{

obstacle_2_position = new Vector2(obstacleX, obstacleY);

obstacle_2 = Instantiate(cylindricalObject, new Vector3(obstacleX, 25, obstacleY), Quaternion.identity) as GameObject;

print("Created 2nd Obstacle... " + "x: " + obstacle_2.transform.position.x + "\ty: " + obstacle_2.transform.position.z);

obstacleCount++;

}

else if (obstacleCount == 2)

{

obstacle_3_position = new Vector2(obstacleX, obstacleY);

obstacle_3 = Instantiate(cylindricalObject, new Vector3(obstacleX, 25, obstacleY), Quaternion.identity) as GameObject;

print("Created 3rd Obstacle... " + "x: " + obstacle_3.transform.position.x + "\ty: " + obstacle_3.transform.position.z);

obstacleCount++;

}

else if (obstacleCount == 3)

{

/* Don't make new object only modify existing */

float[] distance = new float[3];

distance[0] = Vector2.Distance(obstacle_1_position, new Vector2(obstacleX, obstacleY));

distance[1] = Vector2.Distance(obstacle_2_position, new Vector2(obstacleX, obstacleY));

distance[2] = Vector2.Distance(obstacle_3_position, new Vector2(obstacleX, obstacleY));

float[] distanceSorted = new float[3];

Array.Copy(distance, distanceSorted, distance.Length);

Array.Sort(distanceSorted);

if (distanceSorted[0] == distance[0]) /* if closest to 1st Obstacle */

{

obstacleX = (obstacleX + obstacle_1_position.x) / 2;

obstacleY = (obstacleY + obstacle_1_position.y) / 2;

Destroy(obstacle_1);

obstacle_1_position = new Vector2(obstacleX, obstacleY);

obstacle_1 = Instantiate(cylindricalObject, new Vector3(obstacleX, 25, obstacleY), Quaternion.identity) as GameObject;

print("Updated 1st Obstacle... " + "x: " + obstacle_1.transform.position.x + "\ty: " + obstacle_1.transform.position.z);

}

else if (distanceSorted[0] == distance[1]) /* if closest to 2nd Obstacle */

{

obstacleX = (obstacleX + obstacle_2_position.x) / 2;

obstacleY = (obstacleY + obstacle_2_position.y) / 2;

Destroy(obstacle_2);

obstacle_2_position = new Vector2(obstacleX, obstacleY);

obstacle_2 = Instantiate(cylindricalObject, new Vector3(obstacleX, 25, obstacleY), Quaternion.identity) as GameObject;


print("Updated 2nd Obstacle... " + "x: " + obstacle_2.transform.position.x

+ "\ty: " + obstacle_2.transform.position.z);

}

else if (distanceSorted[0] == distance[2]) /* if closest to 3rd Obstacle */

{

obstacleX = (obstacleX + obstacle_3_position.x) / 2;

obstacleY = (obstacleY + obstacle_3_position.y) / 2;

Destroy(obstacle_3);

obstacle_3_position = new Vector2(obstacleX, obstacleY);

obstacle_3 = Instantiate(cylindricalObject, new Vector3(obstacleX, 25, obstacleY), Quaternion.identity) as GameObject;

print("Updated 3rd Obstacle... " + "x: " + obstacle_3.transform.position.x + "\ty: " + obstacle_3.transform.position.z);

}

}

}
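
In summary, generateObstacle() estimates an obstacle centre as the centroid of the scan points collected while driving past it, centre = (1/N)*sum(p_i). Once three obstacles exist, a new estimate is merged with its nearest existing obstacle by averaging the two positions, (old + new)/2, rather than creating a fourth object.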

Appendix H – Occupancy Grid Mapping Code

void drawWallsInMap()

{

for (int i = 0; i <= 100; i++)

{

float fraction = ((float)i / 100);

float X_AB = A.x - (fraction * (A.x - B.x));

float Y_AB = A.y - (fraction * (A.y - B.y));

Vector2 ABarrayPoint = WorldToArray(new Vector2(X_AB, Y_AB));

float X_BC = B.x - (fraction * (B.x - C.x));

float Y_BC = B.y - (fraction * (B.y - C.y));

Vector2 BCarrayPoint = WorldToArray(new Vector2(X_BC, Y_BC));

float X_CD = C.x - (fraction * (C.x - D.x));

float Y_CD = C.y - (fraction * (C.y - D.y));

Vector2 CDarrayPoint = WorldToArray(new Vector2(X_CD, Y_CD));

float X_DA = D.x - (fraction * (D.x - A.x));

float Y_DA = D.y - (fraction * (D.y - A.y));

Vector2 DAarrayPoint = WorldToArray(new Vector2(X_DA, Y_DA));

try

{

array[(int)ABarrayPoint.x, (int)ABarrayPoint.y] = mapWall;

array[(int)BCarrayPoint.x, (int)BCarrayPoint.y] = mapWall;

array[(int)CDarrayPoint.x, (int)CDarrayPoint.y] = mapWall;

array[(int)DAarrayPoint.x, (int)DAarrayPoint.y] = mapWall;

}

catch { }

}

}

void occupancyGrid(float x, float y, string identifier)

{

if ((x > xMin) && (x < xMax) && (y > yMin) && (y < yMax)) // perform check in world coordinates

{

Vector2 _ = WorldToArray(new Vector2((int)x, (int)y)); // convert to array coordinates

try

{

if (identifier == "obstacle")

{

if (array[(int)_.x, (int)_.y] != mapWall)

array[(int)_.x, (int)_.y] += mapExplored;

}

if (identifier == "clear")

{


if (array[(int)_.x, (int)_.y] == mapUnexplored) // only assign clear if unexplored

array[(int)_.x, (int)_.y] = mapClear;

}

}

catch

{}

}

}

void detectObstacles()

{

if (state == 6)

{

// only add obstacles if distance between wall and scan point is greater than a certain threshold, for each sensor!

if (SerialObject.GetComponent<Serial>().front < 0.5 * SerialObject.GetComponent<Serial>().back)

occupancyGrid(frontAbsScan.x, frontAbsScan.z, "obstacle");

if (SerialObject.GetComponent<Serial>().back < 0.5 * SerialObject.GetComponent<Serial>().front)

occupancyGrid(backAbsScan.x, backAbsScan.z, "obstacle");

}

}

void detectAllClear()

{

detectClear(SerialObject.GetComponent<Serial>().sonar, sonarRelPos, true);

detectClear(SerialObject.GetComponent<Serial>().front, frontRelPos, false);

detectClear(SerialObject.GetComponent<Serial>().back, backRelPos, false);

detectClear(SerialObject.GetComponent<Serial>().left, leftRelPos, true); // Left and Right IR too noisy!

detectClear(SerialObject.GetComponent<Serial>().right, rightRelPos, true);

}

void detectClear(float SensorData, Vector3 sensorRelPos, bool y)

{

for (int i = 0; i <= iterateResolution; i++)

{

var SensorRange = ((float)i / iterateResolution) * SensorData;

Vector3 _RelScan;

if (y)

_RelScan = new Vector3(0, 0, SensorRange);

else

_RelScan = new Vector3(SensorRange, 0, 0);

Vector3 _AbsScan = robotRotation * ((sensorRelPos + _RelScan)) + robotPosition;

occupancyGrid(_AbsScan.x, _AbsScan.z, "clear");

}

}
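
detectClear() marches along each sensor beam and marks evenly spaced sample points as clear. Restating the loop: for i = 0..N (N = iterateResolution), the i-th sample lies at a fraction i/N of the measured range d along the sensor's local scan axis (local z when the boolean argument is true, local x otherwise), and is transformed into world coordinates with the current pose estimate,

P_i = R(psi) * (p_sensor + (i/N)*d*a_hat) + p_robot

where p_sensor is the sensor's mounting offset (sonarRelPos, frontRelPos, etc.) and a_hat is the selected local axis.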

void DisplayMap()

{

for (int x = 0; x < mapX; x++)

{

for (int y = 0; y < mapY; y++)

{

switch (array[x, y])

{

case -1: // unexplored

break;

case 0: // clear

Vector2 world2DCoordinates_Clear = ArrayToWorld(new Vector2(x, y));

Instantiate(clearPoint, new Vector3(world2DCoordinates_Clear.x, -25, world2DCoordinates_Clear.y), Quaternion.identity);

break;

case -2: // wall


Vector2 world2DCoordinates_Wall = ArrayToWorld(new Vector2(x, y));

Instantiate(wallPoint, new Vector3(world2DCoordinates_Wall.x, -15, world2DCoordinates_Wall.y), Quaternion.identity);

break;

}

if (array[x, y] >= 1)

{

Vector2 world2DCoordinates_Map = ArrayToWorld(new Vector2(x, y));

Instantiate(mapPoint, new Vector3(world2DCoordinates_Map.x, -25, world2DCoordinates_Map.y), Quaternion.identity);

}

}

}

}

void DestroyMap(string tag)

{

GameObject[] gameObjects;

gameObjects = GameObject.FindGameObjectsWithTag(tag);

for (var i = 0; i < gameObjects.Length; i++)

{

Destroy(gameObjects[i]);

}

}

[Appendix drawings (A3 sheets, dimensions in millimetres): "Sensor Positioning" (RobotSensors, scale 1:5), "Sonar mounting" (scale 2:1), "Front right IR" (scale 1:1), "Front Left IR" (scale 1:1), and "BackIR&MPU" (scale 2:1).]