
Florida State University Libraries

Electronic Theses, Treatises and Dissertations The Graduate School

2004

Autonomous Ground Vehicle TerrainClassification Using Internal SensorsDebangshu Sadhukhan

Follow this and additional works at the FSU Digital Library. For more information, please contact [email protected]


THE FLORIDA STATE UNIVERSITY

COLLEGE OF ENGINEERING

AUTONOMOUS GROUND VEHICLE TERRAIN CLASSIFICATION USING

INTERNAL SENSORS

By

DEBANGSHU SADHUKHAN

A Thesis submitted to the

Department of Mechanical Engineering

in partial fulfillment of the

requirements for the degree of

Master of Science

Degree Awarded:

Spring Semester, 2004


The members of the committee approve the thesis of Debangshu Sadhukhan, defended on

March 16, 2004.

_____________________

Carl Moore,

Professor Directing Thesis

_____________________

Emmanuel Collins,

Committee member

_____________________

Rodney Roberts,

Committee member

Approved:

____________________________________________

Chiang Shih, Ph.D., Chair, Department of Mechanical Engineering

____________________________________________

C.J. Chen, Ph.D., Dean, FAMU-FSU College of Engineering

The Office of Graduate Studies has verified and approved the above-named committee members.


To

My Parents


ACKNOWLEDGEMENTS

I would like to take this opportunity to sincerely thank my advisor, Dr. Moore, for his guidance over the course of this work; it was only with his support that this manuscript could see the light of day.

Dr. Collins provided many insightful suggestions, which have been instrumental in improving the quality of this work. I also thank him for his time serving on my committee. I wish to thank Dr. Roberts for sparing his time and serving on my committee. I also thank Dr. Hollis and Dr. Majura for their suggestions during the initial part of this work.

Colleagues and close friends in the dynamics group, in particular Shailesh, Deviprasad, and Pankaj, provided invaluable support. I wish to take this opportunity to thank them and all my other friends in Tallahassee who made my graduate school life very enjoyable, in particular Satyajit, Punit, and Virendra.

My parents and brother have always been very encouraging and accommodating. I can never thank them enough for their love and faith.

Special thanks to Dr. Sirish Namilae for helping me with the editing of this thesis.

Funding by the Collaborative Technical Alliance is gratefully acknowledged.


TABLE OF CONTENTS

LIST OF TABLES

LIST OF FIGURES

ABSTRACT

1. INTRODUCTION
1.1 History of Mobile Robotics Research
1.2 The Significance of Mobile Robotics Research
1.3 Research Objective
1.4 Justifying Terrain Detection
1.5 Thesis Objective
1.6 Chapter Summaries

2. LITERATURE REVIEW AND RESEARCH SUMMARY
2.1 Vision Based Terrain Categorization and Traversability Index
2.2 Terrain Parameter Identification via Wheel Terrain Interaction Analysis
2.3 Research Summary

3. TERRAIN IDENTIFICATION USING NEURAL NETWORKS
3.1 Terrain Identification by Pattern Classification
3.2 Biological Neural Networks
3.3 Artificial Neural Networks
3.4 Network Architectures
3.5 Network Training

4. PROPOSED ALGORITHM
4.1 Purpose of the Research
4.2 Terrain Detection
4.3 Algorithm Overview
4.4 Development of the Algorithm

5. EXPERIMENTAL RESULTS
5.1 Test Bed
5.2 Wireless Communication for the ATRV-Jr
5.3 Mobility Programming
5.4 Experimental Procedure
5.5 Experimental Results
5.6 Verification of Results
5.7 Neural Network Classification by Backpropagation

6. CONCLUSIONS

APPENDIX A: C++ CODE TO COLLECT DATA FROM THE INS
APPENDIX B: MATLAB CODE TO PLOT FFT ON DIFFERENT SURFACES
APPENDIX C: MATLAB/SIMULINK CODE FOR INS CHECK
APPENDIX D: PNN CLASSIFIER
APPENDIX E: BACKPROPAGATION CODE

REFERENCES

BIOGRAPHICAL SKETCH


LIST OF TABLES

Table 3.1: Comparison of the human brain with digital computers
Table 4.1: Values for friction coefficients on different surfaces
Table 4.2: Observations made when a SUV was driven over different terrains
Table 5.1: Error statistics on each surface for each speed level
Table 5.2: Error statistics on test site 1
Table 5.3: Error statistics on test site 2
Table 5.4: Error statistics using backpropagation


LIST OF FIGURES

Figure 1.1: The XUV
Figure 1.2: Vehicle on mud
Figure 1.3: Vehicle on rocky terrain
Figure 1.4: Vehicle on snow
Figure 3.1: Schematic of a neuron in the human brain
Figure 3.2: Simple neuron model
Figure 3.3: Transfer functions
Figure 3.4: Single-layer feed forward networks
Figure 3.5: Multi-layer feed forward networks
Figure 3.6: Training of a neural network
Figure 3.7: The organization of a PNN
Figure 3.8: PNN structure
Figure 4.1: Mobile robot ATRV-Jr
Figure 4.2: Terrain identification by PNN
Figure 4.3: Model of Pioneer II in ADAMS
Figure 4.4: ADAMS/View results
Figure 4.5: Frequency response of the Butterworth filter
Figure 4.6: Displacement graph
Figure 5.1: Test bed schematic
Figure 5.2: Test bed
Figure 5.3: ATRV-Jr on gravel
Figure 5.4: Wireless connectivity outdoors
Figure 5.5: Wireless connectivity indoors
Figure 5.6: Plot of Z-axis acceleration on gravel
Figure 5.7: Plot of Z-axis acceleration on grass
Figure 5.8: Plot of Z-axis acceleration on packed dirt
Figure 5.9: FFT of gravel acceleration data
Figure 5.10: FFT of packed dirt acceleration data
Figure 5.11: FFT of grass acceleration data
Figure 5.12: Bar plot for gravel classification
Figure 5.13: Bar plot for packed dirt classification
Figure 5.14: Bar plot for grass classification
Figure 5.15: Results of PNN for gravel
Figure 5.16: Results of PNN for packed dirt
Figure 5.17: Results of PNN for grass
Figure 5.18: Percentage accuracy for each surface
Figure 5.19: Test site 1
Figure 5.20: Test site 2
Figure 5.21: Backpropagation neural network results
Figure 5.22: Backpropagation training error


ABSTRACT

The semi-autonomous vehicle known as the Experimental Unmanned Vehicle (XUV)

was designed by the US Army to autonomously navigate over different types of terrain.

The performance of autonomous navigation improves when the vehicle’s control system

takes into account the type of terrain on which the vehicle is traveling. For example, if the

ground is covered with snow, a reduction of acceleration is necessary to avoid wheel slip.

Previous researchers have developed algorithms based on vision and digital signal

processing (DSP) to categorize the traversability of the terrain. Others have used classical

terramechanics equations to identify the key terrain parameters. This thesis presents a

novel algorithm that uses the vehicle’s internal sensors to qualitatively categorize the

terrain type in real-time. The algorithm was successful in identifying gravel, packed dirt,

and grass.


CHAPTER 1

INTRODUCTION

1.1 History of Mobile Robotics Research

Mobile robotics research has played a key role in the application of robots in our world.

At Stanford University Nils Nilsson [1] developed the mobile robot SHAKEY in 1969.

This robot possessed a visual range finder, a camera and binary tactile sensors. It was the

first mobile robot to use artificial intelligence to control its actions. Its main objective

was to navigate through highly structured environments such as office buildings. The JPL

Lunar rover [2], developed in the 1970s at the Jet Propulsion Laboratory, was designed

for planetary exploration. Using a TV camera, laser range finder and tactile sensors, the

robot categorized its environment as traversable, not traversable, or unknown. In the late

1970s Hans Moravec [3] developed CART in the Artificial Intelligence laboratory at

Stanford. The robot was capable of following a white line on a road. A television camera

mounted on a rail on the top of CART took pictures from several different angles and

relayed them to a computer, which performed obstacle avoidance by gauging the distance

between CART and obstacles in its path. In 1994, the CMU Robotics Institute's Dante II

[4], a six-legged walking robot, explored the Mt. Spurr volcano in Alaska to sample

volcanic gases. In 1997 NASA's Mars Pathfinder delivered the Sojourner rover [5] to

Mars. Sojourner sent back images of its travels on the distant planet. Also in this same

year, Honda showcased the P3 [6], an extraordinary prototype in humanoid robotic

design. Currently NASA’s rovers SPIRIT and OPPORTUNITY [7] are exploring the

Martian soil for signs of water.


1.2 The Significance of Mobile Robotics Research

Autonomous mobile robotics is a challenging research topic for several reasons. First, changing a mobile robot from a computer on wheels that can sense some physical properties of its environment into an intelligent machine able to identify features, detect patterns, learn from experience, build maps, and navigate requires the simultaneous application of many research disciplines. Engineering and computer science are core disciplines of mobile robotics. When questions of intelligent behavior arise, cognitive science, psychology, and to some extent philosophy also offer hypotheses and answers.

Secondly, autonomous mobile robots are the closest approximation yet of intelligent

agents. For centuries people have been interested in building machines that can think and

make decisions based on the environment around them. To satisfy this goal, mobile robotics research has increasingly incorporated artificial intelligence, enabling machines to mimic living beings.

Thirdly, there are many applications for mobile robots. Transportation, surveillance,

inspection, cleaning and entertainment are just some examples. Mobile robots have been

used for outdoor security applications like the PatrolBot [8] by ActiveMedia. There are

robotic vacuum cleaners like ROOMBA [9] by iRobot and robotic animals like Sony’s

Aibo [10]. There are several mobile robotics applications for environments that are

inaccessible or hostile to humans. For example, mobile robots are used for underwater exploration, bomb disposal, and cleanup of contaminated environments. Finally, there are military applications for mobile robots. PackBot [11], by iRobot, was used in the 2001 war in Afghanistan. DARPA's Micro Air Vehicles (MAVs) [12] and General Dynamics' Sentinel robots [13] have been built for the US Army for deployment on the

battlefield. Before the infantry moves in, these robots can scan the area for information

regarding the terrain, enemy camp positions, and weather conditions.


1.3 Research Objective

The Experimental Unmanned Vehicle (XUV) [14], shown in Figure 1.1, was developed

by General Dynamics Robotic Systems for the US Army. It is a semi-autonomous

unmanned ground vehicle (UGV) that uses high fidelity sensors for reconnaissance,

surveillance, and target acquisition. It is an all-terrain vehicle with a payload of 2500 lbs, capable of road following at 40 mph and of obstacle avoidance. The XUV has a sensor suite

comprised of wheel encoders, an inertial reference unit (IRU) for navigation, GPS, a laser

range finder, mono and stereo cameras, wireless LAN antenna, and PLGR remote

antenna. The current goal of XUV research is to develop autonomous mobility that

enables a UGV to maneuver over rugged terrain as part of a mixed manned and

unmanned vehicle group. As part of this goal, the XUV must be able to maneuver at

speeds higher than traditional UGVs.

Figure 1.1: The XUV

For the XUV to maneuver at high speeds along with manned vehicles, it would be helpful

for the control system to know the type of terrain it was traversing. For example if the

XUV is traveling over ice, the controller should moderate acceleration.


1.4 Justifying Terrain Detection

Once the XUV’s control system has knowledge of the type of surface it is on, it will be

easier to maneuver around obstacles and over uneven terrain. Also, knowledge of the

terrain will allow the vehicle to be maneuvered at higher speeds. To emphasize that

terrain knowledge can improve the XUV’s mobility we now consider how specific terrain

types affect a manned vehicle.

1.4.1 Different Terrains Require Different Driving Techniques

When a vehicle travels over mud (Figure 1.2) there is a tendency for its wheels to get stuck. To prevent immobilization, the driver should drive slowly and resist high acceleration, which will cause wheel spin and "dig into" the mud [15]. On sand, the best

way to maneuver is by steering smoothly with gear changes at high rpm. One should also

avoid soft sand at the base of dunes and gullies and make turns as wide as possible. When

ready to stop, avoid braking and coast down instead.

Figure 1.2: Vehicle on mud.


In the case of a rocky terrain (Figure 1.3) slip occurs between the tires and the top layer

of rocks and between the rocks themselves. Here, sharp turns and quick accelerations may cause the vehicle to skid or even flip over [15].

Figure 1.3: Vehicle on rocky terrain.

Ice and snow (Figure 1.4) [15] drastically reduce the ability of tires to grip the road,

which means that slowing down, speeding up and changing direction all become

hazardous. Moderation is the trick to driving in these conditions. One must drive slowly,

allowing extra room to slow down and stop. To brake without locking the wheels, one

must shift into a low gear sooner, allow the speed to fall, and then use the brake pedal

gently. If a skid does develop, one should ease off the accelerator and turn into the skid.

Figure 1.4: Vehicle on snow.


1.5 Thesis Objective

The goal of this thesis is to use a mobile robot's internal sensors to correctly identify the terrain it is on as gravel, packed dirt, or grass. Our mobile robot platform is the iRobot ATRV-

Jr. The vertical acceleration of the ATRV-Jr, measured by the inertial navigation sensor

(INS), is treated as the terrain signature. This signature is presented to a trained neural

network, which identifies the terrain as gravel, packed dirt, or grass using a pattern

classification method.

1.6 Chapter Summaries

This chapter gave a brief history and overview of current applications of mobile robotics.

It motivated the need for terrain identification on an autonomous ground vehicle. Chapter

2 describes past research done in this area and summarizes our research objective.

Chapter 3 explains why terrain identification is essentially a pattern classification

problem and the types of neural networks that are most suited for this purpose. Chapter 4

presents our terrain identification algorithm in detail. Chapter 5 contains the experimental

results of our algorithm applied to a test bed comprised of multiple terrains. Chapter 6

summarizes the contributions of this research and proposes future work.


CHAPTER 2

LITERATURE REVIEW AND RESEARCH SUMMARY

A literature review led us to two independent lines of research on mobile robot terrain characterization.

2.1 Vision Based Terrain Categorization and Traversability Assessment Index

Howard, Seraji, and Tunstel of the Jet Propulsion Laboratory (JPL) [16-19] used a rule-based fuzzy traversability index, derived from real-time measurements of terrain characteristics retrieved from image data, to quantify the ease with which a mobile robot can travel over a terrain.

In their algorithm the four key terrain characteristics are the terrain roughness, slope,

discontinuity, and hardness. Roughness indicates the coarseness and surface irregularity;

whereas, the slope is the average incline or decline of the ground surface. Discontinuity is

an indicator of terrain features such as cliffs, valleys, and ravines. Hardness is a measure

of the surface hardness that affects the traction of a mobile robot traversing difficult

terrain. The methodology incorporates an intuitive linguistic approach for expressing

terrain characteristics that are robust in terms of imprecision and uncertainty in the terrain

measurements. For example, roughness is expressed as smooth, rough, or rocky, and the

slope is classified as flat, sloped, or steep.


The four terrain characteristics are combined using fuzzy rules to produce a traversability

index that quantifies the ease of travel over the terrain. This index too is expressed as a

linguistic fuzzy value, which categorizes the risk of traversability as high, medium, or low.

While the risk posed to the robot by the terrain is quantified, the JPL algorithm does not

identify the terrain type. Therefore, this algorithm gives more emphasis to obstacle

avoidance than terrain estimation. Also, since the initial terrain parameters are obtained

from vision sensors, changes in illumination may cause incorrect classifications.

2.2 Terrain Parameter Identification via Wheel Terrain Interaction Analysis

In 2002, at the Massachusetts Institute of Technology (MIT), Karl Iagnemma and Steven

Dubowsky [20-21] developed an algorithm for future planetary explorations where rovers

traverse very rough terrain with limited human supervision. They argue that terrain estimation will increase the probability of a successful mission by enabling a rover to

adapt its control and planning strategies.

The goal of the algorithm is to determine the soil shear strength from two key terrain

parameters: cohesion of the soil and internal friction angle. These parameters are

estimated online using a simplified form of classical terramechanics equations. Finally, using Coulomb's equation, these two parameters can be combined to give the shear

strength of the soil. This algorithm has limitations related to sensor noise. Also, like the

JPL algorithm it does not explicitly determine the type of terrain.
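For reference, Coulomb's equation combines the two estimated parameters with the normal stress to give the shear strength; the thesis does not reproduce the formula, so we state the standard terramechanics (Mohr-Coulomb) form:

$$\tau = c + \sigma \tan\phi$$

where $\tau$ is the soil shear strength, $c$ the cohesion, $\sigma$ the normal stress, and $\phi$ the internal friction angle.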


2.3 Research Summary

The objective of this thesis is to develop a terrain identification algorithm that relies on

data gathered by the vehicle’s internal sensors. Our test vehicle, the ATRV-Jr, has a

comprehensive sensor suite comprised of sonar, a laser range finder, two compasses, a

six-axis inertial navigation sensor (INS), global positioning system (GPS), and a Doppler

sensor. Our method of terrain identification can be called “sensing by feeling” because it

qualitatively determines the terrain by “touching” it with the robot’s internal sensors. The

algorithm is demonstrated on data gathered from experiments on iRobot’s ATRV-Jr. The

algorithm is written in C++ and integrated into the existing iRobot Mobility suite, which is comprised of CORBA, Java, and C++ classes.

The vertical acceleration of the ATRV-Jr, which is representative of the vibration it

undergoes, is measured using the INS. A 1024-point fast Fourier transform of the

acceleration signal is treated as a terrain signature. Once obtained, the terrain signature is

fed to a neural network, which classifies it as gravel, packed dirt, or grass. Two different

types of neural networks were tested. One was trained using backpropagation [22]. The other, a probabilistic neural network, uses fast one-pass training [23]. The performance of the two neural networks has been compared.

This terrain classification algorithm relies entirely on the vibration that the mobile robot

undergoes while traversing the terrain. Initially, the idea was to incorporate both wheel

slip and vibration measurements into the algorithm. However, due to compatibility issues

between the data acquisition board for the wheel encoders and the robot’s operating

system, the wheel slip data was inaccessible. We decided to proceed with vibration data,

and plan to incorporate slip at a later time.


CHAPTER 3

TERRAIN IDENTIFICATION USING NEURAL NETWORKS

3.1 Terrain Identification by Pattern Classification

Pattern classification is concerned with making decisions from complex patterns of data.

Generally, predefined pattern classes are presented, and the task is to classify a future

pattern as one of the classes. The process is called classification or supervised pattern

recognition and can be accomplished using a neural network. Our goal is to use the

pattern classification ability of a neural network to perform terrain identification.

The neural network will first have to be trained on all the terrain signature patterns that it

should be able to identify. After the training has been completed, a signature will be

presented to the neural network, which will classify it into one of the classes of terrain for

which the network was trained.

There are several types of neural network architectures and training techniques that can

be used to make a neural network capable of pattern identification. The most popular

neural networks for pattern identification are backpropagation networks and probabilistic

neural networks. A review of these networks is provided below.

3.2 Biological Neural Networks

Artificial neural networks are composed of many simple elements operating in parallel.

These elements are inspired by the biological nervous system. In the brain specialized

cells called neurons perform information processing.


The neuron (Figure 3.1) [24] is the fundamental functional unit of all nervous system

tissue. It is comprised of a soma, dendrites, axon and synapse. The soma is the cell body

that contains the cell nucleus. Dendrites are composed of a number of fibers and serve as

inputs to the neuron. The axon is a single long fiber with many branches that carry the

output from the neuron. The synapse is the junction of axon and dendrites. Typically each

neuron forms synapses with ten to a hundred thousand other neurons.

Figure 3.1: Schematic of a neuron in the human brain.

The brain works through electro-chemical reactions. Chemical substances are released

from the synapses and enter the dendrites, raising or lowering the electrical potential of

the cell body. When the potential reaches a fixed value called the threshold, an electrical

pulse or action potential is sent down the axon. The pulse spreads out along the branches

of the axon, eventually reaching synapses, and releases transmitters into the bodies of

other cells. Typically there are two types of synapses, excitory synapses that increase

potential and inhibitory synapses that decrease potential. Synaptic connections exhibit

long-term changes in the strengths of connections in response to the pattern of

stimulation. The strength changes are the basis of learning exhibited by the brain.


3.3 Artificial Neural Networks

Artificial neural networks are composed of a number of nodes or units connected by

links. Table 3.1 [25] compares the human brain with a digital computer. Each

link has a numeric weight and bias associated with it. Weights and biases are primary

means of long-term storage in a neural network, and learning usually takes place by

updating them. Each node has a set of input links from other nodes, a set of output links

to other nodes, a current activation level and a means of computing the activation level at

the next time step given its input weights and biases.

Table 3.1: Comparison of the human brain with digital computers.

                          Computer                          Human brain
Computational units       1 CPU, 10^5 gates                 10^11 neurons
Storage units             10^9 bits RAM, 10^10 bits disk    10^11 neurons, 10^14 synapses
Cycle time                10^-8 seconds                     10^-3 seconds
Bandwidth                 10^9 bits/sec                     10^14 bits/sec
Neuron updates/second     10^5                              10^14


3.3.1 Neurons

A simple artificial neuron is shown in Figure 3.2 [22]. A neuron with a single scalar

input and no bias appears on the left. The scalar input p is transmitted through a

connection that multiplies its strength by the scalar weight w, to form the product wp,

again a scalar. Here the weighted input wp is the only argument of the transfer function f;

this produces the scalar output a. The neuron on the right has a scalar bias b. The bias is simply added to the product wp at the summing junction; equivalently, it shifts the function f to the left by an amount b.

Figure 3.2: Simple neuron model. The neuron on the left is depicted without a bias; the one on the right has a bias.

The bias is like a weight, except that it has a constant input of 1. The transfer function net

input n, again a scalar, is the sum of the weighted input wp and the bias b. This sum is the

argument of the transfer function f. Note that w and b are both adjustable scalar

parameters of the neuron. The central idea of neural networks is that such parameters can

be adjusted so that the network exhibits some desired behavior. Thus, we can train the

network to perform a particular function by adjusting the weight or bias parameters. It is

also possible for the network to adjust itself through feedback.
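As a concrete illustration of the model just described, the following minimal MATLAB sketch computes the output of the biased neuron of Figure 3.2; the numerical values of p, w, and b are illustrative, not taken from the thesis.

```matlab
% Minimal sketch of the scalar neuron in Figure 3.2 (illustrative values).
p = 0.7;              % scalar input
w = 1.5;              % adjustable scalar weight
b = -0.3;             % adjustable scalar bias (constant input of 1)
n = w*p + b;          % net input: weighted input plus bias
a = 1/(1 + exp(-n))   % output of a log-sigmoid transfer function, a = f(wp + b)
```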


3.3.2 Transfer Functions

The neural network’s transfer function typically determines the task it can perform. Three

of the most commonly used functions are shown in Figure 3.3. The hard-limit transfer

function (Figure 3.3) limits the output of the neuron to either 0 or 1 [22]. The output is 0

if the net input argument n is less than 0, and 1 if n is greater than or equal to 0. The

linear transfer function (Figure 3.3) is used as a linear approximator in linear filters [22].

The sigmoid transfer function (Figure 3.3) takes the input, which may have any value

between plus and minus infinity, and reduces the output to the range 0 to 1 [22]. This

transfer function is commonly used in back propagation networks, in part because it is

differentiable.

Figure 3.3: Transfer functions: (a) hard-limit, (b) linear, (c) log-sigmoid.
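The three functions can be sketched in MATLAB as follows; the expressions below are equivalent to the toolbox functions hardlim, purelin, and logsig.

```matlab
% Sketch of the transfer functions of Figure 3.3 over a range of net inputs.
n = -5:0.1:5;
a_hard = double(n >= 0);     % hard-limit: 0 for n < 0, 1 for n >= 0
a_lin  = n;                  % linear: output equals the net input
a_sig  = 1./(1 + exp(-n));   % log-sigmoid: squashes any input into (0, 1)
plot(n, a_hard, n, a_lin, n, a_sig);
legend('hard-limit', 'linear', 'log-sigmoid');
```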


3.4 Network Architectures

By interconnecting multiple neurons, the true computing power of a neural network can

be realized. The most common structure of connecting neurons is by layers. The simplest

form of a layered network is shown in Figure 3.4 [22]. The shaded nodes on the left are

the input layer. The input layer neurons pass and distribute the inputs but perform no

computation. Each of the inputs $x_1, x_2, x_3, \ldots, x_n$ is connected to every neuron in the output layer through weighted connections. Since every output value $y_1, y_2, y_3, \ldots, y_n$ is calculated from the same set of input values, each output is a function of the connection weights.

Figure 3.4: Single-layer feed forward networks depicting the inputs, input layer and

output layer.


To achieve a higher level of computational capabilities, a more complex neural network

structure is required. Figure 3.5 shows the multilayer neural network, which distinguishes

itself from the single-layer network by having one or more intermediate or hidden layers

[22]. In this multilayer structure, the input nodes pass the information to the nodes in the

first hidden layer, then the outputs from the first hidden layer are passed to the next layer,

and so on. Multilayer networks can be viewed as cascading groups of single-layer

networks. The increased computational power comes from combining many single-layer networks into one multilayer network.

Figure 3.5: Multi-layer feed forward networks.


3.5 Network Training

Commonly neural networks are adjusted, or trained, so that a particular input leads to a

specific target output. An example of this training is shown below in Figure 3.6 [22].

Here the network is adjusted, based on a comparison of the output and the target, until the

network output matches the target. This is called supervised learning and in practice

many input/target pairs are used.

Figure 3.6: Training of a neural network.

3.5.1 Training by Backpropagation

Backpropagation is the generalization of the Widrow-Hoff [26] learning rule to multiple

layer networks and non-linear differentiable transfer functions. Typically a neural

network trained by backpropagation consists of biases, at least one sigmoid neuron layer

and a linear output layer. Such a network is capable of approximating any mathematical

function. The backpropagation learning rules are used to adjust the weights and biases of

networks so that the sum of squared error for the network is minimized. The error

minimization is accomplished by changing the values of the network weights and biases

continuously in the direction of steepest descent. This method, called the gradient descent procedure, is described in more detail below.


The error vector (E) is defined as the difference between the target vector (T) and the neuron output (A):

$$E = T - A \tag{3.1}$$

Derivatives of error (called delta vectors) are calculated from the network’s output layer

and then back propagated through the network until the delta vectors are available for

each hidden layer. The delta vectors for the output layer are calculated from the network

error vectors. The delta vectors for the hidden layers are calculated from the next layer’s

delta vectors. From this backpropagation of delta vectors the learning rule gets its name.

The rule for updating the weights and biases is given below.

$$\Delta W(i,j) = lr \, D(i) \, P(j), \qquad \Delta B(i) = lr \, D(i) \tag{3.2}$$

where $W$ = weights, $B$ = biases, $D$ = the layer's delta vector, $P$ = the layer's input vector, and $lr$ = learning rate.

Training is composed of four phases. In the presentation phase, the input training vector

is fed into the neural network and the output vector is computed by the neural network. In

the check phase, the network error vector and the sum of squared error for the input

vector are calculated. Training is discontinued if the sum of the squared errors is less than

the error goal or if the specified maximum number of cycles or epochs has been reached.

Otherwise the neural network moves into the backpropagation phase. Here, the delta

vector for the output layer is calculated using the target vector. Then the delta vectors are

back propagated to the previous layers. Finally, in the learning phase, each layer's new weight matrix and new bias are computed. The neural network then returns to the

presentation phase and the steps are repeated.
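For reference, when the layers use the log-sigmoid transfer function and the sum-of-squared-error criterion, the delta vectors described above take the standard textbook form (our formulation, not reproduced from the thesis):

$$D_{\text{out}} = (T - A) \odot A \odot (1 - A), \qquad D_{\text{hidden}} = \left(W^{T} D_{\text{next}}\right) \odot A_{h} \odot (1 - A_{h})$$

where $\odot$ denotes element-wise multiplication and $A \odot (1 - A)$ is the derivative of the log-sigmoid evaluated at its output.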

The backpropagation training method was incorporated in Matlab. The network was

comprised of 2 layers. The first layer was the hidden layer comprised of sixty sigmoid

neurons. The output layer was comprised of linear neurons. The error goal was kept at 0.01

and the maximum number of epochs up to which training was continued was 1000. The


Matlab function used for the training was "traingdx", which uses both momentum and an adaptive learning rate so that the network is less likely to get stuck in local minima. The

Matlab code for the neural network trained by backpropagation has been provided in

APPENDIX E.
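For concreteness, the configuration described above can be written against the legacy Neural Network Toolbox API of that era roughly as follows. This is a hedged sketch: P (FFT signatures, one per column), T (target class vectors), and the output layer size of three (one neuron per terrain class) are our assumptions, not details taken from Appendix E.

```matlab
% Hedged sketch of the backpropagation training setup (legacy toolbox API).
% P, T, and the output layer size of 3 are assumptions, not from Appendix E.
net = newff(minmax(P), [60 3], {'logsig', 'purelin'}, 'traingdx');
net.trainParam.goal   = 0.01;    % sum-squared error goal
net.trainParam.epochs = 1000;    % maximum number of training epochs
net = train(net, P, T);          % supervised training on input/target pairs
Y = sim(net, P);                 % network outputs after training
```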

3.5.2 Probabilistic Neural Network

The probabilistic neural network is a special type of neural network that is generally used for pattern classification. The overview and theory behind this type of

neural network are given in Sections 3.5.2.1 and 3.5.2.2 respectively.

3.5.2.1 Overview of the Probabilistic Neural Network (PNN)

When a PNN is used as a pattern classifier, its goal is to classify an input pattern into one

of the predefined patterns that were previously presented and saved. The NN is first

presented with a sample of each pattern that it will be expected to recognize. After these

patterns are saved the PNN is presented with a pattern whose type is unknown. The PNN

calculates the probability that the unknown input pattern is of the same category as a

previously saved pattern. The probabilities are calculated according to the Gaussian

distributions of the input pattern centered about the previously saved patterns. The pattern

having the highest probability is selected as the category to which the input pattern

belongs. It is important to note that selecting a category based on probability is a

subjective task. There could be a high probability that an input pattern belongs to a saved

pattern even when the two would actually be considered dissimilar. For example,

consider one trying to classify a birdcall, having not saved a call pattern from that

particular bird. The probabilities that result may demonstrate how similar the input call is

to the call of other birds; however, it would be incorrect to classify that call as coming

from a bird whose call generated the highest probability.

In the example shown in Figure 3.7, a PNN has been trained to recognize three different

patterns A, B and C. When the PNN is presented with the input pattern, the probability is

highest that it belongs to category B, so the input signal is classified as pattern B. The

theory behind this PNN classification example is given in Section 3.5.2.2.


Figure 3.7: The organization of a PNN. An unknown input pattern is compared against the saved patterns A, B, and C.

3.5.2.2 Probabilistic Neural Network

The probabilistic neural network is a special type of neural network generally used for

pattern classification. The function of any pattern classifier is to categorize each input

vector x into one of the predefined classes $c_i$, $i = 1, 2, \ldots, n$, where n is the number of possible classes to which x may belong. The probabilistic neural network (PNN) is based on the Bayesian classifier. This classifier states that the probability of x being in class $c_i$ is given by

$$P(c_i/x) = \frac{P(x/c_i)\,P(c_i)}{\sum_{j=1}^{n} P(x/c_j)\,P(c_j)} \tag{3.3}$$

where $P(x/c_i)$ is the conditional probability density function of x given class $c_i$, and $P(c_j)$ is the probability of drawing data from class $c_j$. Vector x belongs to class $c_i$ if

$$P(c_i/x) > P(c_j/x), \quad \forall\, j = 1, 2, \ldots, n,\; j \neq i.$$


The Bayesian classifier assumes that the probability density function of the population

from which data is taken is known a priori. This assumption is the major limitation of

implementing the Bayesian classifier. The PNN eliminates this limitation by using a

training set that provides the desired statistical information for implementing the

Bayesian classifier. The desired probability density function (pdf) of the class is

approximated by the PNN. In particular, the PNN approximates the probability that vector x belongs to a particular class $c_i$ (i.e., it estimates the likelihood of an input feature pattern

being part of a learned category) as a sum of weighted Gaussian distributions centered at

each training sample given by,

$$P(c_i/x) = \frac{1}{(2\pi)^{N/2}\,\sigma^{N}\,n_{t_i}} \sum_{j=1}^{n_{t_i}} \exp\!\left[-\frac{(x - x_j^i)^{T}(x - x_j^i)}{2\sigma^{2}}\right] \tag{3.4}$$

where $x_j^i$ is the j-th training vector for patterns in class i, $\sigma$ is the smoothing operator, N is the dimension of the input vector, and $n_{t_i}$ is the number of training patterns in class i. For

non-linear decision boundaries, the smoothing operator σ needs to be as small as

possible. The structure of the PNN is shown in Figure 3.8. It is comprised of a 4-layer

structure consisting of the input layer, the pattern layer, the summation layer and the

output layer.

Figure 3.8: PNN structure


The input vector x is defined as $x = [x_1, x_2, x_3, \ldots, x_p]^{T}$. The input layer feeds the input vector to the pattern layer. The pattern layer calculates the pdf (using equation 3.4) corresponding to each training pattern. The summation layer computes the probability $f_i(x)$ that the input vector x is in each of the classes i represented by the patterns in the pattern layer. The output layer picks the class that has the highest probability in the summation layer; the input is then classified as belonging to this class.

One of the issues with a PNN is selecting the correct value for the smoothing operator.

The method of cross-validation is generally used for proper selection of this operator. In

practice, the first smoothing value is picked randomly. If the resulting pattern classification does not produce the desired results, the smoothing value is modified slightly and the neural network is re-tested. This procedure is continued until the PNN produces the desired results in terms of performance and the fewest errors.

The advantage of this network is that training is instantaneous. Hence it is preferred for real-time applications: as soon as one pattern representing each category has been observed, the network can begin to categorize every future input pattern fed into the PNN into one of the pre-defined categories. A practical advantage of this network is that, unlike many other neural networks, the PNN operates completely in parallel, without the need for feedback from the individual neurons back to the inputs. The shape of the decision boundaries can be made simple or complex by varying the smoothing parameter σ. The decision surfaces approach the Bayes optimal boundary, which is optimal with respect to the misclassification rate. The network tolerates erroneous samples and works with sparse samples. The

Matlab code for the PNN algorithm is given in APPENDIX D. It implements the pdf

based on Gaussian distributions centered at each training sample and classifies an input

signal into one of the predefined patterns only if the pdf value is higher than 0.4.
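A minimal MATLAB sketch of this classification rule is given below, assuming trainX{i} holds the training vectors for class i (one column per pattern) and x is the input signature; the variable names are ours, not those of the Appendix D code.

```matlab
% Minimal sketch of PNN classification via equation (3.4).
% trainX{i}: training vectors for class i (columns); x: input column vector.
sigma = 0.1;                                   % smoothing operator (set by cross-validation)
f = zeros(1, numel(trainX));
for i = 1:numel(trainX)
    Xi = trainX{i};
    [N, nt] = size(Xi);                        % input dimension, patterns in class i
    d2 = sum((Xi - repmat(x, 1, nt)).^2, 1);   % squared distance to each training sample
    f(i) = sum(exp(-d2 ./ (2*sigma^2))) / ((2*pi)^(N/2) * sigma^N * nt);
end
[pmax, class] = max(f);                        % class with the highest pdf value
if pmax < 0.4, class = 0; end                  % reject inputs below the 0.4 threshold
```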


CHAPTER 4

PROPOSED ALGORITHM

4.1 Purpose of the Research

The purpose of this research is to develop an algorithm for detecting the terrain being

traversed by a mobile robot. While past algorithms employed vision to characterize the

terrain lying ahead, we shall use the robot’s vibration to identify the terrain that is

currently being covered.

This algorithm has several advantages. It is capable of real-time terrain identification on a

mobile robot using only the internal sensors that would be part of any basic sensor suite;

in our case we use the inertial navigation sensor (INS) and in the future the wheel

encoders. Typically, algorithms based on perception sensors have several limitations. Camera-based algorithms are subject to illumination errors and are generally computationally expensive. Algorithms based on laser range finders have no means to quantify either slip or vibration. Our algorithm, on the other hand, was developed with the intention of evaluating the terrain in terms of wheel slip and vibration, which are two key factors in determining how a vehicle needs to be driven. These measurements of slip

and vibrations from the internal sensors can be used as inputs to a control algorithm.

Constraints on vibration and slip are important control objectives.


4.2 Terrain Detection

Our terrain detection algorithm relies on the vertical axis (Z-axis) acceleration data

gathered by the vehicle’s INS.

Figure 4.1: Mobile robot ATRV-Jr

For a terrain like gravel, the vibration data will be of a highly fluctuating nature with a

large number of sharp peaks. In comparison, the vibrations experienced on a smooth surface like packed dirt will fluctuate less, with fewer and smaller peaks.

4.3 Algorithm Overview

Data is read from the INS at the rate of 100 Hertz and a fast Fourier transform (FFT) of

the data is performed to determine the frequency content of the signal. The frequency

information is the terrain signature, which is fed to the probabilistic neural network

(PNN). A schematic of the terrain detection process flow is shown in Figure 4.2.


Figure 4.2: Terrain identification by PNN (Z-axis vibration → FFT → PNN → terrain classification).
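A minimal MATLAB sketch of the signature computation in Figure 4.2 follows; accZ is assumed to hold the raw Z-axis INS readings, and keeping only the one-sided spectrum is our simplification.

```matlab
% Sketch of the terrain signature: 1024-point FFT of Z-axis acceleration.
Fs = 100;                       % INS sampling rate in Hz
block = accZ(1:1024);           % one block of vertical acceleration samples
S = abs(fft(block, 1024));      % magnitude spectrum used as the terrain signature
S = S(1:512);                   % one-sided spectrum, 0 to Fs/2 (our simplification)
```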

4.4 Development of the Algorithm

First, the mobile robot was simulated over a simple terrain model comprised of several bumps. This served the purpose of problem initialization: the objective was to use vibration alone for terrain identification and to get an idea of how the FFT will look for terrains in general. Then we moved on to a more refined simulation

where the mobile robot was simulated over asphalt, grass, gravel and packed dirt. The

purpose was to check the feasibility of using wheel slip for identifying terrain. We then

went on to a simple real-world experiment to reconfirm the crucial role played by

vibration and wheel slip in identifying the terrain. We proceeded to check the validity of

the INS readings. After confirming the INS readings, we moved into the final phase

of gathering experimental data of the ATRV-Jr on different terrains and validating our

algorithm.



4.4.1 Problem Initialization

This research began with a simple simulation in ADAMS, which is a dynamic simulation

program by MSC Software [34]. A mobile robot (Pioneer II, model shown in Figure 4.3)

was simulated at different speeds over a flat surface having several bumps. The goal was

to determine how bump heights and spacing affect the Z-axis accelerations of the model

driving over them. The bumps were triangular in shape. The height of the bumps was 20

mm. The distance between each bump was 10,000 mm. The acceleration data in the Z

direction was collected and the FFT of this data was taken.

Figure 4.3: Model of Pioneer II in ADAMS.

When the mobile robot was simulated at different speeds over the same surface, different FFT graphs were produced. From these simulations we proposed that the FFT of

vibration data for a particular speed on one terrain will be similar. For the same speed, the

FFTs from different terrains will differ. Hence the FFT of vehicle acceleration can be

treated as a terrain signature. However, the algorithm will be dependent on the vehicle

speed.


4.4.2 ADAMS Simulation

We wanted to see how certain surfaces may induce different amounts of wheel slip

during vehicle acceleration. We used ADAMS to create a simulation where four surfaces

(asphalt, grass, dirt, and gravel) were characterized by specific coefficients of static and

kinetic friction. The friction coefficients for the four surfaces are shown in Table 4.1 [14].

Table 4.1: Values for friction coefficients on different surfaces.

Surface    Coefficient of static friction    Coefficient of kinetic friction
Asphalt    1.0                               0.65
Grass      0.5                               0.3
Dirt       0.6                               0.4
Gravel     0.8                               0.4

The wheel slip for the four terrains was calculated from the linear velocity of the wheel center and the angular speed of the wheel. The robot model was accelerated from rest to a velocity of 1.25 m/sec in 1.2 seconds, and the recorded data was used to calculate the longitudinal wheel slip. Shown in Figure 4.4 is a plot of the longitudinal wheel slip versus time on the simulated surfaces.
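The thesis does not reproduce the slip formula itself; for reference, the standard definition of longitudinal slip during acceleration, in terms of the wheel radius $r$, angular speed $\omega$, and linear velocity $v$ of the wheel center, is

$$\text{slip} = \frac{r\omega - v}{r\omega} \times 100\%$$

which is near 100% when the wheels spin at start-up and approaches 0% at constant speed, consistent with Figure 4.4.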


Figure 4.4: ADAMS/View simulation results for percentage wheel slip on different

terrains.

The mobile robot starts at a high acceleration from rest. Gradually, as the final speed is

achieved, the rate of linear acceleration decreases and hence the corresponding

percentage slip values decrease until they are near 0 at constant vehicle speed. But what

should be noted is that although the slip graphs start at near 100% and move towards 0%,

the wheel slip profiles for the four different surfaces are different. The result implies that

mobile robot wheel slip can be used to distinguish one terrain from another. As noted

earlier, we were not able to include wheel slip measurements in our algorithm, but we

plan to add it at a later date.


4.4.3 Road Test

A Sports Utility Vehicle (SUV) was driven over different terrains at low to moderate

speeds to manually confirm that slip and vibration can be used for terrain classification.

The observations on vibrations and noise were made by the passenger. He had his eyes

closed while making his observations. The driver judged the wheel slip. He made sharp

turns and made a note of how much the vehicle skidded on each surface. Then the driver

and passenger interchanged their positions and made the observations again. A synopsis

of the observations made is presented in Table 4.2. The test strengthened our belief that

slip and vibration can be used to distinguish one terrain from another.

Table 4.2: Observations made when a SUV was driven over different terrains.

Surface    Slip            Vibration       Noise
Asphalt    Negligible      Negligible      Negligible
Gravel     Considerable    Considerable    Distinct
Mud        Very high       Low             Intermittent

Please note that we do not feed noise into the algorithm, as the XUV will be deployed on the battlefield. There, amidst the sound of guns, cannons, other explosions, and ambient noise, the vibration sound recorded by a microphone would be extremely difficult to discern or completely lost.


4.4.4 INS Check

Because our algorithm uses acceleration data from the INS, we conducted an experiment

to test the validity of that sensor’s readings. The robot was made to move in a circle of

radius 3 feet while the acceleration readings in the x and y directions were recorded. A

fourth-order Butterworth high-pass filter was incorporated in Matlab/Simulink (APPENDIX C) to filter out the low-frequency content of the readings. The frequency response of the filter is shown in Figure 4.5. The transfer function of the filter is

$$H(s) = \frac{s^4}{s^4 + 7.839\,s^3 + 30.73\,s^2 + 70.55\,s + 81}.$$

Figure 4.5: Frequency response of the Butterworth filter.


This filtering is necessary because the INS accelerometer is incapable of reading low-frequency data; hence that portion of the data is corrupted and unreliable. The filtered acceleration data

is then integrated twice using MATLAB/Simulink to get the corresponding

displacements. The displacements in the x and y directions are then plotted using Matlab.

A graph of the displacement is shown in Figure 4.6.

Figure 4.6: Graph obtained by integrating acceleration data twice to get displacement.
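A hedged MATLAB sketch of this check is given below; the cutoff of 3 rad/s is inferred from the transfer function above, and acc, t, and Fs (the recorded acceleration, time vector, and sample rate) are assumed to exist in the workspace.

```matlab
% Hedged sketch of the INS check: Butterworth high-pass, then double integration.
wc = 3;                                    % cutoff in rad/s (inferred from H(s) above)
[b, a] = butter(4, wc/(pi*Fs), 'high');    % discrete-time fourth-order high-pass
accF = filtfilt(b, a, acc);                % zero-phase filtering of the acceleration
vel  = cumtrapz(t, accF);                  % first integration: velocity
pos  = cumtrapz(t, vel);                   % second integration: displacement
plot(pos)
```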

As can be observed from the displacement graph, the path is approximately circular. This

confirms that the acceleration readings from the INS are correct. The graph is not a perfect circle for several reasons. One, skid steering causes the robot to move in small jerks instead of a smooth curve. Two, the robot was run in a small space in our lab, and the wheel encoders and INS may lose accuracy for very small displacements. Had the robot traced

a circle of much larger radius, the displacement plot would have been smoother. In the

next chapter we discuss experimental results for the mobile robot ATRV-Jr driving over

different terrains.


CHAPTER 5

EXPERIMENTAL RESULTS

5.1 Test Bed

A test bed (Figures 5.1-5.3) was built in a field at the College of Engineering to facilitate

the gathering of acceleration data on various terrains. The test bed is comprised of gravel,

packed dirt, sand and grass. The width of the test bed is 1.5 meters, and the length of each

stretch of terrain is about 10 meters.

(Layout: Grass | Sand | Packed Dirt | Gravel | Grass; total length 30 meters)

Figure 5.1: Test bed schematic.

The robot can travel at a maximum speed of 1 m/sec, which allows us to gather at least ten

seconds of data on each terrain type.


Figure 5.2: Test bed. Figure 5.3: ATRV-Jr on gravel.

There are two ways to drive the ATRV-Jr: with a joystick that plugs into a port on the robot, or with a computer program loaded into the robot's mobility suite, mentioned in Section 2.3 of this thesis. It was not practical to use the joystick for test bed experimentation because the attached cord would require the user to run along with the robot. We therefore chose to load a computer program into the mobility suite, though this required connecting to the robot by wireless Ethernet.

5.2 Wireless communication for the ATRV-Jr

The ATRV-Jr is provided with a standalone radio called a station adapter (SA) that plugs

into the robot’s Ethernet port. The SA communicates with a standalone radio called the

access point (AP) that is usually connected to a computer local area network (LAN). The

wireless communication has an indoor range of 150 meters and an outdoor range of 300

meters.

For outdoor operation, a laptop is used to send commands to the robot. Since the SA and the AP communicate in a specific frequency band (2.4 GHz), which differs from that of our other wireless networks, a compatible PCMCIA card was purchased for the laptop. The PCMCIA card (model no. SA-PCD Pro.11) is manufactured by Alvarion and follows the IEEE 802.11 series protocol. During test bed experiments, the card transmits commands wirelessly to the AP, which wirelessly relays the signal to the SA plugged into the robot. A schematic of the setup is shown in Figure 5.4.


Since the AP requires AC power, which is not available near the test bed, a portable 375-watt inverter (a TRIPPLITE model PV375) was used to convert DC (battery) power from a car lighter socket into 110 V AC.

Figure 5.4: Wireless connectivity outdoors (no access to LAN).

For indoor operation, the user types commands on a computer connected to the same LAN as the AP. The command is routed to the AP, which transmits it wirelessly to the SA. The AP should be elevated about a meter above the ATRV-Jr's running surface and should have a clear line of sight to the ATRV-Jr. Figure 5.5 is a schematic of the indoor setup.

Figure 5.5: Wireless connectivity setup in the lab.



5.3 Mobility Programming

The robot travels over the test track at various speeds while the INS acceleration data is collected using a C++ program (APPENDIX A). The program has been incorporated into the robot's existing mobility suite. The suite runs on Linux (distribution version 6.2, kernel version 2.4.22) and is comprised of C++ and JAVA classes that communicate through CORBA. The program collects the acceleration data at a rate of 100 Hz. After the data is collected, the program writes it to a text file, which is later analyzed in Matlab version 6.5.
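As a minimal illustration of this hand-off (assumed variable names; the actual analysis scripts appear in APPENDICES B-D), the logged text file can be read into Matlab and the vertical acceleration extracted as follows:

data = load('data.dat');   % whitespace-delimited file written by the C++ logger
az = data(:,6);            % z-axis acceleration is column 6 (see the column order in APPENDIX A)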

5.4 Experimental Procedure

The ATRV-Jr traveled over gravel, packed dirt and grass at speeds of 0.2 m/sec, 0.4 m/sec, 0.6 m/sec, and 0.8 m/sec. The FFT was taken of 10 seconds of acceleration data sampled at 100 Hz, so each acceleration vector had 1000 elements. Since the number of points (N) in the FFT had to be equal to or greater than the number of data points, we chose N = 1024. Also, because the sampling frequency (f_s) was 100 Hz, the signal's spectrum was entirely below the Nyquist frequency (f_s/2), or 50 Hz. After the FFTs were taken we proceeded to pattern classification.
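A minimal sketch of this FFT step (assuming az holds one 10-second acceleration vector; the full plotting code is in APPENDIX B):

fs = 100; N = 1024;
Y = abs(fft(az,N));        % zero-padded 1024-point magnitude spectrum
f = fs*(0:N/2-1)/N;        % frequency axis from 0 Hz up to the Nyquist frequency
plot(f,Y(1:N/2))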

The acceleration data had to be classified as gravel, packed dirt, or grass using a trained probabilistic neural network (PNN). A minimum of twelve training vectors was required (one per surface per speed) since the network needed to classify three terrains at four different speeds. We found that using two vectors per surface per speed (i.e., 24 vectors in total) provided good results. It was noted in Section 3.5.2.2 that the PNN's smoothing parameter is adjusted to improve its classification results. We found that a smoothing factor of 0.1 produces the best results for terrain classification using the PNN.
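For reference, an equivalent PNN can also be built directly with the Neural Network Toolbox (a hedged sketch with assumed variable names; the classifier actually used in this work is the pnn1 function in APPENDIX D):

% x: training FFT vectors, one 55-element pattern per row; y: column of class labels 1-3
net = newpnn(x', ind2vec(y'), 0.1);   % smoothing parameter (spread) of 0.1
class = vec2ind(sim(net, xtest'));    % classify a new 55-element FFT vector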


5.5 Experimental Results

The resulting Z-axis acceleration data for the three terrain types is shown in Figures 5.6-5.8 below. While these plots differ somewhat in magnitude, they otherwise look quite similar.

Figure 5.6: Plot of Z-axis acceleration on gravel.

Figure 5.7: Plot of Z-axis acceleration on grass.


Figure 5.8: Plot of Z-axis acceleration on packed dirt.

The FFTs of the three sets of terrains have similar trends for the entire frequency

range of 0-50 Hz. The Matlab code for generating the FFTs is given in APPENDIX

B. Shown below in Figure 5.9 are the FFTs for the three different terrains.

(a) (b) (c)

Figure 5.9: FFTs of all three terrains.
(a) gravel, (b) packed dirt, (c) grass


The low-frequency data in the FFT analysis is corrupted and hence ignored. There are sufficient differences in the FFTs in the frequency range of 10-20 Hz for the three terrains to be distinguished from each other. It is also desirable that the patterns identified by the PNN be of reasonable length, as very long pattern vectors make the algorithm computationally intensive. Hence only the FFT data in the frequency range 10-20 Hz was used for terrain identification. The FFTs in this range for the three terrains are shown in Figures 5.10-5.12.
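A sketch of this band selection (the index arithmetic here is assumed from fs = 100 Hz and N = 1024; the code in APPENDIX D hard-codes its own index range):

lo = round(10*N/fs) + 1;   % FFT bin nearest 10 Hz
hi = round(20*N/fs) + 1;   % FFT bin nearest 20 Hz
pattern = Y(lo:hi);        % band-limited magnitude spectrum fed to the PNN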

Figure 5.10 shows the FFT for data recorded while the robot was traveling over

gravel at speeds of 0.8, 0.6, 0.4 and 0.2 m/sec. Note that the maximum magnitude

occurs in the frequency range 10-12 Hz. The frequency data is generally distributed in

three lumps. These characteristics were specific to gravel.

(a) (b)

(c) (d)

Figure 5.10: FFT of gravel acceleration data.

(a) 0.8 m/sec, (b) 0.6 m/sec, (c) 0.4 m/sec, (d) 0.2 m/sec


The FFT for packed dirt (Figure 5.11) at the same 4 speed levels is uniformly spread in

the frequency range of 10-20 Hz. The magnitude of this signal is small. There is a

considerable amount of fluctuation, which is represented by a large number of spikes in

the signal.

(a) (b)

(c) (d)

Figure 5.11: FFT of packed dirt acceleration data.

(a) 0.8 m/sec, (b) 0.6 m/sec, (c) 0.4 m/sec, (d) 0.2 m/sec


The FFT for grass (Figure 5.12) at the same speed levels has a uniformly high magnitude in the frequency range 10-20 Hz. This signal has comparatively fewer fluctuations in its frequency content.

(a) (b)

(c) (d)

Figure 5.12: FFT of grass acceleration data.

(a) 0.8 m/sec, (b) 0.6 m/sec, (c) 0.4 m/sec, (d) 0.2 m/sec

Later we realized that the terrain signature, which we had assumed to be representative of grass, is actually the signature of the undulating soil lying beneath the grass. The soil below grass is very uneven, with random slopes. This causes the ATRV-Jr to vibrate randomly, which explains the uniformly high magnitude over the entire frequency range of

10-20 Hz. We believe that the future addition of wheel slip data to the algorithm will

enable us to correctly identify grass as opposed to the surface below the grass.

The FFT of the acceleration data for the frequency spectrum of 10-20 Hz is fed into the probabilistic neural network (PNN). The PNN (APPENDIX D) performs pattern classification on the 55-element FFT vector to classify the terrain as gravel, packed dirt, or grass. The results of terrain classification are shown in the form of bar charts.

The PNN correctly identified gravel (Figure 5.13). The probabilities of this classification were 71.21% gravel, 3.47% packed dirt and 25.32% grass.

Figure 5.13: Bar plot for gravel classification.

The PNN also correctly identified packed dirt. The probabilities of this classification were 2.17% gravel, 74.97% packed dirt and 22.85% grass, as shown in Figure 5.14.

Figure 5.14: Bar plot for packed dirt classification.


Figure 5.15 demonstrates that the PNN correctly identified grass. The classification results obtained from the PNN are 20.17% gravel, 12.5% packed dirt, and 67.32% grass.

Figure 5.15: Bar plot for grass classification.

All the above results show the PNN classification for individual vectors of gravel, packed dirt, and grass. The next set of results shows batch classification by the PNN.

The PNN is first presented with a batch of 10 gravel FFT vectors. Figure 5.16 shows the PNN's classification probability of each individual vector as gravel, grass, and packed dirt. As can be seen, in all 10 cases the PNN assigned the highest probability to gravel; hence gravel was identified correctly in all 10 test cases. Similarly, Figures 5.17 and 5.18 show the classification probabilities when the PNN is presented with 10 FFT vectors of packed dirt and grass, respectively. In each case the PNN correctly identified the terrain.


Figure 5.16: Results of a PNN classifying a set of ten gravel data expressed as a bar chart.

Figure 5.17: Results of a PNN classifying a set of packed dirt data expressed as a bar

chart.


Figure 5.18: Results of a PNN classifying a set of grass data expressed as a bar chart.

The ATRV-Jr was driven over the three test bed terrains at speeds of 0.2, 0.4, 0.6 and 0.8

m/sec. The vehicle acceleration data was presented to the algorithm to produce terrain

classifications. The resulting PNN classifications are presented in Table 5.1.

Table 5.1: Error statistics on each surface for each speed level.

No Surface Speed (m/sec) Correct Wrong %Correct

1 Gravel 0.2 23 8 74.1

2 Gravel 0.4 27 6 81.8

3 Gravel 0.6 24 2 92.3

4 Gravel 0.8 26 1 96.2

5 Dirt 0.2 18 6 75

6 Dirt 0.4 17 5 77.27

7 Dirt 0.6 20 3 90.9

8 Dirt 0.8 25 2 92.59

9 Grass 0.2 16 5 76.1

10 Grass 0.4 30 7 81.08

11 Grass 0.6 21 2 91.3

12 Grass 0.8 20 1 95.2


Please note that the total number of trials recorded for each terrain at each speed in Table 5.1 varies slightly because some of the acceleration data had to be discarded when the robot had just crossed from one terrain onto another. The graph in Figure 5.19 shows the classification accuracy on gravel, packed dirt and grass at various speeds.

Figure 5.19: Percentage accuracy for each surface at each speed level.

From the graph (Figure 5.19) it can be noted that the terrain classification accuracy is better when the ATRV-Jr is traveling at moderate to high speeds than at very low speeds. When the mobile robot traverses the same terrain at a higher speed, it generates a vibration signal of higher magnitude, so the details of the terrain are better represented in the signal, which in turn yields a more accurate terrain signature. This results in higher terrain classification accuracy when the ATRV-Jr traverses the terrain at moderately higher speeds.


5.6 Verification of Results

We wanted to test the algorithm on data collected from surfaces at a location other than

the test bed. We decided to use two additional locations. One had an abundance of gravel;

the other contained gravel, grass, and dirt. We shall refer to the one with just gravel as

test site 1 and the other as test site 2.

5.6.1 Results from Test Site 1

The mobile robot was made to travel at different speeds on test site 1 (photo shown in Figure 5.20). The classification results are recorded in Table 5.2, and the percentage classifications are shown in the graphs of Figure 5.21.

Figure 5.20: Test site 1.

Table 5.2: Error statistics on test site 1.

No  Surface  Speed (m/sec)  Correct  Wrong  %Correct

1 Gravel 0.2 4 4 50

2 Gravel 0.6 7 1 87.5

3 Gravel 0.8 6 2 75

(a) (b)


(c)

Figure 5.21: Classification percentage at various speed levels for test site 1

(a) 0.2 m/sec, (b) 0.6 m/sec, (c) 0.8 m/sec

The ATRV-Jr was made to travel over gravel in a straight line at speeds of 0.2, 0.6 and 0.8 m/sec. As can be seen from Table 5.2, the classification accuracy is lower at slower speeds; for moderate to high speeds the accuracy increases considerably. Since it was not possible to charge the robot's battery at the off-site locations, the time available for experimentation was limited. Hence we limited the runs at test sites 1 and 2 to three speed levels, as opposed to four levels on the test bed. Each experimental run gathered 10 seconds of vertical acceleration data, and the time spent on experimentation at each test site was about 45 minutes.


5.6.2 Results from Test Site 2

A photo of the gravel portion of test site 2 is shown in Figure 5.22. As can be seen, this surface could be categorized as either gravel or packed dirt, since the terrain appears to be comprised of both. We therefore expected the classification algorithm to label the terrain as gravel and packed dirt in roughly equal measure.

Figure 5.22: Test site 2.

Table 5.3: Error statistics on test site 2

No  Surface  Speed (m/sec)  Total readings  Classified as gravel  Classified as packed dirt

1 Gravel 0.2 8 4 4

2 Gravel 0.6 8 4 4

3 Gravel 0.8 8 4 4

As shown in Table 5.3, our suspicion was borne out: the PNN identified this mixed terrain as gravel in 50% of the trials and as packed dirt in the remaining 50%. It was never identified as grass, so we consider the PNN classification correct. The classification probabilities are shown in the graphs of Figure 5.23.


(a) (b)

(c)

Figure 5.23: Classification percentage at various speed levels for test site 2.

(a) 0.2 m/sec, (b) 0.6 m/sec, (c) 0.8 m/sec

The terrain signature was created using the vertical acceleration data of the ATRV-Jr as it traversed the terrain in a straight line. It should be noted that the XUV, when deployed by the Army, will not always travel in a straight line as it maneuvers around obstacles. If it follows a curved path, a correction factor that is a function of the turn's radius of curvature will have to be applied to the vertical acceleration data. Likewise, in the event of the XUV traversing an incline or decline, a correction factor that is a function of the slope will be needed. After the respective correction factors are applied, the terrain signature of the modified acceleration signal will represent the terrain accurately.
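As one hedged illustration of such a correction (an assumed form, not derived in this thesis), on a constant grade the accelerometer's body z-axis sees only part of gravity, so the gravity bias removed from the measurement must be adjusted accordingly:

g = 9.81;                                 % m/s^2
az_corr = az_meas + g*(1 - cos(theta));   % theta = incline angle in radians (assumed form)

A correction for curved paths would similarly depend on the speed and the turn's radius of curvature (v^2/R), but its exact form would have to be worked out for the vehicle's geometry.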


5.7 Neural Network Classification by Backpropagation

We also designed a neural network based on the principle of backpropagation. The network was trained using both momentum and an adaptive learning rate. The resulting network has a two-layer structure: a hidden layer of sixty neurons that use the sigmoid transfer function, and an output layer of neurons that use the linear transfer function. Since the training vectors comprised 55 elements, training was tedious. The network was trained primarily to recognize gravel at a speed of 0.6 m/sec. Figure 5.24 shows the training results for the neural network; the training takes 303 epochs for the error to become negligible. The network error throughout training is shown in Figure 5.25. The training time was long, each session requiring about five minutes, which implies that this network cannot be trained in real time. Real-time training can become essential if the XUV is on a mission and has to learn a new terrain type.

Figure 5.24: Backpropagation neural network results using both momentum and adaptive learning rate.


Figure 5.25: Backpropagation training error.

The resulting network was then tested on 5 sets of gravel data at 0.6 m/sec. The resulting classifications are shown in Table 5.4; the success rate varies between 60 and 80 percent.

Table 5.4: Error statistics using back propagation.

Trial Number  Surface  Speed (m/sec)  Correct  Wrong  %Correct

1 Gravel 0.6 4 1 80

2 Gravel 0.6 4 1 80

3 Gravel 0.6 3 2 60

4 Gravel 0.6 3 2 60

5 Gravel 0.6 4 1 80

The misclassification rate was higher for the neural network trained by backpropagation than for the PNN. Furthermore, if the backpropagation network is to be trained to identify surfaces at all possible speeds, a huge number of training vectors will be required, and the number of neurons required in the hidden layer will increase considerably.


CHAPTER 6

CONCLUSIONS

This research emphasizes that if the XUV is to navigate over varied terrains at high speeds, then it is important for its control system to have knowledge of the terrain's character. A neural network based terrain detection algorithm has been developed and tested on the ATRV-Jr mobile robot.

Our algorithm uses an FFT of the robot's z-axis acceleration data because that data carries significant information about the characteristics of the terrain. The algorithm successfully identified gravel, grass and packed dirt with a good degree of reliability at several speeds.

The probabilistic neural network based on the Bayesian classifier proved to be well suited to terrain classification. For moderate robot speeds, the PNN's classification accuracy was about 90% for gravel, grass, and packed dirt. It was also capable of identifying gravel and dirt at sites other than the test bed. The PNN's simple one-pass training is both fast and efficient, which makes it suitable for real-time applications.

The classification success rate of the PNN is higher than that of the backpropagation network. The PNN is also much easier to set up: for the backpropagation network, selecting the number of neurons in each layer is tedious and involves a large number of trials. Moreover, backpropagation training is considerably slower and more computationally intensive than the PNN's one-pass training.


FUTURE WORK

The PNN was designed and used in Matlab. At present, therefore, this is an offline classification method: data is collected from the robot and the PNN classification is performed on a separate computer. We suggest that the PNN be implemented in the C programming language so that classification can be done on the robot in real time; we see no reason why a real-time version of our algorithm cannot be realized. Wheel slip is another important factor to include in our classification algorithm. We believe that if we incorporate both vibration and slip data into the algorithm, terrain identification will be more robust.


APPENDIX A: C++ CODE TO COLLECT DATA FROM THE INS

#include "mobilitycomponents_i.h"

#include "mobilitydata_i.h"

#include "mobilitygeometry_i.h"

#include "mobilityactuator_i.h"

#include "mobilityutil.h"

#include <iostream.h>

#include <stdlib.h>

#include <fstream.h>

void print_usage();

int main (int argc, char **argv)

{

if (argc == 1) //if there's no args passed to the program then

//print usage and exit

{

print_usage();

return 0;

}

double vel_x[10000];

double vel_y[10000];

double vel_z[10000];

double acc_x[10000];


double acc_y[10000];

double acc_z[10000];

//make some files to dump the DMU data to

//ofstream pos_out( "position.dat" );

ofstream dataformat_out( "dataformat.dat" );

ofstream data_out( "data.dat" );

//ofstream acc_out( "acceleration.dat" );

char pathName[255]; //buffer for object names

char *robotName; //Holds -robot command line option

// cout << "\r\n*************DMU 6 Axis Data

//Recorder*****************\r\n";

//look for robot name option. Exit if none specified.

robotName = mbyUtility::get_option(argc, argv, "-robot");

if(!robotName)

{

cout << "\r\nMust specify Robot Name!";

print_usage();

return -1;

}

//---------------Initialize Mobility, Connect to DMU

//server-----------------

mbyClientHelper *pHelper = new mbyClientHelper(argc,argv);

//required for c++ -> CORBA language mapping

//cout<<"\r\n Before getting Path\r\n";

//build pathname to DMU server

sprintf(pathName, "%s/crossbow/Drive/State", robotName);

//cout<<"\r\n After getting Path\r\n";

//cout << pathName;


//this is a generic pointer that can point to any Mobility Object

CORBA::Object_ptr ptempObj = pHelper->find_object(pathName);

MobilityActuator::ActuatorState_var DMU_state;

MobilityActuator::ActuatorData_var DMU_data;

//downcast ptempObj to an ActuatorState object

DMU_state = MobilityActuator::ActuatorState::_narrow(ptempObj);

//This is a smart pointer to an object descriptor which automates

//memory management

MobilityCore::ObjectDescriptor_var pDescriptor;

//start dumping data

//cout << "\r\nGetting Data";

MobilityBase::TimeStampData ts;

cout << "\r\nTaking 1000 samples of data from the Crossbow....";

//cout << "\r\nCheck the following output files: position.dat, velocity.dat, acceleration.dat....";

//cout << "\r\nhello1....";

//cout << "\r\nHelloHello1....";

for(int i=0; i<=1000; i++)

{

DMU_data = DMU_state->get_sample(0); //get latest data from

//DMU server

ts = DMU_data->timestamp; //get the current timestamp

vel_x[i]= DMU_data->velocity[0];

vel_y[i]= DMU_data->velocity[1];

vel_z[i]= DMU_data->velocity[2];

acc_x[i]= DMU_data->acceleration[0];

acc_y[i]= DMU_data->acceleration[1];

acc_z[i]= DMU_data->acceleration[2];

//////////////////CODE MODIFIED TO TAKE ONLY REQUIRED DATA INTO

//////TEXT FILES

//usleep(1);


omni_thread::sleep(0,100);

}

cout<<"\r\n hello2" ;

cout<<"\r\n hello3" ;

dataformat_out<< "\r\n"

<< "Velocity X" << "\t"

<< "Velocity Y" << "\t"

<< "Velocity Z" << "\t"

<< "Acceleration X" << "\t"

<< "Acceleration Y" << "\t"

<< "Acceleration Z" ;

dataformat_out <<
"\r\n========================================================="
"==============================================";

for(int i=0; i<=1000; i++)

{

data_out<< "\r\n"

<< vel_x[i] << "\t"

<< vel_y[i] << "\t"

<< vel_z[i] << "\t"

<< acc_x[i] << "\t"

<< acc_y[i] << "\t"

<< acc_z[i] ;

}

//pos_out.close();

data_out.close();

dataformat_out.close();

//acc_out.close();

cout<<"\r\n hello3" ;

}

void print_usage()


{

cout <<
"-----------------------------------------------------------------\r\n"
<< "DMU6X_SAMPLE - \r\n"
<< "This program records data from a Crossbow 6 Axis Inertial Nav system.\r\n"
<< "It is intended for use with iRobot Research Robots running Mobility(tm)."
<< "\r\n\tUsage:\r\n"
<< "\t\tdmu_sample -robot <robotname> \r\n"
<< "\r\n\t\tOPTIONS:\r\n"
<< "\t\t\t-robot <robotname> name of robot to connect to\r\n"
<<
"-----------------------------------------------------------------\r\n";

}


APPENDIX B: MATLAB CODE TO PLOT FFT ON DIFFERENT SURFACES

load gravel.dat;

load sand.dat;

load mud.dat;

%%%%%%%%%%%%%%%Acceleration in Z direction

a1=gravel(:,6);

a2=sand(:,6);

a3=mud(:,6);

%%%%%%%%%%%%%%%%%%%%Velocity in Z direction (column 3 of the data file)
v1=gravel(:,3);
v2=sand(:,3);
v3=mud(:,3);
t=0:0.01:10; % 1001 samples at 100 Hz over 10 seconds

y1=abs(fft(a1,1024));

y2=abs(fft(a2,1024));

y3=abs(fft(a3,1024));

f=100*(0:511)/1024;

figure(1)

plot(t,a1)

title('Acceleration in Z direction of Gravel');

xlabel('Time ');


ylabel('Acceleration in Z direction');

figure(2)

plot(t,a2)

title('Acceleration in Z direction of Sand');

xlabel('Time ');

ylabel('Acceleration in Z direction');

figure(3)

plot(t,a3)

title('Acceleration in Z direction of Mud');

xlabel('Time ');

ylabel('Acceleration in Z direction');

figure(4)

plot(t,v1)

title('Velocity in Z direction of Gravel');

xlabel('Time ');

ylabel('Velocity in Z direction');

figure(5)

plot(t,v2)

title('Velocity in Z direction of Sand');

xlabel('Time ');

ylabel('Velocity in Z direction');

figure(6)

plot(t,v3)

title('Velocity in Z direction of Mud');

xlabel('Time ');

ylabel('Velocity in Z direction');

figure(7)

plot(f,y1(1:512));

title('FFT of Gravel');

xlabel('Frequency ');


ylabel('Magnitude');

figure(8)

plot(f,y2(1:512));

title('FFT of Sand');

xlabel('Frequency ');

ylabel('Magnitude');

figure(9)

plot(f,y3(1:512));

title('FFT of Mud');

xlabel('Frequency ');

ylabel('Magnitude');


APPENDIX C: MATLAB/SIMULINK CODE FOR INS CHECK

[num,den]=butter(4,0.66,'high','s');

t=0:0.01:60;

load acceleration.dat;

load velocity.dat;

vx=velocity(:,1)-mean(velocity(:,1))*ones(size(velocity(:,1)));

vy=velocity(:,2)-mean(velocity(:,2))*ones(size(velocity(:,2)));

sim('AccelerationIntegrationATRV'); % run the Simulink model that integrates the filtered acceleration twice


Figure: Simulink Block for integrating the acceleration data twice


APPENDIX D: PNN CLASSIFIER

clear;

load gravel1.dat;

load gravel2.dat;

load gravel3.dat;

load gravel4.dat;

load gravel5.dat;

load gravel6.dat;

load gravel7.dat;

load gravel8.dat;

load gravel10.dat;

load gravel4f.dat;

load mud1.dat;

load mud2.dat;

load mud3.dat;

load mud4.dat;

load mud5.dat;

load mud6.dat;

load mud7.dat;

load mud8.dat;

load sand1.dat;


load sand2.dat;

load sand3.dat;

load sand4.dat;

load sand5.dat;

load sand6.dat;

load sand7.dat;

load sand8.dat;

load gravelf.dat;

y1=abs(fft(gravel1(:,6),1024));

y1=y1(45:110);

y2=abs(fft(gravel2(:,6),1024));

y2=y2(45:110);

y3=abs(fft(gravel3(:,6),1024));

y3=y3(45:110);

y4=abs(fft(gravel4(:,6),1024));

y4=y4(45:110);

y5=abs(fft(mud1(:,6),1024));

y5=y5(45:110);

y6=abs(fft(mud2(:,6),1024));

y6=y6(45:110);

y7=abs(fft(mud3(:,6),1024));

y7=y7(45:110);

y8=abs(fft(mud4(:,6),1024));

y8=y8(45:110);

y10=abs(fft(sand1(:,6),1024));

y10=y10(45:110);

y11=abs(fft(sand2(:,6),1024));

y11=y11(45:110);

y12=abs(fft(sand3(:,6),1024));

y12=y12(45:110);

y13=abs(fft(sand4(:,6),1024));


y13=y13(45:110);

y9=abs(fft(gravel8(:,6),1024));

y9=y9(45:110);

x=[y1';y2';y3';y4';y5';y6';y7';y8';y10';y11';y12';y13']; % training patterns, one per row
y=[1 1 1 1 2 2 2 2 3 3 3 3]'; % class labels: 1 = gravel, 2 = mud, 3 = sand
xtest=[y9']; % pattern to classify
a=3; % extra parameter passed to pnn1 (unused by the classifier)
classes=3; % number of classes
[class,prob]=pnn1(x,y,classes,xtest,a)
bar(prob);

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

function [class,w_dist]=pnn1(x,y,classes,xtest,a)
[patterns,inputs]=size(x);
[patterns,outputs]=size(y);
Sigma=[];
cases=zeros(classes,1); % Vector containing the number of cases in each class
for i=1:classes
ind=find(y==i);
cases(i)=length(ind); % number of training patterns labeled i
if i==1
starting=1;
else
starting=sum(cases(1:i-1))+1;
end
Xtr=x(starting:starting+cases(i)-1,:); % training patterns of class i
Sigma=[Sigma;std(Xtr)]; % per-class feature standard deviations
end
M=max(size(xtest)); % dimension of the test pattern
Sigma=1/norm(Sigma); % collapse to a single scalar smoothing value


[y,ind]=sort(y); % order the training patterns by class label
x=x(ind,:);
ctr=1;
for i=1:classes
ex_sum=0;
for j=1:cases(i)
% sum of Gaussian (Parzen) kernels centered at each training pattern of class i
ex_sum=(exp((-(xtest-x(ctr,:))*(xtest-x(ctr,:))')/(2*Sigma^2)))+ex_sum;
%ex_sum=(exp((-(xtest-x(ctr,:))*(xtest-x(ctr,:))')/(2*3^2)))+ex_sum;
ctr=ctr+1;
end
%w_dist(i)=ex_sum/(((2*pi)^(M/2))*(3^(M))*cases(i));
w_dist(i)=ex_sum/(((2*pi)^(M/2))*(Sigma^(M))*cases(i)); % class-conditional density estimate
end
w_dist=w_dist/10^57; % rescale the very small densities before plotting
class=find(w_dist==max(w_dist)) % the class with the largest density wins


APPENDIX E: BACK PROPAGATION CODE

% % P=[y1 y2 y3 y4];

% % T=[1 1 1 1;0 0 0;0 0 0];

% % disp_freq=20;

clc;

S1 = 30;

source=[y1 y2 y3 y4 y5 y6 y7 y8 y9 y10 y11 y12 y21 y22];

target=[1 1 1 1 1 1 1 1 1 1 2 2 3 3];

[R,Q] = size(source);

[S3,Q] = size(target);

S2=40;

P = source;

T=target;

%net = newff(minmax(P),[S1 S2],{'logsig' 'purelin'},'traingdx');
net = newff(minmax(P),[S1 S2 S3],{'logsig' 'logsig' 'purelin'},'traingdx','learngdm','mse')
%NEWFF(PR,[S1 S2...SNl],{TF1 TF2...TFNl},BTF,BLF,PF) takes,


%PR - Rx2 matrix of min and max values for R input elements.

%Si - Size of ith layer, for Nl layers.

%TFi - Transfer function of ith layer, default = 'tansig'.

%BTF - Backprop network training function, default = 'trainlm'.

%BLF - Backprop weight/bias learning function, default = 'learngdm'.

%PF - Performance function, default = 'mse'.

%and returns an N layer feed-forward backprop network.

net.performFcn = 'sse';

net.trainParam.goal = .001;

net.trainParam.show = 20;

net.trainParam.epochs =1000;

net.trainParam.mc = 0.95;

[net,tr] = train(net,P,T);

figure(2)

plot(tr.epoch,tr.perf);

ylabel('Squared Error'); xlabel('Epoch');

p = [y31 y32 y33 y34 y35];

a=sim(net,p)

a = round(a)

% if (a==1)

% display('Terrain classification is GRAVEL');

% elseif(a==2)

% display('Terrain classification is DIRT');

% elseif(a==3)

% display('Terrain classification is GRASS');

% end



BIOGRAPHICAL SKETCH

Debangshu Sadhukhan

Department of Mechanical Engineering

FAMU-FSU College of Engineering

2525 Pottsdamer Street, Tallahassee, FL, 32310

EDUCATION

MS, Mechanical Engineering (Aug 2001 – Mar 2004)

Florida State University, Tallahassee, Florida, USA.

Bachelor of Engineering, Mechanical (Aug 1995-July 1999)

Regional Engineering College, Durgapur, India.

PROFESSIONAL EXPERIENCE

RESEARCH

Graduate Research Assistant, Department of Mechanical Engineering, Florida State

University, (08/2002 to present)

TEACHING

Teaching Assistant, Mechanical Systems I and II, Department of Mechanical

Engineering, Florida State University, (08/2001 to 08/2002)


SOFTWARE EXPERIENCE:

Assistant System Engineer, Tata Consultancy Services, SEEPZ, Mumbai, India (Jul 99 -

May 01)

Worked as a software developer for two years. Responsibilities included system analysis, module design, coding, testing, and documentation.

PUBLICATIONS:

Debangshu Sadhukhan and Dr. Carl Moore, "Online Terrain Estimation Using Internal Sensors," Proceedings of the Florida Conference on Recent Advances in Robotics (FCRAR), May 2003, Boca Raton, Florida.

COMPUTER SKILLS

Scientific Software:MATLAB 6.5 (Toolboxes: Control System, Fuzzy Logic, Neural

Networks, Symbolic Math, Optimisation, System Identification), SIMULINK, MAPLE,

Mathcad.

Computer Aided Modelling and Simulation Tools:ADAMS, Algor, ProE, AutoCad,

SOLIDWORKS, Working Model, Smart Sketch

Operating Systems :Windows 95/NT, Unix , Linux , MSDOS.

Languages :Object Oriented Programming (OOP) principles, C++, JAVA, C, Javascript,

HTML.

Database:Relational database design, SQL, PL/ SQL, Oracle 8i.