
Sensor Fusion for Mobile Robots - DTU Electronic …etd.dtu.dk/thesis/191252/oersted_dtu2649.pdf



Sensor Fusion for Mobile Robots

31st July, 2006

Lars Valdemar Mogensen, s001684

Master’s thesis at Ørsted•DTU, Automation
30 ECTS points

Supervisors

Nils Axel Andersen
Ole Ravn

Ørsted•DTU, Automation

Technical University of Denmark
DK-2800 Kongens Lyngby, Denmark


Preface

This master’s thesis on Sensor Fusion for Mobile Robots has been conducted at the Technical University of Denmark (DTU) at Ørsted•DTU, Automation (IAU). The project was supervised by Associate Professor Nils Axel Andersen and Associate Professor Ole Ravn.

The project was carried out in the period from February 1st 2006 to July 31st 2006.

The workload is rated at 30 ECTS points.

31st July, 2006

Lars Valdemar Mogensen, s001684
www.lvm.dk


Abstract

This thesis documents the development, implementation and testing of a real-time sensor fusion library for improving the positioning of autonomous mobile robots. The library is intended as a module in an existing robot control program for controlling vehicles on different platforms. Both simulations and real-life tests are carried out to evaluate the work done.

Real-life tests show that a standard GPS sensor combined with odometry is not accurate enough to enable autonomous navigation on a normal road.

Keywords: Real-time calculations, sensor fusion, GPS, odometry, robot simulation, extended Kalman filter

Resume

This thesis documents the development, implementation and testing of a real-time sensor fusion library for improving the positioning of mobile robots. The library is intended as a module for an existing robot control program for controlling vehicles on different platforms. Both simulations and real-life tests have been carried out to evaluate the result.

Tests show that a standard GPS and odometry are not accurate enough to enable autonomous navigation on a normal road.

Keywords: Real-time calculations, sensor fusion, GPS, odometry, robot simulation, extended Kalman filter


Acknowledgment

I would like to thank my two supervisors, Nils Axel Andersen and Ole Ravn, for their help and support. Their knowledge of the system implementation has greatly reduced the time needed for reverse engineering. The staff at the institute have been very helpful in listening and opening my eyes to other solutions. A special thanks goes to Bertil Morelli for his commitment to making the institute a relaxed and interesting place to study. The students with whom I have studied and discussed my project deserve thanks - I hope the time spent has been as fruitful for you as it has been for me. Special thanks go to Anders Reske-Nielsen and Asbjørn Mejnertsen for giving me an introduction to their work and for sparring when changes were made in their software. Thanks also go to Kristen Mogensen, Morten Laursen and Nikolaj Svensson for proofreading and commenting on this thesis.


Contents

1 Introduction 1

1.1 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2

1.2 Problem Statement . . . . . . . . . . . . . . . . . . . . . . . . 3

1.2.1 Focus of the Project . . . . . . . . . . . . . . . . . . . 3

1.3 Evaluation of Consequences . . . . . . . . . . . . . . . . . . . 3

1.4 Outline of the Report . . . . . . . . . . . . . . . . . . . . . . 4

2 System Description 7

2.1 Robot Control Program . . . . . . . . . . . . . . . . . . . . . 8

2.1.1 Features . . . . . . . . . . . . . . . . . . . . . . . . . . 8

2.2 Multi Platform Simulator . . . . . . . . . . . . . . . . . . . . 8

2.3 MMR . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9

2.4 HAKO Tractor . . . . . . . . . . . . . . . . . . . . . . . . . . 9

2.5 SMR . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10

2.6 KALMtool . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10

3 Previous Research 13

3.1 Autonomous Navigation . . . . . . . . . . . . . . . . . . . . . 13

3.2 Sensor Fusion . . . . . . . . . . . . . . . . . . . . . . . . . . . 13

3.2.1 Land based . . . . . . . . . . . . . . . . . . . . . . . . 14

3.2.2 Sea based . . . . . . . . . . . . . . . . . . . . . . . . . 14

3.2.3 Air based . . . . . . . . . . . . . . . . . . . . . . . . . 15

3.3 Other Methods . . . . . . . . . . . . . . . . . . . . . . . . . . 15

3.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15

4 System Design 17

4.1 Structure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17

4.1.1 Configuration . . . . . . . . . . . . . . . . . . . . . . . 18

4.2 Requirement Specification . . . . . . . . . . . . . . . . . . . . 19

4.3 Priority . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20

4.4 Implementation . . . . . . . . . . . . . . . . . . . . . . . . . . 20

4.4.1 Documentation . . . . . . . . . . . . . . . . . . . . . . 21


5 Models 23

5.1 Sensors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23

5.1.1 GPS . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24

5.1.2 Odometry . . . . . . . . . . . . . . . . . . . . . . . . . 27

5.1.3 Summary . . . . . . . . . . . . . . . . . . . . . . . . . 27

5.2 Vehicles . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28

5.2.1 Ackerman Steering . . . . . . . . . . . . . . . . . . . . 29

5.2.2 Differential Steering . . . . . . . . . . . . . . . . . . . 32

5.2.3 Summary . . . . . . . . . . . . . . . . . . . . . . . . . 35

6 Kalman Filter 37

6.1 Discrete Extended Kalman Filter . . . . . . . . . . . . . . . . 38

6.2 Choosing a Filter . . . . . . . . . . . . . . . . . . . . . . . . . 39

6.2.1 Linearized Models . . . . . . . . . . . . . . . . . . . . 40

6.3 Sample Time and Delay . . . . . . . . . . . . . . . . . . . . . 40

6.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41

7 Software 43

7.1 Main Guidelines . . . . . . . . . . . . . . . . . . . . . . . . . 43

7.2 Robot Control Program . . . . . . . . . . . . . . . . . . . . . 44

7.2.1 Structure . . . . . . . . . . . . . . . . . . . . . . . . . 44

7.2.2 Configuration . . . . . . . . . . . . . . . . . . . . . . . 45

7.2.3 Summary . . . . . . . . . . . . . . . . . . . . . . . . . 45

7.3 Kalman Filter . . . . . . . . . . . . . . . . . . . . . . . . . . . 45

7.3.1 Matrix library . . . . . . . . . . . . . . . . . . . . . . 46

7.3.2 Configuration . . . . . . . . . . . . . . . . . . . . . . . 47

7.3.3 Summary . . . . . . . . . . . . . . . . . . . . . . . . . 48

7.4 Simulator . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48

7.4.1 Structure . . . . . . . . . . . . . . . . . . . . . . . . . 48

7.4.2 GPS Sensor . . . . . . . . . . . . . . . . . . . . . . . . 49

7.4.3 Summary . . . . . . . . . . . . . . . . . . . . . . . . . 50

7.5 GPS Client and Server . . . . . . . . . . . . . . . . . . . . . . 50

7.5.1 Structure . . . . . . . . . . . . . . . . . . . . . . . . . 50

7.5.2 Server . . . . . . . . . . . . . . . . . . . . . . . . . . . 51

7.5.3 Client . . . . . . . . . . . . . . . . . . . . . . . . . . . 51

7.5.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . 52

8 Test 55

8.1 HAKO . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56

8.2 MMR . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60

8.2.1 Simulation . . . . . . . . . . . . . . . . . . . . . . . . 60

8.2.2 Real Life Test . . . . . . . . . . . . . . . . . . . . . . . 64

8.2.3 Dyrehaven . . . . . . . . . . . . . . . . . . . . . . . . . 70

8.2.4 Filter Tuning . . . . . . . . . . . . . . . . . . . . . . . 72


9 Discussion 75

9.1 Implementation and Structure . . . . . . . . . . . . . . . . . . 75
9.2 Documentation . . . . . . . . . . . . . . . . . . . . . . . . . . 76
9.3 Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77

10 Conclusion 81

11 Future work 83

11.1 SMRdemo . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
11.2 Simulator . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
11.3 Filters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
11.4 IAUmat . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84

11.5 MMR . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84

Nomenclature 85

Bibliography 89

List of Figures 93

List of Tables 97

A Contributions by this Project 99

B Bug List 101
B.1 SMRdemo . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101
B.2 IAUmat . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101
B.3 Kalman Filter . . . . . . . . . . . . . . . . . . . . . . . . . . . 102

B.4 Simulator . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 102
B.5 GPS client/server . . . . . . . . . . . . . . . . . . . . . . . . . 102

C Test Results 103
C.1 HAKO Simulation Results . . . . . . . . . . . . . . . . . . . . 104
C.2 MMR Real Life Results . . . . . . . . . . . . . . . . . . . . . . 106

C.2.1 Parking lot . . . . . . . . . . . . . . . . . . . . . . . . 106

C.2.2 Dyrehaven . . . . . . . . . . . . . . . . . . . . . . . . . 110

D Simulator 117
D.1 Vehicles . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117

D.2 GpsMouse class . . . . . . . . . . . . . . . . . . . . . . . . . . 119
D.3 Expansions . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120

E SMRdemo 121
E.1 Matrix Library Study . . . . . . . . . . . . . . . . . . . . . . . 121
E.2 Kalman Filter . . . . . . . . . . . . . . . . . . . . . . . . . . . 125
E.3 Configuration of SMRdemo by XML . . . . . . . . . . . . . . 126


E.4 Operating the Kalman Filter . . . . . . . . . . . . . . . . . . 126

E.5 Dynamic Libraries for SMRdemo . . . . . . . . . . . . . . . . 127

E.6 SMRdemo Compile . . . . . . . . . . . . . . . . . . . . . . . . 127

E.7 Matrix Operation Implementation . . . . . . . . . . . . . . . 128

E.8 SMRdemo Flowcharts . . . . . . . . . . . . . . . . . . . . . . 136

F Ackerman Steered Vehicles 143

F.1 Modeling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 143

F.1.1 Model Dependent Parameters . . . . . . . . . . . . . . 144

F.1.2 Discrete non Linear System Equations . . . . . . . . . 144

F.1.3 Linear System Matrices . . . . . . . . . . . . . . . . . 145

F.2 HAKO Commands . . . . . . . . . . . . . . . . . . . . . . . . 147

F.3 Driving the HAKO . . . . . . . . . . . . . . . . . . . . . . . . 147

F.4 Logging Data . . . . . . . . . . . . . . . . . . . . . . . . . . . 147

F.5 XML Configuration HAKO Tractor . . . . . . . . . . . . . . . 148

G Differentially Steered Vehicles 151

G.1 Modeling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 151

G.1.1 Model Dependent Parameters . . . . . . . . . . . . . . 152

G.1.2 Discrete non Linear System Equations . . . . . . . . . 152

G.1.3 Linear System Matrices . . . . . . . . . . . . . . . . . 153

G.2 SMR Simulator . . . . . . . . . . . . . . . . . . . . . . . . . . 154

G.2.1 hostname demo odo calib.dat . . . . . . . . . . . . . . 154

G.2.2 hostname demo ir calib.dat . . . . . . . . . . . . . . . 154

G.2.3 hostname demo ls calib.dat . . . . . . . . . . . . . . . 156

G.3 XML Configuration . . . . . . . . . . . . . . . . . . . . . . . . 157

G.3.1 Small Mobile Robot . . . . . . . . . . . . . . . . . . . 157

G.3.2 Medium Mobile Robot . . . . . . . . . . . . . . . . . . 159

H GPS under Linux 163

H.1 Linux Distribution GPS Server . . . . . . . . . . . . . . . . . 163

H.2 In-house GPS Server . . . . . . . . . . . . . . . . . . . . . . . 164

H.3 UTMgpsd . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 164

H.3.1 Socket Server . . . . . . . . . . . . . . . . . . . . . . . 164

H.3.2 GPS Communication . . . . . . . . . . . . . . . . . . . 165

H.3.3 EGNOS . . . . . . . . . . . . . . . . . . . . . . . . . . 165

H.4 Libgps . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 165

H.5 Gpsclient . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 165

H.6 Test Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . 165

H.7 Lat/Lon to UTM Algorithm . . . . . . . . . . . . . . . . . . . 169

I Kalmtool v.4.2 173


J CD 175
J.1 Contents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 175

J.1.1 Thesis . . . . . . . . . . . . . . . . . . . . . . . . . . . . 175
J.1.2 Models . . . . . . . . . . . . . . . . . . . . . . . . . . . 175
J.1.3 Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 176
J.1.4 Source . . . . . . . . . . . . . . . . . . . . . . . . . . . . 176

J.2 CD . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 177


Chapter 1

Introduction

With the introduction of more and more advanced vehicles, at least in the sense of computing power and sensor use, the need for a better way of combining this sensory information grows.

In recent years, vehicles such as road vehicles (cars, lorries and buses) and farming equipment (tractors and combines) have become full of on-board computers and sensors. This use of expensive equipment has been driven by the need to improve profit and safety. The computerization of cars has led to the development of systems like ABS and ESP, which use different sensors on the wheels and engine to aid and protect the driver. In farming equipment, the main interest in recent years has been to get tractors and combines to follow a predefined track in the field. The present solutions steer the tractor along a predefined path to relieve the driver and give him the opportunity to focus on other tasks. The benefits are less damage to crops and soil when using the same path, and the economic gain from giving the operator more time for operating the implements¹.

This use of technology has also been deployed in other fields like construction equipment, ships and planes, where the objective is to relieve the operator of repetitive or tedious tasks and improve efficiency.

When operating the above-mentioned machinery, the human involved does a lot of complex sensory processing based on the sensor information at hand and prior experience. A system to combine the different sensory information is the key to all of the above-mentioned systems that aid operators in their work. One type of machinery where a good comprehension of the surroundings is more important than usual is autonomous vehicles, where there is no human driver to do the combination of sensor information, or sensor fusion as it is typically called.

¹ Tool on the back or at the front of the tractor.


1.1 Background

A lot of research has been done in the areas of robots, navigation and mapping, planning and sensor deployment. The usual solutions have been smaller vehicles for indoor use, due to the less complex surroundings, but in recent years larger autonomous vehicles for outdoor use have been developed.

Good examples of smaller vehicles are lawnmowers like Robomow from Friendly Robotics and cleaning robots like the Karcher vacuum cleaner, see figure 1.1.

Examples of larger vehicles made autonomous are the competitors of the DARPA Grand Challenge² and the automated straddle carriers at the port of Brisbane, Australia, see figure 1.2. Ørsted•DTU, Department of Automation (subsequently referred to as IAU) has been involved in a project with The Royal Veterinary and Agricultural University (subsequently referred to as KVL) about autonomous navigation of a tractor for agricultural purposes. The project was mainly focused on the navigation system and user interface, but a solution for combining sensor information was also treated.

Figure 1.1: RoboCleaner from Karcher.

Figure 1.2: Straddle carrier from the port of Brisbane.

The three large robots mentioned here all rely on combining information from different sensors for a better representation of the position and heading of the robot. This is a research area which has been investigated before at IAU, but a real-time solution integrated with the IAU robot control software is not available.

² The DARPA Grand Challenge is a competition for autonomous vehicles, which compete to traverse varied terrain on a desert course of 140 miles.


1.2 Problem Statement

This leads to the topic of this thesis - how does one combine the information at hand, and implement it in a real-life system? The objective of the project is to implement a real-time sensor fusion solution and use it for improving the positioning of autonomous vehicles.

1.2.1 Focus of the Project

The sensor fusion solution is to be based on existing robot control software and be constructed as an external component, with as few changes as possible to the original control software. To make the system as useful for future development as possible, the solution should be able to run both on the robots and on a simulator for testing and tuning purposes.

Because the developed software is to become an integral part of the robot control program, it is important to make the solution future-proof and suited for real-time application. A significant part of this will be on-line and off-line documentation of the code, to make it more user-friendly.

1.3 Evaluation of Consequences

Having defined the topic of the thesis, evaluating the impact the work will have is of great interest. The following aspects will be used in the evaluation of the work.

Positive aspects

• Improved benefit from the sensors already employed.

• Smaller demands on future sensors in order to meet specific design parameters.

• More redundancy in the positioning system, if the sensors are chosen correctly. This does, however, require a fault-tolerant expansion of the proposed solution.

• Better performance of the overall system.

Negative aspects

• More expensive due to higher demands on processing power and design time.

• Can be hard to implement and optimize if the sensors or the vehicle are difficult to model accurately.


1.4 Outline of the Report

To get an idea of the content of the report, a short summary of the chapters is given here.

Chapter 2 - System Description Introduction to the vehicles that are currently used at IAU for educational purposes. The software tools that are available and relevant for the project are described and evaluated.

Chapter 3 - Previous Research Overview of the work done in this field of research and what can be used for the development of a solution.

Chapter 4 - System Design Description of the considerations behind the development of the sensor fusion platform.

Chapter 5 - Models Description of the models needed to make the sensor fusion possible. A brief description of different sensor types is given to familiarize the reader with this topic, after which the chosen sensors are briefly described in order to derive a model. Odometric vehicle models are described, and the model imperfections are discussed.

Chapter 6 - Kalman Filter Introduction to the extended Kalman filter used in the implementation. This is the theoretical treatment of the algorithms implemented, with a description of the implementation-specific modifications.

Chapter 7 - Software Introduction to the developed programs and the configuration file format. The structure of the different programs is discussed along with example XML configuration files.

Chapter 8 - Test The test is carried out as a two-stage process. First the performance of the sensor fusion is evaluated in simulation, and then real-life tests are performed to compare the performance.

Chapter 9 - Discussion The test results and the lessons learned in the implementation are discussed. The chapter focuses on the sensor fusion library, but the structural choices for the implementation are also discussed.

Chapter 10 - Conclusion Summarizes the findings and provides perspective on the problem statement.

Chapter 11 - Future Work Expansions of the sensor fusion tool and the surrounding programs are treated, along with suggestions for making interfacing and testing easier.


Chapters 5 and 6 are theoretical, dealing with the models and the sensor fusion algorithm. The chapters are slightly decoupled from the design and implementation chapters, but they are highly relevant to the implementation.

To help the reader, a nomenclature, a list of figures and a list of tables have been added. The nomenclature is split into three sections: Abbreviation, Model and Filter. The Abbreviation section holds the abbreviations used in the report and short explanations of the terms used; the Model and Filter sections hold the constants and variables used in the corresponding chapters. A CD is attached to the back of the report, containing source code, documentation, this thesis and all test data acquired through the project.

Development Environment

The system has been developed on the IAU client/server system, using only installed software. The solution has not been tested extensively elsewhere.


Chapter 2

System Description

The objective of this project is to make an additional tool for the robots available at IAU, and to make it easy for others to use. This means that the tool must be able to interface with the existing robot control and simulation tools. To give the reader an idea of the existing tools, a short introduction will be given here. The tools available are:

• Robot Control Program used at IAU. The program is a multi-platform robot control program.

• Multi Robot Simulator used for off-line testing and filter optimization purposes.

• Kalman Filtering Toolbox, used for off-line simulations and implemented in MATLAB/Simulink. This tool will be referred to as KALMtool.

• Real-time matrix C library. The library was developed at IAU, but has been living a rather quiet life.

• GPS client/server from a previous project. An investigation of the possibility of reusing this will be made, see appendix H.

The vehicle platforms available to the project are, in prioritized order:

• Medium Mobile Robot (MMR) for outdoor use, located at DTU.

• Tractor modified for automatic use, located at KVL. This vehicle will from now on be referred to as the HAKO tractor.

• Small Mobile Robot (SMR) for indoor experiments, located at DTU.


2.1 Robot Control Program

The robot control program was originally designed for the SMR robots as a demonstration program, to show students that operating the SMRs could be made easy. The program was later expanded to include both medium-size and large mobile robots like the MMR and GuideBot¹, and the latest addition is the HAKO tractor. The program makes it possible to configure and control both differentially and Ackerman steered vehicles, see chapter 5 for an explanation of the difference between the two.

2.1.1 Features

The robot control program consists of methods to handle sensory input and different algorithms to maneuver the vehicles. On top of this, an interpreter is added that interprets SMR-CL [Andersen & Ravn, 2004] commands. These commands enable easy programming of tactical maneuvers directly, or strategic decision making on a higher level, feeding the information to the program via a socket connection. Most sensory information can be collected from the platform via commands, which enables specialized applications to be constructed on top. Maneuvering the vehicles is made simple by a wide selection of commands like virtual line² following and odometric maneuvers. These commands are available for all robots, but specialized features also exist. For the SMR, wall following and following lines of tape on the floor are two examples.

2.2 Multi Platform Simulator

For fast prototyping of sensor fusion algorithms, the IAU multi-platform simulator will come in handy. The original was written for the SMRs and used for Corporative Robots [Hansen & Monrad, 2005], but it has been expanded to support almost all the robots at IAU. Parallel to the work on this thesis, two other students were working on improving and adding new features to the simulator, see [Nielsen & Mejnertsen, 2006]. Their work was primarily based on adding the HAKO tractor to the simulator, but the simulator has also undergone a lot of changes to make it more versatile and easier to configure.

¹ http://www.oersted.dtu.dk/English/research/au/ag/equip/Guidebot.aspx
² An imaginary line in the robot's own impression of the world.


2.3 MMR

Figure 2.1: Medium Mobile Robot.

The Medium Mobile Robot, originally described in [Breitling & Nielsen, 2004], is an outdoor robot built after the differential motion principle. The vehicle is mainly used for outdoor navigation and mapping purposes, but can also be used indoors. The vehicle deploys the following sensors:

• GPS

• Odometry³

• Laser Scanner

• Gyro

The current use of the GPS in the MMR robot is for very coarse position verification, and it is not at all used for sensor fusion. At the moment, dirt-road and off-road driving is done with a combination of gyro, odometry and machine vision: the gyro is used for the heading, the odometry for the driven distance and the machine vision for road following. The GPS measurements are not read directly by the robot control program, so an addition is necessary to make this measurement available.

2.4 HAKO Tractor

Figure 2.2: HAKO tractor.

The HAKO tractor described in [Nielsen & Mejnertsen, 2006, p. 5] is an agricultural research platform used for the development of autonomous farming techniques. IAU is working with The Royal Veterinary and Agricultural University (KVL) on making a better control system for the HAKO tractor. The vehicle is only used for outside experiments due to its diesel engine and size. Because of its size and security system, it needs constant supervision in its current form. The vehicle deploys the following sensors and implements.

³ Measurement of motion based on wheel sensors. More details in chapter 5.

• RTK - GPS

• Odometry

• IMU

• Different implements can be mounted on the back, e.g. weed removers and sowing machines.

In the HAKO tractor, the high-precision RTK-GPS is used much more: the RTK-GPS measurement is fused with the odometry to make a better position estimate. The robot control program gives access to all the sensory input available on the tractor.

2.5 SMR

Figure 2.3: Small Mobile Robot.

The Small Mobile Robot platform is made for indoor experiments. The platform is small and safe to use in the vicinity of people, without large security precautions. This vehicle is mainly used for teaching the basics of mobile robots, but several modified SMRs have been developed over the years for various projects. The configuration of the SMRs is not uniform, but two sensor inputs could be interesting for this project.

• Odometry

• Camera Positioning

2.6 KALMtool

The KALMTool toolbox [Sejerøe, 2004] is a nonlinear system parameter es-timation toolbox. It is at the moment implemented in MATLAB / Simulinkwhich does not comply with the idea of keeping the solution implementedin C and independent of other programs.This however does not mean that it cannot be used for testing and algo-rithm development. The toolbox has been used for mobile robot navigationsimulations in the past, see [Sejerøe, 2004, p. 48], but it has not been usedfor that recently or in the current version. The new version of the toolbox


has been restructured, which means that the old simulations containing the prior robot navigation setups do not work in the new environment. Since the toolbox has proved valuable in the past, testing will be carried out to find out whether the rewriting and interfacing with the robot control program can be accommodated. The KALMtool toolbox is interesting in the long term due to the advanced algorithms it provides, which include:

• Stationary Kalman filter

• Kalman filter

• Extended Kalman filter

• Unscented Kalman filter

• Divided difference filter, first order

• Divided difference filter, second order

• Sequential filter (projection theorem)

• Sequential filter (Bayes' theorem)

A feasibility study has been carried out, discussing the possibility of using the toolbox directly in the robot control program [Mogensen, 2006], see page 173. The result showed that the current version of the toolbox is interesting for prototyping, but some fundamental changes in algorithms and structure would be needed. On that account, the choice was made not to include the KALMtool toolbox in the project work.


Chapter 3

Previous Research

The main topic of the thesis is the development of a sensor fusion algorithm for mobile robots. This calls for a study not only of sensor fusion, but also of autonomous mobile robots, where the solution is intended to be used. The reason for doing this is to get an idea of the challenges posed by the different mobile robot solutions. In the following text, external and internal research will be cited.

3.1 Autonomous Navigation

A large task in the development of autonomous vehicle solutions is the navigation of the vehicles. Navigation is a complex task, based on the sensory impression the vehicle has of its surroundings and on decision making. This means that the actions of the vehicle cannot be made better or safer than the sensory input allows. This makes it interesting to improve the vehicle's impression of the world around it.

Large tractor companies like John Deere and Fendt are currently shipping solutions that help the driver keep predefined tracks in the field. The systems are based primarily on RTK-GPS technology and not on sensor fusion, from what is visible in the literature. The use of sensor fusion does not seem very visible in the agricultural industry, but research is currently being done in that field. With the demand for more efficient and cheaper agricultural products, the fields of navigation, track following and field management are growing fast [Nielsen & Mejnertsen, 2006].

3.2 Sensor Fusion

To get an idea of the work that has been done in the field of positioning and sensor fusion, a literature study was made. A lot of work has been done in this field of research. One method is dominant when it comes to positioning


and sensor fusion: the Kalman filter or the extended Kalman filter approach [Bar-Shalom et al., 2001]. The concept of sensor fusion is widely used in all types of vehicles, on land, at sea and in the air.

3.2.1 Land based

One field where position estimation has been researched is agriculture, where better positioning has been used for better coverage algorithms and precision farming.1 Concentrating on the sensor fusion, several different approaches have been taken to solve the problem. In [Hague et al., 2000] Kalman filtering is used to fuse relative sensory information like machine vision, odometers, accelerometers and a compass to improve the position of an experimental platform. The number of sensors gives relatively good redundancy if one sensor tends to drift, and the choice of local relative sensors and an absolute measurement (the compass) also adds to the stability. A more common approach is the use of absolute positioning. In [O’Connor et al., 1996] fusion of GPS and the steering angle is used to drive a tractor on a test field with a high degree of accuracy. This is a very interesting study, since it follows the same idea as this project.

In exploratory vehicles where autonomy is paramount, positioning is a key parameter. One example studied is a volcano exploration vehicle [Caltabiano et al., 2004]. Sensor fusion is used for better positioning, but also for parameter estimation in the robot model. The robot is differentially steered and susceptible to model imperfections because of the large contact surface with the ground. Odometry and DGPS are used in this solution, providing measurements used for on-line calibration and sensor management. IAU has also done research in this field before, applied to both indoor robots [Larsen, 1998] and outdoor robots [Blas & Riisgaard, 2005].

3.2.2 Sea based

A very common use of sensor fusion at sea is position and heading estimation for submerged submarines. IAU has been involved in the development of navigation for underwater vehicles in a Ph.D. project [Larsen, 2001]. Sensor fusion was used to drastically improve the overall performance of the positioning by fusing INS, Doppler radar velocity, GPS pseudo-range and target sightings. The result is not directly applicable, but the types of sensors chosen are relevant.

1Precision farming is the use of positioning data to plan and vary the treatment of different areas of a field or the individual plants.


3.2.3 Air based

Sensor fusion is widely used when working with aircraft, especially autonomous ones. In [Sasiadek & Hartana, 2004] GPS and INS are fused using Kalman filtering to get better positioning accuracy and hence improve the autonomous capabilities.

Since the Second World War, a lot of work has been put into missile detection and estimation of trajectories. Sensor fusion is widely used to fuse several radar signals and other observations. In [Pathirana & Savkin, 2003] fusing radar and vision is described and shows a big improvement in precision, while being more economical and accessible.

At IAU, tests with autonomous flight have also been undertaken and described in [Holmgaard, 2004]. Sensor fusion was not deployed in that particular work, but it is discussed as an interesting way of removing some of the measurement biases on the slower sensors by fusing them with faster ones.

3.3 Other Methods

Kalman filtering is not the only way of doing sensor fusion. In recent years several other methods have emerged, and particle filtering is one of them. Particle filtering is based on sequential Monte Carlo simulation and is a more statistical approach than the Kalman filter. Particle filtering is superior to Kalman filtering especially because it supports more advanced noise models. The drawbacks of the particle filter are greater complexity and higher computational demands. Only Kalman filtering is treated in this thesis, since development of the basic real-time solution is more important than multiple filtering algorithms.

3.4 Summary

Having studied the literature, it is obvious that positioning of mobile robots can benefit from using sensor fusion. The algorithm variants are numerous, and so are the fields where they have been used with success. The consistent use of extended Kalman filtering for sensor fusion makes it an ideal candidate to implement, due to the high rate of success documented in the literature. Not having a complete mathematical real-time toolbox, the use of very complex and time-consuming algorithms will not be investigated further.

This report will describe the making of a tool for sensor fusion and the implementation of a Kalman filter. Two types of sensors will be used in the solution, an absolute sensor and a relative sensor, to support each other and help cancel out each other's weaknesses2.

2See chapter 5 for an introduction to sensors.


The absolute sensor is a GPS sensor, which provides an absolute position with bounded error. The reason for using the GPS sensor is that it is available on two different platforms and with two different precisions, RTK-GPS and normal consumer GPS. The relative sensor used in the project is odometry, which can provide a relative position with unbounded error when combined with a model of the vehicle. Several of the other sensors available could be used in the sensor fusion, but only two sensors will be used actively in this project.
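As a hypothetical one-dimensional illustration (not the thesis implementation) of why the two sensor types complement each other, the sketch below shows a Kalman-style predict step driven by an odometry increment and an update step driven by an absolute fix; all names and values are invented for the example.

```c
/* A minimal 1-D sketch: the odometry prediction drifts without bound,
 * while an absolute (GPS-like) update pulls the estimate back and
 * bounds the variance. */
typedef struct {
    double x;   /* position estimate [m]   */
    double p;   /* estimate variance [m^2] */
} state1d;

/* Prediction step: integrate an odometry increment d with variance q. */
void kalman_predict(state1d *s, double d, double q)
{
    s->x += d;   /* dead-reckoned motion            */
    s->p += q;   /* uncertainty grows without bound */
}

/* Correction step: fuse an absolute measurement z with variance r. */
void kalman_update(state1d *s, double z, double r)
{
    double k = s->p / (s->p + r);   /* Kalman gain */
    s->x += k * (z - s->x);
    s->p *= 1.0 - k;                /* variance is bounded again */
}
```

Repeated predictions make the variance p grow linearly, while a single update leaves the new variance below both its old value and the measurement variance r, which is exactly the bounded-error property of the absolute sensor described above.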

16

Page 31: Sensor Fusion for Mobile Robots - DTU Electronic …etd.dtu.dk/thesis/191252/oersted_dtu2649.pdf · This master’s thesis on Sensor Fusion for Mobile Robots has been con- ... 5.2.1

Chapter 4

System design

To make the sensor fusion library a successful and valuable addition to the current robot control software, the current design has been investigated. This chapter is dedicated to the considerations that have gone into the design of the solution. The main objective when integrating the sensor fusion methods in the existing platform has been to keep the solution as lean and backwards compatible as possible.

The sensor fusion library is primarily supposed to be maintained and developed by engineers. Configuration and daily use of specific filters are supposed to be handled by other people with a technical background. In the design phase this means that daily use, such as start/stop and parameter adjustments, must be easily accessible, while the underlying algorithms are for developers only.

The sensor fusion library is to be used for fusion of various sensors, but the focus of this initial version is fusion of odometry and GPS signals. The method used is Kalman filtering, and the objective is to develop methods for both Ackerman steered and differentially steered vehicles.

The development of the sensor fusion library is not to introduce major changes in the current structure. The idea is to use existing solutions where possible and not rewrite the existing software.

4.1 Structure

The structure of the robot control software is depicted in figure 4.1, including the robot simulator. Existing software is in black boxes, new software is in blue boxes and software where changes are made is in red boxes. The top level is the high-level logic, either as a program or as a script file in the SMR-CL language. The top level is not real time, but handles the sequence in which the underlying software is supposed to handle and interact with the world. No changes will be made in this part of the system. The second level


is the real-time robot control program (SMRdemo), which handles communication with the different robots. This is the program where the Kalman filter is to be added. SMRdemo is a program that removes the complexity of different robot platforms and movements by giving the higher control level commands like turn x degrees, move y meters and return sensor reading from sensor z. The low levels are the robot hardware servers and, in parallel, the robot simulator. Most sensor information is directly available from the existing robot hardware server, e.g. the odometry. The GPS, however, is not available from the existing robot hardware server, and a stable solution to that will have to be looked into.

Figure 4.1: Structure of the connections between the project components.

The design philosophy of the robot control software is to use small programs and socket connections between them. This has the benefit that virtually all types of data formats can be used. Several solutions exist, and development of a new standard will not add to the overall stability of the program structure; instead, reusing an existing socket server alongside the normal one will be attempted for the GPS server. An added benefit of using the existing solution is the relatively simple addition of the sensor in the simulator.
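To make the small-programs-and-sockets philosophy concrete, the sketch below passes a line-based text message between the two ends of a local socket pair and parses it. The message format and function names are invented for illustration and are not the actual GPS server protocol; a real server would use a TCP listen socket instead of socketpair().

```c
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

/* Parse a hypothetical "gps <easting> <northing>" line; returns 1 on
 * success, 0 otherwise. */
int parse_gps(const char *line, double *e, double *n)
{
    return sscanf(line, "gps %lf %lf", e, n) == 2;
}

/* Send one message through a local socket pair and parse it on the
 * other end, mimicking a sensor server talking to a client. */
int gps_roundtrip(double *e, double *n)
{
    int fd[2];
    if (socketpair(AF_UNIX, SOCK_STREAM, 0, fd) != 0)
        return 0;

    /* "server" side: send a position as plain text */
    const char *msg = "gps 720105.20 6187515.90\n";
    write(fd[0], msg, strlen(msg));

    /* "client" side: read the line back and parse it */
    char buf[128];
    ssize_t cnt = read(fd[1], buf, sizeof(buf) - 1);
    buf[cnt > 0 ? cnt : 0] = '\0';
    close(fd[0]);
    close(fd[1]);
    return parse_gps(buf, e, n);
}
```

Because the payload is plain text, any sensor can define its own line format without changing the transport, which is the stability argument made above for reusing the existing socket servers.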

4.1.1 Configuration

The configuration of the robot control software is in a transition from unstructured text files to eXtensible Markup Language (XML) structured text files. The transition from old-style text files to XML files has the benefit of the tree-like structure of XML. This improves the readability of the files, and the comprehension of the data structure becomes easier for both computer and end user. XML has the benefit of variable naming through the tag and attribute structure, which is fully user configurable, see figure


4.2. XML supports comments in the files, which is not normally possible in static configuration files.

<!-- Comment block -->
<tag attribute="value1">
    <nested_tag nested_attribute="value2" />
</tag>

Figure 4.2: XML example showing tags, attributes and nested structure.

XML does not only give the benefit of structure to the files. With XML, the order in which the tags and attributes appear does not matter. Real examples of XML files used for the programs can be found in chapter 7 - Implementation.

4.2 Requirement Specification

To clarify the work that needs to be done in order to achieve the goal of sensor fusion, the tasks required in the figure depicted above are pinned out here. The Kalman filter is to be included in the robot control program as a module, and it is therefore considered an integral part of the program from here on.

Robot Control Program

• Two types of robots (differential and Ackerman) have to be supported in the Kalman filtering library.

• A real-time safe Kalman filtering algorithm should be implemented.

• The current Kalman filter used in the HAKO tractor [Nielsen & Mejnertsen, 2006] should be ported to the new filtering library, making it real-time safe.

• SMRdemo should be restructured to support easy estimation, and the configuration improved with XML.

Simulator

• Simulator tested, especially with regard to the differentially steered vehicles.


• Simulator expanded with a new sensor, in this case the GPS. Investigation of the existing communication library as a connection method for the GPS. This will simplify the implementation and testing needed.

GPS client/server

• Current GPS server solution evaluated, to see if it is sufficiently precise and can be connected to the simulator.

• If the current server is not compatible, a new server will be made on the existing communication library, in order to keep the implementation as uniform as possible. Using an existing communication library will make interfacing with the simulator easier.

4.3 Priority

Prioritizing the three major parts of the project is not easy, since they depend on each other. Looking at figure 4.1, the only part of the project that can be singled out is the simulator. This is only when looking at the structure, because leaving it out would mean running all tests and debugging in real life. Real-life testing is very time consuming, and since the simulator is already there, emphasis is put on being able to use it for testing. The use of the simulator is also greatly encouraged by [Hansen & Monrad, 2005] and [Nielsen & Mejnertsen, 2006].

4.4 Implementation

The implementation of the solution has some constraints, since it is to be an addition to the existing real-time robot control software. Based on the design considerations set by IAU, the following criteria are important.

• Implementation in C code, which makes maintenance simpler.

• Real time capable.

• Library based, to clutter the main robot control code as little as possible.

• Compatible with the existing solutions. Functionality should not be changed.

The robot control software, consisting of the simulator, the robot control program and sensor servers, is rather big, and adding modules to the solution calls for easy debugging methods. Test programs will be constructed for the different modules and hardware servers, in order to make the development and later maintenance as easy as possible.


Part of documenting and working with the program complex also means recording the errors encountered. Appendix B will be dedicated to the critical problems recorded, whether they originate from bringing the code into situations it is not accustomed to or from compatibility problems. A bug list is also available in the documentation of the code on the CD.

4.4.1 Documentation

The method used for documenting the code is the Doxygen source documentation suite, which allows specially formed comments in the code to be extracted and presented. The presentation format is optional, spanning web pages and man pages for on-line documentation, and LaTeX code and RTF documents for paper documentation. The major strength of being able to generate documentation from specially commented source files is documentation that is, in principle, always up to date. Documenting is slightly more complicated, but as can be seen from figure 4.3, the notation using tags takes little adaptation for developers accustomed to LaTeX and HTML. If special typesetting is needed, Doxygen supports embedded LaTeX code for e.g. mathematical expressions.

/** \fn void mmul( matrix *ptm1, matrix *ptm2, matrix *ptm3 )
 * Matrix multiplication ptm1 = ptm2 * ptm3
 *
 * \param[out] *ptm1 Pointer to result matrix
 * \param[in]  *ptm2 Pointer to first argument matrix
 * \param[in]  *ptm3 Pointer to second argument matrix
 * \attention  *ptm1 can not be equal to *ptm2
 */

Figure 4.3: Doxygen documentation example for the IAUmat real-time matrix library, matrix multiplication function.

To give an idea of the documentation format, the function description generated from the documentation code in figure 4.3 is depicted in figure 4.4. The figure shows some of the advantages, namely hyperlinked connections to definitions of the structures used, links directly to a syntax-highlighted source file where the function is implemented, and which variables and function calls are used. In addition to the function description shown here, the documentation has comprehensive menus with easy access to files, functions, variables and global definitions, and the option of a search engine that searches across all documented source files.

Figure 4.4: Screen shot of the on-line documentation generated from the Doxygen code depicted in figure 4.3.

Complete project documentation is available on the CD included in appendix J. A shortcut list to the program documentation is available in the root of the CD.
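To connect the Doxygen comment in figure 4.3 with actual code, the following sketch shows what the documented function could look like. The matrix struct here is an assumption made for the example (row-major, fixed storage); the real IAUmat data layout is not shown in this chapter and may differ.

```c
/* Illustrative matrix type -- not the actual IAUmat definition. */
typedef struct {
    int rows, cols;
    double d[16];   /* fixed row-major storage keeps the sketch allocation-free */
} matrix;

/* Matrix multiplication ptm1 = ptm2 * ptm3.
 * As the Doxygen \attention note says, ptm1 must not alias ptm2:
 * each result row is written while rows of ptm2 are still needed. */
void mmul(matrix *ptm1, matrix *ptm2, matrix *ptm3)
{
    ptm1->rows = ptm2->rows;
    ptm1->cols = ptm3->cols;
    for (int i = 0; i < ptm2->rows; i++)
        for (int j = 0; j < ptm3->cols; j++) {
            double s = 0.0;
            for (int k = 0; k < ptm2->cols; k++)
                s += ptm2->d[i * ptm2->cols + k] * ptm3->d[k * ptm3->cols + j];
            ptm1->d[i * ptm3->cols + j] = s;
        }
}
```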


Chapter 5

Models

Modeling in this project consists of two parts, the sensors used and odometric models of the vehicles. An introduction will be given to the sensors used in the project before describing the models, because some of the parameters describing the sensors are used to calculate the noise acting on the vehicle models.

5.1 Sensors

Whether indoor or outdoor, the need for positioning and safety is imperative. A necessary part of this is the ability to balance the use of sensors and available information in order to get the best result. A short introduction to position sensors will be given here, along with the different properties of the sensors chosen for this project.

Absolute sensor Such as a GPS sensor or a beacon system, gives an absolute position in relation to an external coordinate system. The absolute sensor has its strength in being able to give an absolute position with a bounded error. The downside to this type of sensor is normally a relatively poor precision, and in some cases the sample time is slow in comparison to relative sensors.

Relative sensor Such as a gyro or odometry, gives a relative change in position according to the internal coordinate system of the robot. The strong points are the high update rate and that it is totally self-contained. The downside is usually drift of the calculated position, which grows without bounds if it is not corrected by an absolute measurement periodically.

The above sensors are used for measurements of motion, but in the case of vehicles there are also other sensors for the surroundings, like range measurement sensors and cameras. This kind of sensor also has some value when


doing positioning, but a solution based on that will not be treated here. For an introduction to that topic, please refer to [Sejerøe, 2004]. For a more thorough introduction to sensors and robotics, refer to [Borenstein et al., 1996].

Faults

All sensors can fail in one way or another. This can be the GPS signal falling out permanently or the odometry sensor cable coming off. A total failure of one of the sensors will reduce performance drastically and possibly render the system useless. Periodic errors like GPS fallout, and odometry errors like a wheel puncture, can be handled to a degree, but again the performance of the filter will suffer. Detection and handling of errors will not be treated in this report, but an introduction to the topic is available in [Blanke, 2003].

5.1.1 GPS

The Global Positioning System (GPS) is a system that uses satellites to determine a position on Earth. Just as old maps were made more accurate by using trigonometry to correctly position well-known fix points, the GPS system uses well-known satellite positions and precise measurements of the distance from the GPS receiver to the satellites to calculate a position on the surface of the Earth. The GPS system is administered by the United States military, which is both positive and negative. On the positive side it is being maintained and renewed, but a possibility has been built in to turn down the precision to 100 m. The technique used is called Selective Availability. This option was switched off in the year 2000, but it is still possible to activate. This does not make the system useless, since it does not look like it is going to be activated in the near future, but one should not depend solely on these measurements. The European Galileo project1 promises as good or better precision under the control of the European countries, when it is put into service by the year 2008. The introduction of the new European system will improve reliability and precision, and at the same time give more control to the European countries.

The GPS system gives different possibilities when it comes to the precision of the position. One might think that always choosing the best precision would be the best choice, but the price of these products does not allow just anybody to buy them. There are three main types of GPS on the consumer market, and the precision and price are strongly correlated2.

1European version of the GPS system, http://www.esa.int/esaNA/galileo.html

2The prices of the equipment and services are taken from GEOTEAM A/S, who distributes Trimble GPS equipment, and edbpriser.dk, who supplies prices on consumer electronics.


GPS The basic stand-alone version of the GPS sensor. The system is at the moment being improved by adding more information to the signals the sensor receives. This makes the precision better and is implemented in most new GPS receivers. The new version of the system is known as GPS WAAS in North America or GPS EGNOS in Europe. Price: €100 for equipment.

DGPS DGPS works by having base stations on the ground which send correctional data to the GPS receiver. This is a more expensive solution, since it demands an extra radio receiver and a fee for the correctional signal. Price: €3500 for equipment + €650 a year.

RTK-GPS RTK-GPS is very expensive, since it demands two GPS stations and synchronization between the two. RTK-GPS has by far the best precision and speed, but the price makes it less desirable. Price: €30000 for equipment + €3500 a year.

The precision information available from the manufacturers is shown in table 5.1. The values mentioned here are the average precision errors (95% of the time). This information is valuable as a first guess for modelling the GPS receiver, but in reality the description of the precision is more complex, showing much smoother results when the receiver is moving than when the receiver is stationary.

The sample times of the normal receivers are the same, but the RTK-GPS has a much lower sample time. This is a big advantage, since more information effectively lowers the statistical variance of the RTK-GPS even further than the improved precision alone.

There is an abundance of different reference frames used for navigation and positioning purposes. The most common when working with GPS is the Latitude/Longitude reference frame3, but it is not very intuitive for engineers working with robots. The UTM coordinate system, on the other hand, has the globe divided into zones where the frame is a Cartesian coordinate system with meter units4. The UTM coordinate system is the one currently used at IAU, but the Lat/Lon to UTM converter has shown some inconsistency with the real world. All of these precision measurements refer to mean errors, but as documented in [Nielsen & Mejnertsen, 2006] the RTK-GPS can drift during operation, displaying offsets in position of as much as 20-30 cm, observed during test runs over 10 hours. It is not known what caused these errors, which were observed by watching the tracks the tractor made on the snowy ground. Looking in the log files did not reveal a positional drift, which is the main problem with using the GPS

3Lat/Lon is the reference frame known from sailing, the round globe and most poster maps showing the world.

4UTM is a variant similar to the maps used by e.g. the army.


[Figure omitted in this extraction: scatter plot "GPS test 24/04-06" of GPS wo. EGNOS positions, World Easting [m] +720100 (UTM32) vs. World Northing [m] +6187510 (UTM32).]

Figure 5.1: Test of GPS with EGNOS disabled.

system. The GPS gives an absolute position, but it can apparently drift slightly during operation.

GPS                        Precision (95%)   Sample time
GPS w. SA                  100 m             1 s
GPS wo. SA                 15 m              1 s
DGPS wo. SA                3-5 m             1 s
GPS WAAS / EGNOS wo. SA    3 m               1 s
RTK-GPS                    0.03 m            0.05 s

Table 5.1: GPS precision

Manual analysis of data logged from a real GPS, shown in figure 5.1, reveals that the noise is a random walk that changes within the specified error interval. This is not surprising, since the position is deduced from moving satellites. Modeling this exactly is not possible, and hence the noise model chosen for the GPS is Gaussian white noise, with a magnitude that matches the specified error interval from table 5.1.

A simple model of the GPS with WAAS / EGNOS has to be implemented in the simulator software. The sensor characteristic is implemented as the true simulated position, to which Gaussian white noise with the standard deviations in table 5.2 is added. This simplification does not take any offset of the measurement into account, but since no absolute ground truth is available when testing, the offset will not influence the performance of the


sensor, as long as it varies sufficiently slowly.

GPS                        σ
GPS wo. SA                 5 m
GPS WAAS / EGNOS wo. SA    1 m

Table 5.2: GPS precision in simulator.

5.1.2 Odometry

This form of sensor is based on encoders attached to the wheels or the motor. By measuring how far the wheels have turned, it is possible to calculate how far the vehicle has moved and how the heading has changed. If this calculation is done often enough, the small changes in distance and direction can be summed up to give a representation of position and heading relative to the starting point. The result of this method is highly dependent on knowing the vehicle proportions precisely, since any error will be added to the position, which will then drift. The result of a drifting position due to errors can be seen in figure 5.2. The error imposed is a 0.57% larger wheel diameter on both wheels of a differentially steered vehicle. The vehicle is following a line on the floor, and in the lower part of the figure it is driving in its own tracks. From the figure, the unbounded drift of the odometric position is clearly visible. This means that if the vehicle keeps driving, the position will get increasingly worse.

Like the GPS, the sample time for the odometry varies between the two robots. The MMR has a sample period of 0.01 s, while the HAKO tractor has a sample time of 0.1 s.

5.1.3 Summary

Based on the sensor features above and the sensors available on the MMR and the HAKO tractor, the odometry and GPS are good candidates for getting a better position estimate by fusing the measurements from the two sensors. Performance cannot be evaluated at this moment, but the choice of an absolute and a relative sensor is optimal. The absolute sensor choice is interesting because of the two different GPS sensors, which gives the opportunity to compare the difference in performance. The differences in sample time are important for the implementation of the solution, both in the robot control software and in the simulator.


[Figure omitted in this extraction: plot "SMR following line", internal x-coordinate vs. internal y-coordinate, comparing the drifting odometry with the true track.]

Figure 5.2: Drifting odometry due to a 0.57% error in the odometric calibration.

5.2 Vehicles

There are two vehicle structures which are interesting to work with in this thesis: the Ackerman steering, known from automobiles and most tractors, and the differential steering, mostly known from wheelchairs, tanks and some very large tractors. The models are of equal interest in this case, even though the differentially steered platform is the most accessible for testing.

Just as the GPS system has an absolute reference frame in which the positioning is done, the vehicles have their own internal coordinate system. This convention is used to describe and track the movements of the vehicle relative to the starting position and heading, as illustrated in figure 5.3. The main point of interest is the obvious offset in both position and heading, which the sensor fusion should compensate for by describing all of the gathered information in the absolute coordinate system.

Figure 5.3: Absolute and relative coordinate system for a vehicle.

5.2.1 Ackerman Steering

The HAKO tractor is not the only robot that uses Ackerman steering; another is the Ackerbot [Nielsen & Mejnertsen, 2004], a small vehicle intended for indoor navigation. The Ackerman odometric model has been used before, and the precise formulation of the mathematics has also been verified against other literature such as [Tur et al., 2005] and [Dudek & Jenkin, 2000]. The model will be explained in discrete time in the report, but both the continuous time and the discrete time models are available in appendix F.

The odometric model is based on the geometry of the vehicle motion and a simplification of the movement by the uni-cycle principle [Borenstein et al., 1996]. The main property of the Ackerman vehicle moving in a circle is that the center points of the front and rear axles cover concentric arcs, see figure 5.4. The three parameters that mainly decide the odometric motion of the vehicle are L, the distance between the axles, φ, the steering angle, and dk, the driven distance. R is the radius of the vehicle movement.

Discrete time

x and y are the coordinates of the center point between the non-steering back wheels, θ is the heading of the vehicle and dk the driven distance of the vehicle. In the discrete case the variables have a subscript index that indicates the discrete time instant the value belongs to.

The change in vehicle heading δθk is calculated from the steer angle φk as

δθk = (δdk / L) · tan(φk)  (5.1)

The system states are represented as xk and contain the vehicle odometric representation as coordinates x, y and vehicle heading θ.

xk = [x y θ]ᵀk  (5.2)


Figure 5.4: Ackerman model in discrete time.

The system control input is represented as uk and contains the covered distance δdk and the steer angle φk.

uk = [δdk φk]ᵀ  (5.3)

Combining all this into an odometric model of the vehicle gives the following model.

xk+1 = xk + δdk · cos(θk + δθk/2)  (5.4)

yk+1 = yk + δdk · sin(θk + δθk/2)  (5.5)

θk+1 = θk + δθk  (5.6)

The term δθk/2 compensates for the fact that the vehicle drives neither in the direction of the starting angle nor in the direction of the finishing angle of the movement. The time update is therefore carried out with the starting angle plus half the change in angle in that time update. The same applies to the differential steer model.

Model Noise

To get the Kalman filter to run as close to the optimum as possible, it is necessary to have an idea of the size of the noise in the system. This is necessary because the noise used in the design of the Kalman filter is assumed constant and is not estimated online by the filter. The values calculated in [Nielsen & Mejnertsen, 2006] will be used in the simulations, to make it possible to compare the simulated test runs. As described in [Nielsen & Mejnertsen, 2006, pp.32] and [Larsen, 1998, p.49], the calculation of the noise is based more on the rule of thumb that the noise has to be big enough, and tuning is done manually. Better noise assumptions can be made from extensive tests with the actual vehicle, but small changes in the vehicle configuration will require new tests.

State Noise

The state noise in the Ackerman steered vehicle is in this case the quantization of the steering angle sensor and of the odometry. Quantization errors can be calculated, but errors from models can only be based on guesses and experimental work. The noise is chosen relatively high, so that it also compensates for modeling errors.

Noise source   σ
δθ             0.01 rad
δdk            0.02 m

Table 5.3: Input noise - Ackerman model.

Measurement noise

The measurement errors in the Ackerman steered vehicle are in this case determined by the precision of the RTK-GPS. The values used in [Nielsen & Mejnertsen, 2006] will be used in the simulations, to make it possible to compare the simulated test runs. Again the values have been chosen larger than the theoretical ones, because the overall performance of the filter has proven better with this configuration.

Noise source    σ
UTM easting     0.1 m
UTM northing    0.1 m

Table 5.4: Measurement noise - Ackerman model.


5.2.2 Differential Steering

The differential steer model is interesting, as this is the steering method of almost all the AGVs at IAU, where this work is carried out. An interesting part of the work is the opportunity to test the solution on IAU's MMR. The odometric model is well documented in internal reports like [Larsen, 1998], and the formulation of the mathematics has been verified against [Dudek & Jenkin, 2000]. Only the discrete version of the model will be described here; the continuous model is available in appendix G.

Discrete time

The discrete model of the differentially steered vehicle is depicted in figure 5.5, which presents the variables and geometry. The parameters that mainly describe the odometric motion of the vehicle are the driven distances of the two wheels and B, the distance between them. The driven distance of each wheel is proportional to the diameter of the wheel, represented by dwr and dwl in figure 5.5.

Figure 5.5: Differential model in discrete time.

For this vehicle, the system dependent parameters have already been used in calculations by the robot control program and are therefore not used in the filter equations.

The system states are represented as xk and contain the vehicle odometric representation as coordinates x, y and vehicle heading θ.

xk = [x y θ]ᵀk  (5.7)

The system control input is represented as uk and contains the covered distance δdk and the change in rotation of the robot δθk.

uk = [δdk δθk]ᵀ  (5.8)

Combining all this into an odometric model of the vehicle gives the following model.

xk+1 = xk + δdk · cos(θk + δθk/2)  (5.9)

yk+1 = yk + δdk · sin(θk + δθk/2)  (5.10)

θk+1 = θk + δθk  (5.11)

The differential model is almost the same as for the Ackerman steered vehicle, but since the input vector to the model is not the same, the linear model used in the filter algorithms and the noise vector will not be entirely the same.

Model Noise

As for the Ackerman steered vehicle, there are two different kinds of noise in the system - state noise and measurement noise. The input noise develops internally in the vehicle from model imperfections and from quantization errors in the parameter deviations and encoders. The measurement noise comes from external sources and is in this case the noise on the GPS sensor used to measure the absolute position.

Modeling state and measurement noise for a system is not trivial, so a very common approach is to use Gaussian white noise as a model; in this case the noise is further assumed to be uncorrelated. The problem is dealing with the non-systematic errors that occur when driving the vehicle in the real world. Gaussian white noise is a mathematically simple model, which describes the noise as normally distributed with a zero mean value. The second reason this approach is used is the later use in Kalman filtering. Since this filter is all new, the noise variance has to be calculated and evaluated. The derivation of the noise covariance matrix is not done here, but a guideline can be found in [Larsen, 1998, p. 48].

State Noise

The state noise in the model comes in this case from the wheel encoder readings, which affects both the traveled distance δdk and the change in vehicle heading δθk.

In the equations below the dd subscript refers to the noise generated by one wheel encoder tick.

σ²dk = σ²dd / 2  (5.12)

σ²θk = 2σ²dd / B²  (5.13)

From statistics we know that 99.73% of the samples of Gaussian white noise lie within a ±3σ band of the mean value. This means that σdd can be calculated from the tick-to-meter ratio TTMR of the different differentially steered vehicles.

σdd = TTMR / 3  (5.14)

The results in the following table are based on a TTMR of 1.1505 · 10⁻⁴ m and a wheel distance B of 0.46 m. With these values the standard deviation of the MMR quantization noise is

σdd = (1.1505 · 10⁻⁴ m) / 3 = 3.835 · 10⁻⁵ m  (5.15)

Noise input    σ
δdk            2.7 · 10⁻⁵ m
δθ             1.2 · 10⁻⁴ rad

Table 5.5: Input noise from quantization - differential model.
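The chain of calculations in eq. 5.12-5.15 can be condensed into a small helper, which reproduces the table values; the names are illustrative, not taken from the robot software.

```c
#include <math.h>

/* Quantization noise standard deviations (eq. 5.12-5.15) from the
 * tick-to-meter ratio ttmr and the wheel distance B. */
typedef struct { double dd, ddist, dtheta; } quant_noise_t;

quant_noise_t quantization_noise(double ttmr, double B)
{
    quant_noise_t n;
    n.dd     = ttmr / 3.0;            /* eq. 5.14: 3 sigma = one tick    */
    n.ddist  = n.dd / sqrt(2.0);      /* sigma on delta-d, from eq. 5.12 */
    n.dtheta = sqrt(2.0) * n.dd / B;  /* sigma on delta-theta, eq. 5.13  */
    return n;
}
```

With TTMR = 1.1505 · 10⁻⁴ m and B = 0.46 m this gives σδdk ≈ 2.7 · 10⁻⁵ m and σδθ ≈ 1.2 · 10⁻⁴ rad, matching table 5.5.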

The noise calculated here is only the part generated by the quantization of the wheel encoders. This is the noise that can be calculated from the specifications of the vehicle, but there are several other noise sources that need to be contained in the above parameters. If the vehicle travels on uneven ground or there is wheel slippage, those contributions are larger, but they are not possible to calculate directly. A final choice of input noise for this vehicle must be based on tuning and experience. The difficulty here is that as the conditions change, so does the optimal choice of parameters.

The calculation of the noise is based on an ideal model, which is never the case when working with real life implementations. In case of model imperfections, e.g. in wheel size, it is necessary to choose an even higher variance to suppress the imperfection if one does not model it. The noise values in table 5.5 will therefore not be used in real life; instead, higher values are used, see table 5.6. What can be used from the above calculation is the relation between the sizes, which shows that the variance on the angle is bigger than that on the driven distance. As with the Ackerman steered vehicle, some experimental work could be done to get a better estimate of the modeling imperfections, but such a study has not been carried out as part of this project.


Noise input    σ
δdk            1.0 · 10⁻³ m
δθ             2.0 · 10⁻³ rad

Table 5.6: Input noise used in calculations - differential model.

Measurement noise

The measurement errors in the differentially steered vehicle are in this implementation the GPS measurement errors from a consumer grade GPS. The standard deviations of the measurement noise are listed in table 5.7.

Noise signal    σ
UTM easting     3 m
UTM northing    3 m

Table 5.7: Measurement noise - differential model.

5.2.3 Summary

Having modeled the vehicles and verified the results against the literature, the models and the noise assumptions can be used in the extended Kalman filter for sensor fusion. The noise has been hard to model, and that does leave some tuning for the end user to carry out. Investigations of the noise sources in the vehicle would also benefit the filter result, but time has not permitted an in-depth test.
The GPS specification does suggest some very good initial guesses for the variances, but in reality the ideal choice is most likely higher than the one specified in this chapter.
The odometry noise is especially hard to model, since the measurements are susceptible to several non-systematic errors. Wheel slippage and rough terrain are mentioned above, but other factors like uneven wheel size from different tire pressure or a leaning road can easily throw the odometry off. As with the noise for the GPS, this means that in real life the variances will need to be bigger, to compensate for the model imperfections and the imposed noise.


Chapter 6

Kalman Filter

Kalman filtering was first introduced in 1960, and has since become a widely used estimator for both linear and nonlinear systems. The basic filter is often used as a first attempt at a filtering algorithm, which can then be customized into more advanced filter variants. The filter has some very good properties, such as allowing measurements to arrive at different times and at different frequencies, and being able to fuse them upon arrival. The filter supports time varying system equations and measurements, and for a linear system with Gaussian noise it delivers an optimal solution to the problem at hand [Larsen, 1998]. If the system is nonlinear the filter will no longer be optimal, and other filters like the extended Kalman filter can perform better.

In this chapter the extended Kalman filter will be treated, because the models derived in chapter 5 are nonlinear and the extended Kalman filter is designed to handle that property. The difference between the normal Kalman filter and the extended Kalman filter is the derivation of the Jacobians in every sample, to suppress the nonlinearity of the system model. This does however make the system error covariance matrix Qk an approximation, and hence the filter estimate is no longer optimal. In a lot of practical cases this does not pose a problem, since the filter will show good performance anyway - but this cannot be guaranteed. In the basic Kalman filter the noise model used is Gaussian white noise with zero mean. This noise model has simple mathematical properties, which is why it is used in the basic Kalman filter [Hendricks et al., 2003].

The Kalman filter exists in both a continuous and a discrete form. The filters implemented in this report are in discrete form, since all data are available in discrete time and the on-board computers are to be used for the calculations. The discretization usually requires some considerations when choosing e.g. the sample time, but in the case of the current robot system the sample times have a lower limit defined by the Linux operating system.


Since the estimation is usually run at a higher frequency than the control algorithm, the 10 ms loop time of the Linux operating system is the best obtainable solution.

6.1 Discrete Extended Kalman Filter

Here a short introduction to the extended Kalman filter is given. No derivations are made in this text; for a more detailed treatment of the Kalman filter refer to [Hendricks et al., 2003]. The general nonlinear discrete system model is given by equations 6.1 and 6.2, where xk is the system state vector, uk is the control signal vector, vp is the state noise vector and vm is the measurement noise vector. The noise is defined as Gaussian white noise with zero mean.

xk+1 = f(xk, uk, vp) vp ∈ N(0, Rp) (6.1)

yk = g(xk, uk, vm) vm ∈ N(0, Rm) (6.2)

This representation cannot be used directly to calculate a filter, but when linearized with respect to xk, uk and vp by deriving the Jacobians, the following system emerges.

xk+1 = Fkxk + Gkuk + Gvpvp  (6.3)

yk = Ckxk + Dkuk + Gvmvm  (6.4)

The Jacobians used in equations 6.3 and 6.4 are defined as follows, all evaluated at the current state estimate x⁺k and zero noise.

Fk = ∂f/∂xk |xk=x⁺k, vp=0    Gk = ∂f/∂uk |xk=x⁺k, vp=0    Gvp = ∂f/∂vp |xk=x⁺k

Ck = ∂g/∂xk |xk=x⁺k, vm=0    Dk = ∂g/∂uk |xk=x⁺k, vm=0    Gvm = ∂g/∂vm |xk=x⁺k

With the linearized equations it is possible to solve the discrete version of the Lyapunov equation, which minimizes a least squares index, effectively minimizing the variance on the output. This can be done in two different ways, the open and the closed form, both of which will be explored.

Open form

The open form Kalman filter is made up of two steps, where the initial prediction in principle should give a marginally better performance of the open form over the closed form, due to the updated information used in the further calculations. The first step updates the old estimate with the newly available information, thereby improving the estimate of the system states. The second step corrects the estimate using the Kalman gain, calculated from the updated error covariance matrix of the prediction. The last equation of the correction is the time update of the error covariance matrix.

Prediction

x⁻k = Fk−1 x⁺k−1 + Gk−1 uk−1  (6.5)

Q⁻k = Fk−1 Q⁺k−1 Fᵀk−1 + Gvp,k−1 Rp Gᵀvp,k−1  (6.6)

Correction

Lk = Q⁻k Cᵀk [Ck Q⁻k Cᵀk + Rm]⁻¹  (6.7)

x⁺k = x⁻k + Lk [yk − Ck x⁻k]  (6.8)

Q⁺k = [I − Lk Ck] Q⁻k  (6.9)

Closed form

In this form of the Kalman filter the prediction and the time update are combined into one action.

Lk = Fk Qk Cᵀk [Ck Qk Cᵀk + Rm]⁻¹  (6.10)

xk+1 = Fk xk + Gk uk + Lk [yk − Ck xk]  (6.11)

Qk+1 = (Fk − Lk Ck) Qk Fᵀk + Gvp Rp Gᵀvp  (6.12)

6.2 Choosing a Filter

In [Larsen, 1998] and [Nielsen & Mejnertsen, 2006] the estimation algorithm used is slightly modified. The algorithm used is an adaptation of the above open form, with a nonlinear prediction and a normal linear correction. No explanation is given as to why this solution is chosen. An investigation of sensor fusion based on both algorithms is documented here, using a simple test program simulating an Ackerman steered vehicle driving in circles.

From figures 6.1 and 6.2 it is obvious that there is a problem with the linear extended Kalman filter. The poor result of the linear estimate is due to the linearization of the prediction, since this is the only part of the algorithm that differs. Time has not been spent researching this problem in detail.

Figure 6.1: Difference in the estimate of position, using the linear and nonlinear prediction method.

Figure 6.2: Difference in the estimate of angle, using the linear and nonlinear prediction method.

The result of this test has led to the decision that the extended Kalman filter with nonlinear prediction and linear estimation will be used for the sensor fusion, as in the previous projects. Even though the linear prediction method has not proven to be optimal, it is still interesting and will be implemented in the software for completeness and testing purposes, like the test just described.

6.2.1 Linearized Models

The Kalman filter that is to be implemented is based on a linearized repre-sentation of the models in chapter 5. The linear equations can be found inappendix F p. 145 and appendix G p. 153.

6.3 Sample Time and Delay

When working with the odometry and GPS measurements, it is necessary to handle the difference in sample time. With the HAKO tractor it is not a problem, since the sample times of the odometry and GPS are identical at 0.1 s. In that case no special care has to be taken, as long as the GPS measurement is valid; the odometry and GPS measurement are fused in every sample. The MMR however is different, because the odometry has a sample time of 0.01 s and the GPS a sample time of 1 s. In this case the estimate is updated every 0.01 s with the information from the odometry via the prediction. When the GPS measurement is received every second, it is used to update the estimate in that particular sample via the correction.
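The multi-rate scheme for the MMR can be sketched as below, with the predict and correct bodies stubbed out as counters; the real loop would call the filter routines instead, and all names are assumptions made for the sketch.

```c
/* MMR multi-rate fusion: the prediction runs in every 0.01 s sample,
 * while the correction only runs in the samples where a GPS fix has
 * arrived, i.e. every 100th sample (once per second). */
static int predictions, corrections;
static void ekf_predict(void) { predictions++; }  /* odometry update */
static void ekf_correct(void) { corrections++; }  /* GPS measurement */

void fusion_loop(int n_samples)
{
    for (int k = 0; k < n_samples; k++) {
        ekf_predict();               /* every 0.01 s            */
        if ((k + 1) % 100 == 0)      /* GPS sample, once per 1 s */
            ekf_correct();
    }
}
```

Over a 10 s run (1000 samples) this performs 1000 predictions but only 10 corrections, mirroring the sample time ratio described above.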

Handling of delayed measurements is not treated in this report, but a solution for dealing with this property is described in [Larsen, 1998].


6.4 Summary

Having surveyed the literature and tested the basic algorithms, the filter chosen for the sensor fusion algorithm is the open form Kalman filter, with a nonlinear prediction and a linear correction of the estimate. The noise used with the models will be uncorrelated Gaussian white noise, with a noise variance for which the filter proves acceptable when driving the robot. This selection will be done by hand tuning, since changes in vehicle dimensions, track conditions and sensor input conditions have great influence. This means that the filter will not be optimal in the way the literature describes it, but the main idea of using this sensor fusion is to improve the estimate of the position and heading.


Chapter 7

Software

This chapter is an introduction to the software developed during this project, with an emphasis on design considerations, configuration and special algorithms used. The complete source code and the simulations run during the project can be found on the CD in appendix J.

7.1 Main Guidelines

The software developed in this project has two major reference groups, EndUsers and Developers.

End User The finished product is to be easily accessible to the end user and not require a lot of routines to get it to work. Changes in the configuration are to be made in documented XML files, and daily use should be straightforward.

Developer The developer should be able to get a quick overview of the code from the documentation. The current solution is not user friendly, since the documentation is sparse, and the addition of in-code documentation that is easy to maintain is of the essence for future work. Developing new routines is made easy by the structure of the implemented solution, but knowledge of C programming and control theory is essential.

Structurally the programs will be kept in line with the current programs, reusing proven solutions and libraries. The main structure of the robot control software was depicted in figure 4.1, but to refresh the reader's mind it is shown again here, in figure 7.1.

Three major components have been implemented in this project, and they will be described here starting from the top of the structure in figure 7.1.


Figure 7.1: Structure of the connections between the project compo-nents.

7.2 Robot Control Program

IAU is updating the robot control program, called SMRdemo, on a continuous basis, and the software has undergone several changes since it was conceived. The most recent change is the addition of HAKO tractor support, but as in all other products that develop in an iterative way, the structure and source code become less stringent. Flow charts of the original code have been drawn up to visualize the program and data flow, see appendix E pp. 136. Analysis of the structure of the original SMRdemo has shown that changes have to be made if the estimators are to be called from one section of the code. Flowcharts of the initialization and configuration show that they do not need restructuring to cope with the implementation of the Kalman filter. To keep in line with the overall design considerations for the robot control software, the configuration will be ported to XML. The main reasons for doing this are the need for comments in the configuration file and a robust parsing method.

7.2.1 Structure

The new implementation is a proposal as to how one can improve the flow of the program. Currently the code is split up into sections where the different vehicles are separated, and even though many of the actions are the same, they are performed in several places. The idea with the new structure of the code is to group actions instead, making the overall flow more visible. This calls for a better configuration of the robots, since the robot platform configuration and the available sensors are now accessible in all four sections of the new main loop. Another benefit of these changes is better handling of initialization errors or sensor malfunction, through flags indicating the present status of the system.


Figure 7.2: Structure of old and new SMRdemo program.

7.2.2 Configuration

The configuration of the robot control program has been converted to XML files, heavily inspired by the old file format. All parameters have been kept with the same names and structure in order to make the transition as smooth as possible. An example of the file format can be seen in appendix G.3.2, which is an XML configuration file for the MMR.

7.2.3 Summary

Restructuring the main loop of the program is invisible to the End User, but configuration of the system by XML will improve the user friendliness of the program. The Developer is the one who benefits the most from the solution, since the program has been sectioned into the logical actions performed when dealing with mobile robots. Finally, the sections of the code that have been developed are now documented, so that documentation can be generated with Doxygen.

7.3 Kalman Filter

The implementation of the sensor fusion Kalman filter spans three areas - matrix calculations in C code, Kalman filter algorithms and configuration. The matrix library is a critical component, since it is to be used in the real time robot control software. The Kalman filter library has been implemented with both the open form and the closed form textbook Kalman filter algorithms. For the estimation used in this project, a nonlinear prediction method has been included for both the Ackerman and differential vehicles. The structure can be seen in figure 7.3.

Figure 7.3: Structure of sensor fusion library.

In compliance with the design considerations, a small test program has been used: a simple simulation of an Ackerman steered vehicle running with a constant steer angle. The noisy GPS measurement implemented is a noise source driven by the C random number generator, added to the true position.

7.3.1 Matrix library

Kalman filtering is based on matrix calculations, which is not a feature available in the standard C libraries. An external library is needed for the calculations, and since there are many alternatives, a comprehensive study was made. The complete result of the study can be found in appendix E, but a short recap of the result is presented here. The study is aimed at bringing matrix and vector calculations to the institute toolbox. The process of choosing the right library is done in two stages: first the technical part is dealt with, and second a subjective evaluation of the user friendliness is made. The toolboxes are evaluated on the following properties.

• Real time compatible.

• Mature product, meaning no development version or single personsupport.

• Should contain the basic matrix operations.

• Working with the library should be intuitive.

Five matrix libraries were considered; they are listed in table 7.1. The testing gave two finalists, the GNU Scientific Library and IAUmat. IAUmat


Name                           Reference
OpenCV                         [OpenCV, 2006]
GNU Scientific Library         [GSL, 2006]
Meta Matrix Library            [Frochte, 2006]
The Matrix Template Library    [Gottschling, 2006]
IAUmat                         [Ravn et al., 2006]
(BLAS)T                        [BLAS, 2006]

Table 7.1: Matrix library study - candidates

is a matrix library developed by staff at IAU, and it has one quality that separates it from all the others: it is intended for real time computation and features options like embedded support. Implementation of code using the IAUmat library is very user friendly, as the example code on page 128 shows. One drawback of IAUmat is the lack of specialized external libraries, such as linear algebra, and of external maintenance. GSL proved to be reliable, but not as intuitive as the IAUmat solution, see page 133. GSL has a very extensive feature list, but it suffers from a lack of documented real time capable algorithms. The implementation of the Kalman filter is therefore done with the IAUmat matrix library. The documentation of IAUmat has been proofread and ported to Doxygen. This is a major step towards making it more user friendly and visible.

7.3.2 Configuration

Configuration of the Kalman filter module is done with a configuration section like the one seen in figure 7.4, which is placed in the robot XML configuration file. This section of the configuration lets the End User tune the

<filter
    type="EKF"
    run="1"
    use="0"
    measurementnoise_std_x="0.1"
    measurementnoise_std_y="0.1"
    processnoise_std_angle="0.01"
    processnoise_std_dist="0.02"
>
</filter>

Figure 7.4: Configuration example for the odometric Kalman filter - HAKO tractor.


filter by changing the weights on the noise that has been modeled. Program flow is configured by two flags, telling whether the filter is to be run and whether the result from the filter should be used in the operation of the robot. The flags can be modified at run time via SMR-CL commands, see appendix E. The naming convention has been kept the same as in [Nielsen & Mejnertsen, 2006] in order to keep backwards compatibility.

7.3.3 Summary

Kalman filter algorithms providing the sensor fusion capability of the toolbox have been implemented, along with the basic textbook open and closed form Kalman filter algorithms. The result is a reimplementation of the HAKO tractor sensor fusion and a new implementation of a sensor fusion algorithm for the differentially steered MMR robot.
The basic tool used is the real time capable IAUmat matrix library, which is an improvement over previous work such as [Nielsen & Mejnertsen, 2006]. From the End User's point of view, this implementation of the Kalman filter using a real time library will not be visible under normal operation; only when the system is running under very high loads or over very long periods of continuous operation will it be noticeable. For the Developer the new implementation is a useful tool, since it improves the readability of algorithms and documentation.

7.4 Simulator

The simulator is a valuable tool when it comes to designing Kalman filters and developing the support code to integrate them with the SMRdemo program. Being able to simulate the robots saves a tremendous amount of time, but is not a complete substitute for running the real life tests. The simulator structure will be briefly described, and additions to the existing simulator will be shown. For an in depth treatment of the simulator, refer to [Nielsen & Mejnertsen, 2006].

7.4.1 Structure

The simulator is built upon socket connections between the robot control program and the external sensors. Several robots and sensors are available, and especially the HAKO tractor is implemented in great detail. To give an idea of the structure, the main building blocks are depicted in figure 7.5.

No structural changes have been made to the simulator, but a new GPS sensor has been added to the system. The handling of the socket connections has been revised, since errors were encountered when running the communication between the GPS and SMRdemo.


Figure 7.5: Structure of robot simulator.

7.4.2 GPS Sensor

The GPS sensor uses a socket connection to communicate with SMRdemo. The reason for this is not to be restricted to the SMRdemo protocol; changes are easier to implement if the interface protocol needs adjustments.
The protocol for the GPS connection supports sending all GPS data except for individual data about the satellite pose in space. Both the simulator's connection to SMRdemo and SMRdemo's connection to the simulator are configured in the XML configuration file for the individual robot.
The GPS signal is simulated as the true position of the vehicle in the simulator with added white Gaussian noise. The variance of the noise is configurable from the vehicle XML file, where the sensor is defined, see figure 7.6.

<sensor
    type="gpsmouse"
    classname="GpsMouse"
    id="200"
    resolution="0.01"
    northingoffset="0"
    eastingoffset="0"
    northingstderror="1"
    eastingstderror="1"
    port="9500">
    <position x="0.4" y="0.0" />
</sensor>

Figure 7.6: Simulator configuration example for the GPS.


7.4.3 Summary

The simulator has proven to be a valuable tool for the design of modules and sensor fusion algorithms. Adding a new sensor to the simulator was structurally simple, but problems were encountered with the built in communication between the modules. A solution has been implemented that makes simulations possible under some constraints, but a more structural analysis and mapping of the communication methods should be carried out to identify the underlying problem.
To the End User, configuring and running the simulator is made simple by the XML configuration. Predefined environments are available, making the robot simulator easy to get started with.
To the Developer the simulator is a rapid tool for developing and simulating robots and even imaginary sensors. There are some minor design flaws in the placement of information, but these are minor details which can be found in the bug list, appendix B.

7.5 GPS Client and Server

To be able to use the consumer grade GPS, it is necessary to convert the messages from the GPS to a format more suitable for mobile robotics. Working with robots it is more convenient to use the UTM coordinate system than the maritime Lat/Lon coordinate system. A GPS server survey was carried out to see if an existing solution could be found; it is placed in appendix H, and its main result is briefly described here. An in house server, developed in [Holmgaard, 2004], is available, but testing it revealed an error in the coordinate conversion and a difficult multicast socket connection requiring a special network configuration. Time was spent investigating a solution, but it was decided not to use the server. Instead a new client/server solution was developed. To avoid spending too much time on programming, the basic hardware server from [Nielsen & Mejnertsen, 2006] is used as a template, since it has proved to be stable in communicating with SMRdemo and has a working connection to the simulator. To make comparisons to real world data possible, a correct Lat/Lon to UTM conversion algorithm is implemented.

7.5.1 Structure

The GPS parser program UTMgpsd is constructed as a client/server program using a socket to connect the two parts. This is in compliance with the general design considerations, making sockets the main form of communication. The two pieces of software are a server that handles all collection of GPS data and the coordinate transformation, and a library that makes the connection and reception of the data on the client side easy. The


structure of the GPS parser can be seen in figure 7.7. One benefit of making this library is the opportunity to build a client that presents the received GPS data and can be used for debugging.

Figure 7.7: Structure of GPS server and client.

7.5.2 Server

The backbone of the server is the Lat/Lon to UTM coordinate transformation algorithm. The algorithm is extracted from a web based conversion tool1, since well documented conversion algorithms are not easy to come by. The code is available in appendix H.7, p. 169. The algorithm promises sub meter precision, and tests show that it is precise in both UTM zone 32 and zone 33, which are the zones of interest when driving in Denmark. The server is, like all other modules, configured via an XML file. The configuration file makes it possible to configure parameters like UTM zone, client socket port and device ID for the GPS sensor, see figure 7.8.

7.5.3 Client

The core of the client side is the GPS library used to connect to the GPS server. The library libgps contains the relevant functions to connect, read data, write data and disconnect from the server. Configuration of the client is done in XML. The syntax is very much like the server side, but has extra parameters used for flow control when called from the robot control code, see figure 7.9.

1http://www.uwgb.edu/dutchs/UsefulData/UTMFormulas.HTM


<?xml version="1.0" ?>
<!--
    Configuration file for UTMgpsd
    This file is for the server.

    port: Port number
    SBAS: Define if the WAAS/EGNOS augmentation
          system is supposed to be turned on/off [0/1]
    port: Linux device name where the GPS is connected
    zone: Specify UTM zone (32 or 33 for DK)
-->
<UTMgps>
    <!-- Server parameters -->
    <server
        port="9500"
        SBAS="0"
    />
    <!-- GPS mouse parameters -->
    <gps
        port="/dev/ttyUSB1"
        zone="32"
    />
</UTMgps>

Figure 7.8: GPS server configuration example.

7.5.4 Summary

The new Lat/Lon algorithm and simpler socket connection make the GPS server a good extension of the existing robot sensors at IAU. The precision of the Lat/Lon conversion was tested against the on line maps GoogleEarth and KMS.dk. The result shows the promised 1 m precision of the algorithm to be plausible, considering the noise and drift of the GPS position. The EGNOS satellite augmentation feature has been enabled in the GPS, which should give better performance. Tests have been carried out, but no conclusive result was obtained; the precision was the same for the two GPS modes in the relatively short time period of 5 - 10 minutes over which the test was carried out. If the configuration is set up properly on the vehicle or the simulator, the GPS interface is not visible to the End User. If small changes are needed, like the port number or GPS device ID, the XML format makes them easy. The implementation of a new GPS server and library has made configuration and use of the library simple for the Developer. Configuration


<!--
    Information to connect to the UTMgpsd server

    hostname: specifies the hostname of the server
              where the UTMgpsd server is running
    port:     Port number
    SBAS:     Define if the WAAS/EGNOS augmentation
              system is supposed to be turned on/off [0/1]
-->
<gpsmouse
    hostname="localhost"
    port="9500"
    SBAS="0"
    run="1"
    use="1"
/>

Figure 7.9: GPS client configuration example.

of the server and development of applications using the GPS library are made easy by the extensive documentation and example programs available on the CD, appendix J.


Chapter 8

Test

The test is carried out in two steps: off-line tests on the simulator and real life tests. The off-line tests in the simulator are done to tune the filters and to make it plausible that they work in real life. The result in the simulator should be fairly representative, provided the GPS measurements are modeled correctly compared to the data collected from the real life tests. The real life test is both a validation of the noise models used and a good indicator of the validity of the simulator as a filter tuning tool.

The two robots used in the tests are the HAKO and the MMR, since the Kalman filter is to be ported to the HAKO tractor and a new filter has been implemented for the differentially steered vehicles. The HAKO is tested first, but time constraints have meant that this test is only done off-line. This is not optimal, but it gives a good indication of the expected performance of the filter [Nielsen & Mejnertsen, 2006, pp. 66]. The MMR is the most accessible vehicle, available for small experiments in the close vicinity of the IAU building. The ultimate test of the filter is conducted in a public park called Dyrehaven. This test was carried out at the very end of the project, in collaboration with Ph.d. student Jens Christian Andersen from IAU. Extra tests and large plots are available in appendix C; references will be made to this appendix in order to comment on special cases and observed problems. In the figure legends, sensor fusion is referred to as SF.

As mentioned in chapter 5 there are two different coordinate systems to keep in mind: the absolute GPS UTM coordinate system and the relative internal coordinate system of the vehicle. In the following plots the internal vehicle path is corrected to start in the recorded absolute position and heading. This gives a more realistic view of the results achieved. However, it should be kept in mind that the Kalman filter still has to correct the offset in position and heading.

When doing real life tests, the lack of an absolute true position and heading is a major problem. No matter what type of GPS is used and how much calibration time is spent on the gyro, the absolute true position cannot be measured in real life tests with the available equipment. To compensate for this, the visually recorded trajectory of the vehicle is put into the graphs to give an idea of the real world position. The path is recorded as a starting, middle and end point, and a smooth path is then fitted through these three points.

[Plot omitted. Title: "HAKO initial seed test". Axes: World Easting [m] +708065 (UTM32) vs. World Northing [m] +6174164 (UTM32). Legend: GPS measurements; SF position - 0deg; HAKO compensated position; HAKO internal position; SF position - 45deg.]

Figure 8.1: Simulation of HAKO running a straight run to check for convergence of the filter - Position plot.

8.1 HAKO

The tests should ideally be performed on both the simulator and the real tractor, but time constraints have reduced the testing to simulations, in order to make it plausible that the filter will run on the HAKO tractor in real life.
To recap the construction of the Kalman filter used: the filter runs synchronous prediction and correction with a 0.1 s update rate. The variances have been chosen rather large compared to the theory, but they have proven to perform well. To check whether the filter converges as expected, the HAKO tractor is tested with a straight test run. The tractor is driven 10 m with


the filter off and then 50 m with the filter on. The result is very satisfactory. Figure 8.1 shows the position and figure 8.2 shows the angle. The filter position, figure 8.1, converges very smoothly after about 12 s and 1.5 m of driving, and demonstrates the capabilities of the sensor fusion algorithm when running on the HAKO tractor. The absolute heading, shown in figure

[Plot omitted. Title: "HAKO initial seed test". Axes: Time [s] vs. Vehicle heading [rad]. Legend: HAKO internal angle; HAKO compensated angle; SF angle - 0deg; SF angle - 45deg.]

Figure 8.2: Simulation of HAKO running a straight run to check for convergence of the filter - Heading plot.

8.2, also converges very nicely, and does not show any tendency to overshoot the absolute angle. In essence the filter is very well behaved, and thus no adjustments to the Kalman filter design parameters will be made.

To be able to compare the simulated result with the test runs used in [Nielsen & Mejnertsen, 2006], the same three test runs performed in that thesis are run in the simulator. One of the tests, the so called fish tail maneuver, where the tractor shifts its track and direction by turning, reversing and turning - much like a three-point turn in a car - is used as the main test example.

As can be seen from figure 8.3, the estimate of the position is almost spot on from the beginning, no matter what the starting angle of the filter is. The maximum easting error is 7 cm, which is hardly visible in the figure. This is due to the choice of a large starting value for the error covariance matrix Qk and the low noise on the GPS measurement. The initial estimate has been set close to the starting point of the test run, in order to make the plot visually identical to the ones found in [Nielsen & Mejnertsen, 2006].


[Plot omitted. Title: "HAKO with RTK-GPS and odometric filter - fish tail". Axes: World Easting [m] +708090 (UTM32) vs. World Northing [m] +6174170 (UTM32). Legend: GPS measurements; SF GPS and odometry.]

Figure 8.3: Simulation of HAKO running fishtail maneuver - Position plot. The result is visually identical to the one found in [Nielsen & Mejnertsen, 2006, p. 68].

This test does not conclusively show whether the new implementation is better than, equal to or worse than the original. This can only be proven by running extensive tests on the HAKO tractor in real life. The other simulations of the HAKO tractor have been placed in appendix C as figure C.1 and figure C.2. All simulations of the HAKO tractor show the same good result as the original filter. No significant problems have been detected, and the sensor fusion library based on IAUmat works as well as expected on the HAKO tractor.

All the testing so far has been done running the HAKO with the intended software. Running the system with an external configuration file reveals some minor shortcomings of the filter implementation. The test is done by driving the vehicle 10 m with the filter off, 50 m with the filter on, waiting 600 s and then driving another 50 m, still with the filter on. The problem occurs when the HAKO is stationary for an extended period of time during a run. The result is a drifting estimated position that causes the following movement command to fall short of its mark by 7 m, see figure 8.4. This happens even if the movement is done with a command that tells the vehicle to move to a specific coordinate.


[Plot omitted. Title: "HAKO wait test". Axes: World Easting [m] +708065 (UTM32) vs. World Northing [m] +6174164 (UTM32). Legend: GPS measurements; SF position - 0deg; HAKO compensated position; HAKO internal position.]

Figure 8.4: Simulation of HAKO waiting between two runs.

This test is not valid for normal operation of the HAKO tractor, since this combination of commands does not appear there. The reason it was carried out is that the original report did not document how the system handles stationary estimation. A solution to the problem has not been sought, since the HAKO tractor at the moment does not use this particular combination of commands.


8.2 MMR

The sensor fusion algorithm and Kalman filter parameters have not been used before in their current form. Benchmarks and algorithm tests are thus necessary to evaluate the overall performance. As mentioned earlier, the MMR is available for testing at DTU. It is therefore more accessible, and hence easier to use for verifying the performance of the simulated filters in real life. As with the HAKO, the MMR is tested in the simulator in order to understand the capabilities of the filter and make it more likely to succeed in the real life tests.

8.2.1 Simulation

To check how the filter performs with the parameters suggested in the model chapter, the MMR is simulated with a straight test run like the HAKO tractor. The MMR is driven 10 m with the filter off and then 50 m with the filter on, starting the filter's initial estimated angle at different values.

[Plot omitted. Title: "MMR initial seed test". Axes: World Easting [m] vs. World Northing [m]. Legend: GPS measurements; SF position - 90deg; MMR internal position; MMR compensated position; SF position - 80deg; SF position - 45deg; SF position - 0deg.]

Figure 8.5: Simulation of MMR running straight line to check for convergence of the filter - Position plot.

As can be seen from figure 8.5 and figure 8.6, the result is not optimal. The initial value of the angle estimate is too far from the absolute angle, and the filter is slow to converge. The most likely reason for this reduced performance compared to the HAKO filter is the noisier GPS measurements; this noise difference is the major change in the information quality.


[Plot omitted. Title: "MMR initial seed test". Axes: Time [s] vs. Vehicle heading [rad]. Legend: SF angle - 90deg; MMR internal angle; MMR compensated angle; SF angle - 80deg; SF angle - 45deg; SF angle - 0deg.]

Figure 8.6: Simulation of MMR running straight line to check for convergence of the filter - Heading plot.

A performance loss in simulation was expected, but not one as massive as this. Optimization of the Kalman design parameters is possible, but the applied parameters are relatively good according to the theory and are therefore used for the rest of the tests despite the potential performance loss. However, some tuning is carried out as the last part of the test. The oscillation is investigated in two steps: first the noise on the GPS measurements, and then the initial heading.


No Noise on GPS Measurements

The poor performance of the filter led to an investigation of the properties of the filter under ideal conditions. The GPS and odometry are simulated without noise, and the expected result is a position and heading that converge within the first two samples: one GPS sample to correct the position and a second GPS sample to correct the heading. The response should have a curve form like that of the HAKO tractor angle, only sharper. The result is discouraging, but it is consistent with the oscillations observed in the initial tests. There can be two explanations for the oscillation of the estimate. Firstly, the noise weights of the Kalman filter design could be chosen in such a way that the filter becomes unstable. Secondly, there could be a problem with the sampling of data that delays the signals, effectively making the filter suboptimal.

[Plot omitted. Title: "MMR no noise algorithm test". Axes: World Easting [m] +700000 (UTM32) vs. World Northing [m] +6100000 (UTM32). Legend: GPS measurements; SF position - 0deg; MMR internal position; MMR compensated position.]

Figure 8.7: Simulation of MMR running straight line with no noise and best simulated filter weights - Position plot.

The Kalman filter design parameters were tuned, but the oscillation persisted. Finally, the communication with the GPS server was tested, and it revealed that the communication faults observed during simulation meant that the GPS measurements were delayed. This causes the Kalman filter to behave suboptimally, as can be seen from the simulation in figure 8.7 and figure 8.8. Further discussion of the communication problem follows in chapter 9. Unfortunately the problem was identified so late in the project that changing the communication library was impossible. The remaining tests were thus carried out with the suboptimal communication library, which accounts for a significant part of the bad performance.

[Plot omitted. Title: "MMR no noise algorithm test". Axes: Time [s] vs. Vehicle heading [rad]. Legend: SF angle - 0deg; MMR internal angle; MMR compensated angle.]

Figure 8.8: Simulation of MMR running straight line with no noise and best simulated filter weights - Heading plot.

Correct Initial Heading

A simple test with the correct absolute heading is performed to see if a better measurement of the absolute heading will make the filter more optimal. A perfect initial heading improves the estimate dramatically. When the estimated angle is correct, the result of the simulation is much better, as can be seen in figure 8.9 and figure 8.10. The position converges very fast and stays very close to the compensated position of the MMR. This resembles the desired performance, but keep in mind that it demands perfect odometry, which is not obtainable in real life. The difference when working with real life measurements is the un-modeled noise contributions from both vehicle and sensors. This effect should be identifiable in the real life tests. The estimate of the angle shows the same good performance as the position plot. The reason the heading jumps is a ±π limitation on the angle logged from the robot control program.

A precise absolute heading from the start gives a very good position and heading estimate. Adding a sensor to measure it directly - such as a compass - would improve the overall performance of the sensor fusion algorithm.


[Plot omitted. Title: "MMR with GPS and odometric filter". Axes: World Easting vs. World Northing. Legend: GPS measurements; Sensor fused GPS; MMR internal position; MMR compensated position.]

Figure 8.9: Simulation of MMR running 10 m x 10 m square - Position plot. The figure shows GPS measurements, MMR internal position, compensated MMR position and sensor fused heading. The high value of the covariance matrix at the start makes the Kalman estimate converge fast, as can be seen from the figure.

8.2.2 Real Life Test

Different testing environments are needed to get an idea of how the algorithm works in real life. Since the MMR is not very good at driving long stretches on pure odometry, the tests were carried out in collaboration with Ph.d. student Jens Christian Andersen, who works on outdoor navigation using laser scanner and camera. Initial tests were run on the tarmac parking lot, while the final test was done in the public park Dyrehaven.

Vehicle Control

The plots from the parking lot and Dyrehaven differ fundamentally in one respect.

The parking lot tests are driven with open loop control, relying on the odometry to hold the course when commanded to drive forward. This means that the trajectory of the internal position is straight, while the absolute position trajectory will curve due to un-modeled errors.

The test in Dyrehaven is driven with closed loop control, where the camera is used to guide the vehicle along the left side of the road. In this case the


[Plot omitted. Title: "MMR with GPS and odometric filter". Axes: Time [10ms/tick] (x10^4) vs. Vehicle heading. Legend: Sensor fused angle; MMR internal angle.]

Figure 8.10: Simulation of MMR running 10 m x 10 m square - Angle plot. The figure shows the MMR internal angle and the sensor fused heading. The reason why the two are not identical is a π/2 difference in initial heading, and as can be seen it stays that way.

trajectory of the internal position will curve, depending on the drift of the odometry, and the absolute position trajectory will follow the road. The test road consists of long, straight stretches.

The lack of a true absolute position and heading constitutes a problem for the real life tests. Evaluating the GPS measurements and the estimates of position and heading is not straightforward. For this reason no conclusive hard differences can be calculated, but the trend of the measurements can be evaluated. To accommodate this, the plots have been expanded with two extra trajectory representations:

• A compensated internal position representation calculated from the odometry is added. The compensation consists of starting the internal position in the recorded true position and heading, showing the path the vehicle is driving according to odometry.

• The visually recorded true position is plotted, to show the trajectory the vehicle actually traveled. In the parking lot this is done by visual measurement of the runs, recording three points on each run. In Dyrehaven the trajectory is derived from an on-line map1 showing the road. From the recorded trajectory the heading has been calculated to help evaluate the estimate of the heading.

1Road information is taken from kms.dk.

Parking lot

The test runs are carried out on tarmac with varying gradients as the robot drives up the lot. This proved to be rather interesting, since the directional stability of the robot makes it challenging to make test runs exceeding 50 m from simple configuration files.

[Plot omitted. Title: "MMR with GPS and odometric filter 600 s stationary, parking lot". Axes: World Easting [m] +720400 (UTM32) vs. World Northing [m] +6187500 (UTM32). Legend: GPS measurements; Sensor fused position; kms.dk parking lot.]

Figure 8.11: Real life test of MMR stationary in parking lot - Position plot. The figure shows the drift of the GPS.

Stationary

The result of the stationary test indicates that there is a substantial drift on the GPS when using it in the parking lot, see figure 8.11. The variance is 8.5 m in the northing direction and 4.3 m in the easting direction. In the stationary situation the position estimate converges nicely and shows a variance of only 0.5 m in the northing direction and 0.2 m in the easting direction. The test shows that the variance assumed for the GPS measurement is set fairly low, but since the noise is reduced when the vehicle is driving, the noise setting is not changed for the later tests.


[Figure: plot titled "MMR with GPS and odometric filter 600 s stationary, parking lot". Axes: Time [s] vs. Vehicle heading [rad]. Legend: Sensor fused angle; MMR internal angle.]

Figure 8.12: Real life test of MMR stationary in parking lot - Heading plot. The vehicle heading is drifting as expected, which is also picked up by the Kalman filter.

The direction drifts like the internal odometry. This is not corrected much by the GPS measurement, since the vehicle is not moving, see figure 8.12. The drift is 0.06 deg/min, which is very satisfactory considering the odometry is based on a calibrated system using both wheel encoders and gyro. This is what could be expected of a calibrated system, but the low confidence in the GPS makes it interesting to see how it will perform when the vehicle is moving.
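The drift figure can be computed as the slope of a least-squares line fitted to the logged heading. The sketch below assumes the log is available as parallel lists of time in seconds and heading in degrees; the function name is illustrative:

```python
def drift_rate_deg_per_min(t_s, heading_deg):
    # Least-squares slope of heading versus time, converted to deg/min.
    n = len(t_s)
    mt = sum(t_s) / n
    mh = sum(heading_deg) / n
    slope = sum((t - mt) * (h - mh) for t, h in zip(t_s, heading_deg)) / \
            sum((t - mt) ** 2 for t in t_s)      # slope in deg/s
    return slope * 60.0                          # deg/s -> deg/min
```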


[Figure: plot titled "MMR with GPS and odometric filter 1st run, parking lot". Axes: World Easting [m] +720400 (UTM32) vs. World Northing [m] +6187500 (UTM32). Legend: GPS measurements; Sensor fused position; MMR compensated position; MMR observed position; kms.dk parking lot.]

Figure 8.13: Real life test of MMR driving in parking lot - Position plot.

Driving

As can be seen from the test result in figure 8.13, the filter position converges fine and tends to follow the GPS measurement as it moves forward. However, the run was cut short by inconsistent performance and parked cars. Figure 8.14 shows some convergence even though the drive is rather short, but that is also partly explained by the initialization value. The filter was initialized with a good initial heading, not more than 15 deg off the true heading. This does account for some of the good performance. When the GPS measurement does not drift too fast, the filter converges to the correct value. Testing in the parking lot has shown that driving on odometry is difficult, especially considering the confined space and parked cars. Several test runs were made to get acceptable results.

Appendix C summarizes more test runs. They show that if the GPS drifts off course quickly, or if the filter is initialized with a heading far from the absolute heading, the filter will oscillate as the simulation showed. Odometry imperfections add to the drifting GPS measurements. This is suboptimal from a filter point of view, but it reveals that the simulator has the same characteristics as the real world runs. This means that if a filter performs well in the simulator, it has a good chance of also performing well in the real world tests. Correcting the communication error and testing the performance in the simulator is still an interesting possibility. The drifting GPS measurements observed in the parking lot, along with the odometry, make the task difficult for the Kalman filter, especially because tuning is difficult and unambiguous evaluation of the experimental results is impossible.

[Figure: plot titled "MMR with GPS and odometric filter 1st run, parking lot". Axes: Time [s] vs. Vehicle heading [rad]. Legend: Sensor fused angle; MMR compensated angle; MMR observed angle.]

Figure 8.14: Real life test of MMR driving in parking lot - Heading plot.

The test runs performed in the parking lot were short, so the next step is to test the performance of the filter on longer runs. Testing over long stretches accentuates problems that risk being overlooked when driving short test runs. Further, the test is carried out in a huge open field in Dyrehaven, which gives the GPS better performance.


8.2.3 Dyrehaven

The runs carried out in Dyrehaven test the sensor fusion solution when driving longer stretches, hopefully with better GPS coverage. Only one trip to Dyrehaven has been possible, and under some constraints. The software needed to guide the MMR across Dyrehaven had a bug that made it stop responding after some time. The result of the trip was three test runs with unscheduled stops in all of them, which made the filter drift as in the simulations. Figure 8.15 and figure 8.16 show that the filter converges as the simulation predicted. Using the modeled variances for the filter does make the filter converge, but not as fast or as stably as on the HAKO tractor.

[Figure: plot titled "MMR with GPS and odometric filter 3rd run, 303 m". Axes: World Easting [m] +722900 (UTM32) vs. World Northing [m] +6189200 (UTM32). Legend: GPS measurements; Sensor fused position; MMR compensated position; kms.dk road 6 m.]

Figure 8.15: MMR real life test in Dyrehaven - Position plot.

The stationary test indicates that the drift on the GPS persists when using it in Dyrehaven. The variance is 2.5 m in the northing direction and 1.9 m in the easting direction, significantly lower than in the parking lot test. The precision of the GPS measurements on the 6 m wide road is better when driving. With the precision recorded, however, the robot cannot be positioned on the road using the GPS alone. The test in Dyrehaven showed that the position could drift off the road when driving, see figure C.7. Local guidance from other sensors is therefore needed. The Lat/Lon to UTM conversion algorithm shows the promised precision, taking into account that the road information is extracted from an on-line map.

The estimate is not optimal, oscillating as the simulations have shown.


[Figure: plot titled "MMR with GPS and odometric filter 3rd run, 303 m". Axes: Time [s] vs. Vehicle heading [rad]. Legend: Sensor fused angle; MMR internal angle; kms.dk road angle.]

Figure 8.16: MMR real life test in Dyrehaven - Heading plot. Final variance is σ = 0.1 rad.

The variance of the stationary angle taken over the last half of the run is σheading = 0.1 rad, and the largest deviation from the road is 5 m, with a bias to the right of the GPS measurements and the road. These figures cannot be used for reliable navigation of an autonomous robot. Further investigations into improving the filter and sensors are needed.

The design noise for the Kalman filter on the GPS is set too high while driving, and the GPS wanders a lot when stationary. The odometry also has a lot of drift that was not so visible in the parking lot. It turns out that the drift is 90 deg/min when calculating the angular drift of the odometry. This is significantly more than the 0.06 deg/min documented in the stationary test. The two other test runs in Dyrehaven show a 72 deg/min drift, which supports that the odometry is faulty. There can only be one explanation for the large drift in the odometry: the gyro has been malfunctioning or has not been active. This may also account for some of the problems seen in the parking lot. A better calibration of the odometry would improve the result. The internal position shows that the sensor fused position and angle are much better, but not adequate for following a road solely on the estimate. The two other test runs and a plot summarizing the entire test are available in appendix C.


8.2.4 Filter Tuning

The above variances were good guesses based on the theory, but tuning a filter is of great importance. The tuning is a trade-off between stability and performance. The investigation of the filter tuning was done before the gyro problem was identified; a better odometry will only add to the result achieved in this simple test. Tuning the filter design parameters is done according to the following empirical rules.

[Figure: plot titled "MMR with GPS and odometric filter tuned run, parking lot". Axes: World Easting [m] +720400 (UTM32) vs. World Northing [m] +6187500 (UTM32). Legend: GPS measurements; Sensor fused position; MMR compensated position; MMR observed position; kms.dk parking lot.]

Figure 8.17: MMR real life test with tuned parameters - Position plot.

GPS The oscillation amplitude in the estimation does not change when modifying the noise variance in the Kalman filter design. The frequency of the oscillation, on the other hand, increases as a function of decreasing noise variance. The position convergence is significantly faster with a lower variance on the GPS, but the stability of the position suffers.

Odometry A higher Kalman filter design variance on the odometry gives a smaller amplitude on the damped oscillation of the estimated angle. Consequently the error in the position estimate is also reduced, because the estimated angle that is used to update the filter between GPS measurements is not that far off. Lowering the variance of the odometry gives the opposite effect: the oscillations of the estimated angle grow, and so does the error in the estimated position.
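The effect of the measurement variance on convergence can be illustrated with a toy one-dimensional Kalman filter. This is a sketch for intuition only, not the filter implemented in the project:

```python
def scalar_kalman(zs, r_meas, q_proc=1e-4, x0=0.0, p0=100.0):
    # One-dimensional Kalman filter for a (nearly) constant state.
    # A lower measurement variance r_meas gives a larger gain and faster
    # convergence toward the measurements, at the price of following the
    # measurement noise more closely.
    x, p = x0, p0
    for z in zs:
        p += q_proc                # predict: state constant, variance grows
        k = p / (p + r_meas)       # Kalman gain
        x += k * (z - x)           # update with measurement z
        p *= (1.0 - k)             # posterior variance
    return x, p
```

With three identical measurements of 10.0, a small measurement variance pulls the estimate to within millimetres of 10, while a large one leaves it metres away, mirroring the convergence/stability trade-off described in the rules above.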

The filter has been tested in the simulator with a hand-tuned filter design. The parameters have been chosen to comply better with the noise seen on the test run in Dyrehaven. The values used in the filter design for this test are presented in table 8.1.

Source     UTM easting   UTM northing   δθ        δdk
Variance   1 m           1 m            0.1 rad   0.2 m

Table 8.1: Tuned filter design parameters.
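In the robot control software the filter design parameters are supplied through XML configuration files. A configuration fragment corresponding to table 8.1 could look as follows; the element and attribute names are illustrative assumptions, not the actual schema used by SMRdemo:

```xml
<!-- Hypothetical filter configuration; names are illustrative only. -->
<kalmanfilter>
  <gps varEasting="1.0" varNorthing="1.0"/>        <!-- [m] -->
  <odometry varHeading="0.1" varDistance="0.2"/>   <!-- [rad], [m] -->
</kalmanfilter>
```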

[Figure: plot titled "MMR with GPS and odometric filter tuned run, parking lot". Axes: Time [s] vs. Vehicle heading [rad]. Legend: Sensor fused angle; MMR compensated angle; MMR observed angle.]

Figure 8.18: MMR real life test with tuned parameters - Heading plot. Final variance of the angle is σ = 0.075 rad.

As a result of the lower variance on the GPS measurements, the estimated position wanders more when stationary, but converges faster and more stably to the GPS measurements when the vehicle is moving. Increasing the variance of the odometry decreases the amplitude of the estimate oscillations, due to less confidence in those measurements. The faster convergence and stability of the position is depicted in figure 8.17. The estimated heading of the vehicle converges faster, but it wanders slightly, as can be seen from figure 8.18. The variance of the stable angle is better at σheading = 0.075 rad compared to the test run in Dyrehaven. The final heading is also better compared to the observed heading. The drift of the odometry in this test is 27 deg/min, which is significantly better than the results from Dyrehaven.

What this test does not show is that the noisy conditions in the parking lot cause the test results to fall out differently each time. A maximum error in the position is not descriptive when running the vehicle in the parking lot, since large variations have been documented in the tests performed.

This solution is not optimal, since it suppresses the odometry in order to gain the better convergence to the GPS measurements. In reality it is a low-pass filter for the GPS measurements, guided by the odometry. More investigation into the MMR odometry is needed, along with calibration, to gain a better result.


Chapter 9

Discussion

This chapter summarizes the results obtained and puts them into perspective. The first section contains the properties and user impression of the developed software. Focus will be on the chosen structure and expandability to other areas of interest. The last section evaluates the results obtained in the simulations on the HAKO tractor and the MMR as well as the real life tests on the MMR.

9.1 Implementation and Structure

The new structure and XML configuration of the robot control software have made it easy to configure and run simulations and real life tests on both Ackerman and differential platforms with different sensor configurations. This will be a good and stable addition to the robot control software in the future, especially if the calibration files that the robot control software itself generates are ported to XML.

The sensor fusion library based on Kalman filtering has been used for fusing two different types of GPS and odometry, which are measurements of position. The models used in the sensor fusion library are limited to estimating position, but the choice of absolute and relative sensor is not limited by the solution. A system based on absolute sensors like beacon positioning or camera and guide marks, and relative sensors like IMU, gyro and Doppler effect speed measurement sensors, would be fully compatible. Configuration of the filter design parameters can be handled from XML configuration files, and only the sample time synchronization will have to be altered.

The real time matrix library used in the project has been reviewed and documented with tools which make the documentation available on-line and more user friendly. This is in itself not unique, but it does make the library more approachable.

The simulator has proven to be a valuable tool, but also a source of many headaches. The initial version of the simulator proved to be inadequate when testing the differentially steered vehicles and communication. This meant that a considerable amount of time was spent debugging the simulator.

A valuable feature is accelerated simulation, which is one of the main time savers. Control over the simulation environment is equally important when testing algorithms and locating problems. The simulator has been patched to enable stable running of the differentially steered vehicles, using two socket streams to communicate: one socket for the robot hardware server, and one for the GPS hardware server. This is an experimental feature described in [Nielsen & Mejnertsen, 2006] for the laser scanner, but it proved not to be as mature a product as expected. Some communication errors occur, which delay the GPS measurements. Thus the Kalman filter performance is suboptimal.

A GPS client/server solution has been developed that delivers the information needed to run the sensor fusion experiment. The server is capable of collecting and transforming all information available in the GPS messages, and if necessary making it available in plain text on the server side. This is better than the investigated alternatives, because the server supports easy XML configuration and user configurable EGNOS activation. There is no need for recompilation to change UTM zone or connection configuration. Like the server, the client has easy XML configuration and supports retrieving all the information needed. The satellite positions in space are not retrieved from the server, because there was no need for them in this application.

The GPS client/server solution has one drawback. The communication between them is based on a socket library that has proved to cause some communication glitches with the robot control software. This is most undesirable, but time has not been spent finding a solution to this problem. A logical solution would be to change the socket library and include the GPS measurement input in the hardware server.

9.2 Documentation

Documentation of the developed software has been done extensively. A software documentation tool (Doxygen) has been used for the task, using in-code comments to generate on-line, up-to-date documentation. The developer benefits from both easy maintenance of the documentation and a very useful reference tool. The result is on-line documentation for the programs named in table 9.1. The complete documentation is available on the CD in appendix J.

Software               Status
SMRdemo                Partial
Filter library         Complete
GPS library            Complete
GPS client / server    Complete
Simulator              Updated

Table 9.1: Doxygen documented software, developed / edited in this project.

9.3 Results

A short discussion of the results on the HAKO tractor is given first. Even more interesting is the investigation of the results obtained driving with the MMR: why does it have an estimate that is so inferior to the HAKO tractor? This is treated second.

The result of the position estimate on the HAKO tractor is very satisfactory. It shows that the new sensor fusion algorithm implemented with the IAUmat real time matrix library is likely to succeed in driving the HAKO tractor as well as the current version. The benefit of using the new IAUmat solution compared to the current OpenCV based solution is that it provides a more real time compatible solution with fewer compilation errors than generated by the OpenCV library. The HAKO uses a high-precision RTK-GPS. This shows in the estimate of the position and heading, which is very stable. A precision such as this would be sufficient for autonomous navigation on normal roads.

The main benefit of the simulations of the HAKO tractor is the confirmation that the sensor fusion algorithm and the new implementation perform as expected.

The MMR has not shown as good results as the HAKO tractor. Looking at the results from the simulations and real life tests, the main problem is the slow convergence of the estimated angle. The only source of absolute position information is the GPS, whether it is an RTK-GPS or a standard GPS. The absolute heading cannot be measured with a dedicated continuous sensor on the current MMR. This means that information on the absolute angle comes from the progression of the GPS position.


While in theory it was a good idea to rely on the position change to contain all the heading information, it failed to meet the expectations of smoothing the estimated trajectory. The reason why the absolute heading takes so long to converge has two sources. Firstly, the precision of the absolute sensor used and the speed at which the vehicle is traveling. Secondly, the confidence in the odometry is set relatively low, by specifying a large variance in the design of the Kalman filter. The illustration presented in figure 9.1 provides a clear picture of GPS measurements with different noise sizes and an indication of the angular uncertainty.

Figure 9.1: Illustration of the heading information contained in the progression of the position.

The figure depicts the uncertainty of the heading information, based on two consecutive GPS measurements one meter apart. The precision of the GPS has a dramatic impact on the uncertainty of the heading. There are two fundamental ways of improving the precision of the heading information: firstly by improving the precision of the GPS equipment, and secondly by increasing the speed, which will move the measuring points further apart. This effectively creates a greater confidence in the retrieved angular information.
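The geometry of figure 9.1 can be put into numbers. The sketch below computes the heading from two consecutive fixes, together with a first-order estimate of its uncertainty, assuming independent, isotropic position noise with standard deviation sigma_pos on each fix; the formula is an approximation valid when the spacing is large compared to the noise:

```python
import math

def heading_from_fixes(e0, n0, e1, n1, sigma_pos):
    # Heading from two consecutive position fixes (math convention:
    # 0 rad = east, counter-clockwise positive), plus a first-order
    # uncertainty estimate. Both fixes are assumed to carry independent,
    # isotropic noise with standard deviation sigma_pos per axis.
    de, dn = e1 - e0, n1 - n0
    d = math.hypot(de, dn)
    heading = math.atan2(dn, de)
    sigma_heading = math.sqrt(2.0) * sigma_pos / d  # shrinks as fixes separate
    return heading, sigma_heading
```

With sigma_pos = 0.5 m and fixes 1 m apart, the heading uncertainty is roughly 0.7 rad; at 10 m spacing it drops by a factor of ten, which is exactly the effect of increasing the speed.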

However, it is not possible to improve the GPS measurement of the MMR in either of these ways. The only option is to use the odometric heading information, which is the backbone of the entire sensor fusion idea. Sufficiently precise odometric information will add to the precision of the GPS by giving directional stability and relative heading information between GPS measurements. It will also keep the vehicle heading over several GPS measurements. This effectively makes the estimate better, somewhat like increasing the speed. With a very good relative measurement, using a high precision gyro, a drift of 1 deg/hour is obtainable, whereas the one used in the MMR has a drift of 1 deg/min. However, the choice of sensor is hampered by price: precise gyros are costly, at around €6500.

Drift is visible in all the test runs undertaken with the MMR, as documented in chapter 8. To see the drift when stationary, the gyro has been tested. The result only showed a drift of 0.06 deg/min. This is good performance for a cheap gyro, but it is attributed to calibration1. Checking the drift of the MMR when doing test runs does not show the same results. An odometric drift of up to 90 deg/min has been recorded, which is not in compliance with the specifications for the gyro on the MMR, even when run in very different environments with respect to temperature. As mentioned in chapter 8, the drift can only come from problems with the gyro that have not been fully identified. The discovery of the problem was made late in the project process. It has therefore not been possible to find the time to run additional tests to investigate a solution.

The measurement quality may also be improved by adding information from available sensors, or by adding extra sensors to improve e.g. the heading information. The use of the heading supplied by the GPS sensor has been explored in [Blas & Riisgaard, 2005]. However, the variances used in the design of that Kalman filter, σGPS position = 0.05 m and σGPS heading = 3 deg, do not at all resemble the values available from the GPS documentation or the data collected from the GPS in stationary tests. Discussing the result with the supervisors confirms that the use of the GPS absolute heading is not optimal for estimation.

Adding a magnetic compass to the platform has been tried in another project. However, the compass is no longer functioning and the project report is not available at the library. It is therefore not possible to assess the precision of the magnetic compass. In theory it should give the absolute heading of the vehicle. Of course this requires proper shielding from the robot. Commercial magnetic compasses promise a ±1 deg accuracy on the heading, but like the gyro they come at a considerable cost.

The noise model of the GPS has proved to be too simple. Logs obtained in the testing of the GPS client/server solution and the real life test of the MMR show a dramatic change in the noise characteristics. When the MMR is stationary, the noise is a random walk within the specified interval for the GPS receiver. When driving, the spacing is correct according to the speed commanded, but the measurement still wanders orthogonally to the movement. The GPS receiver usually contains an internal Kalman filter to combine the information received from the satellites, which makes adding another filter on top more cumbersome.

The Kalman filter used in the experiments is based on the assumption that the modeling of the noise is accurate and that it does not change during operation. The tests undertaken indicate that the noise is in fact much more complex and highly dependent on the reception of the GPS signal. The accuracy and the precision are low when testing in the parking lot, even though the signal is received from 6 different satellites. Signal reception is better in Dyrehaven, showing a slightly better precision and much higher accuracy, as documented in figure 8.15. Standard consumer GPS is not adequate for a good estimate, if the noise registered when stationary is to be used in the design of the Kalman filter.

1 Calibration performed recently by Ph.D. Jens Christian Andersen.

It is possible to optimize the Kalman filter if knowledge of the vehicle dynamics is taken into account. A hybrid algorithm might be a possible solution. Knowing from the odometry that the vehicle is stationary means that a wandering gyro or a drifting GPS position can be suppressed in the estimation algorithm. A simple test of this addition could be to stop the estimation when stationary (no odometric movement) and run the estimation only when the vehicle is driving. This approach is also supported by the HAKO test, which shows that running the filter when stationary renders the estimate useless.
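The hybrid idea amounts to a gate in front of the measurement update. The sketch below is illustrative only; `filter_update` and the tick-count test are hypothetical placeholders, not the interface of the implemented filter:

```python
def gated_update(filter_update, x_est, z_gps, wheel_ticks):
    # Skip the Kalman measurement update while the odometry reports no
    # movement, so a wandering GPS fix cannot drag the estimate around
    # while the vehicle is parked.
    if wheel_ticks == 0:
        return x_est                    # stationary: hold the estimate
    return filter_update(x_est, z_gps)  # driving: normal update
```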

Evaluating the work according to the consequences listed in the introduction shows that the documented problems need solving to live up to the theoretical capabilities of sensor fusion. All in all, the problems with the implementation dwarf the positive aspects of the system, showing neither better performance of the over-all system nor a significant improvement of all sensors. The estimate does perform better than the odometry, but some troubleshooting is needed to surpass the raw GPS measurement. As mentioned, redundancy is a feature, but that calls for redundant sensors. No such backup sensors are available, and it has thus not been explored. The gyro and odometry could be redundant with respect to the measurement of the heading, but the effect has not been evaluated.

Design time has definitely gone up, due to the fact that the requirements for design, implementation and tuning have been considerable. Modeling the sensors, while simple in theory, requires experimental verification. Tests show that simple Gaussian white noise with zero mean is not sufficient to handle GPS and odometry modeling.


Chapter 10

Conclusion

A real time sensor fusion library that supports both Ackerman and differentially steered vehicles has been developed for the robot control program. Sensor fusion is achieved using extended Kalman filtering with constant noise modeled as Gaussian white noise with zero mean. The sensor fusion library is designed for estimating position and heading in an absolute coordinate frame, relying on one absolute and one relative position measurement. Configuration of the filter is done externally with configuration files in the XML language.

Changes and additions to the existing robot control software have been necessary to support the implementation of the sensor fusion library. The main robot control program has been restructured to accentuate the logical actions involved in controlling mobile robots. The new structure has the estimation routines grouped in one section of the code. An added benefit is a better handling of configuration and initialization faults for all configurable elements. The initialization has been ported to XML as part of the project, in compliance with the robot control software strategy.

A new GPS client/server solution has been proposed to make GPS signals available to the end user directly in the hardware server. The accuracy and configurability via XML are improvements over the existing solution, but problems have been encountered with the chosen socket client/server framework. The solution has proved to delay measurements, which will need to be handled to improve performance.

An expansion to the IAU simulator has been made to enable simulations using the new GPS client. The solution is based on existing communication classes in the simulator, which has given some unexpected problems. Simulation is possible, but under the constraint that measurements are delayed. The differentially steered vehicles have been checked for model consistency, but not all the promised features are implemented.

Simulations on the HAKO tractor have proven the new sensor fusion tool to be stable and to have the same performance as the prior, non real time stable implementation. Simulations on the MMR using the new GPS client/server system have not shown as good performance as expected. The estimate is oscillatory in both simulation and real life tests. These oscillations originate partly from the delayed measurements. The real life tests have been hampered by noisy GPS measurements and poor odometry originating from as yet unidentified gyro problems. The filter does converge, even under these suboptimal conditions, improving the estimate over the internal odometric position, but it does not surpass the raw GPS measurement. The real life tests show that GPS measurements from the commercial GPS, even with a recorded variance as low as σGPS = 2 m, are inadequate for autonomous navigation on normal roads.


Chapter 11

Future work

11.1 SMRdemo

The robot control software has been improved with the addition of XML configuration, but it has not been completely converted. The software generates calibration files itself, which have not been ported to XML. The XML parser used (Expat) also supports writing XML files from data structures, so finalizing the XML configuration should prove to be simple.

The communication with external programs via a socket connection has proved not to be straightforward. A common socket connection standard for smrd, hakod and other robot related software would be much appreciated.

11.2 Simulator

The simulator needs verification of the underlying code for the models, in order to make sure that it is correct according to the theory and the robot control software used. The current differential motion algorithm is not complete in its current form. Communication errors between the simulator and SMRdemo have been detected when using the new socket connection. A complete solution has not been found. This underlines the need for a common socket communication standard.

In general there are problems with the sampling time and with keeping real time speeds. These problems stem from the construction of the simulator and are thus difficult to change. The simulator is not able to run two robots with different sampling times. On a single user system this does not matter, but when running a multi user configuration, as on the IAU servers, it limits the usability. At the moment simulation is only possible with the HAKO tractor alone or the IAU robots alone, due to the current difference in sampling times.

11.3 Filters

The current filter solution uses the GPS and odometry as input, but other sensors are available and interesting. Especially the IMU has some benefits over the odometry in relation to measuring the change in heading and would thus be ideal to use in an expansion of the filter. A magnetic compass might be a useful expansion to improve the heading precision from the cheap GPS. The implementation of new algorithms should not prove to be a problem with the current filter platform; the current filters should be used as a guideline. An interesting prospect is to make a numerically stable filter for the IAU platform. UDU and square root filters look promising, but they have not yet been implemented or tested.

11.4 IAUmat

The matrix toolbox is very promising and easy to work with. It has not at all received the attention it deserves in the education at IAU. Some routines are not finished in the current version, but if this is corrected it could prove very handy for future work. It is easy to see where corrections have to be made in the code documentation under Related Pages - Todo List.

11.5 MMR

The MMR is a good platform for outdoor experiments, because it has a good combination of absolute and relative sensors and the ability to travel on roads. One major thing that I have been missing when doing experiments with the robot is the ability to drive it using the remote control and log data in the process. This would have saved valuable time, and the experiments that have been carried out would have been reproducible. Reproducible runs in the parking lot are not possible, due to the need for special software to keep the vehicle on the same track. Replacing this closed-loop control with a human operator would remove the need for specialized software.


Nomenclature

Abbreviations

ABS Anti Blocking System - helping the driver not to lock up the wheels when braking

AGV Autonomous Guided Vehicle

apt Program for package management under Debian Linux, seewww.debian.org.

BLAS Basic Linear Algebra Subprograms

DARPA Defense Advanced Research Projects Agency

DGPS Differential GPS

EGNOS Euro Geostationary Navigation Overlay Service

EPS Electronic Stability Program - helping the driver to recover from uncontrolled vehicle movement

google.dk Internet search engine - www.google.dk

GPS Global Positioning System

GSL Gnu Scientific Library

IAU Ørsted•DTU, Automation

IMU Inertial Measurement Unit - contains accelerometers and gyros in one package to detect changes in attitude, location and motion.

INS Inertial Navigation System - contains an IMU and a computer system to handle the information and guide a vehicle.

IR Infra Red

KMS.dk Kort og Matrikelstyrelsen (National Survey and Cadastre)


KVL Den Kgl. Veterinær- og Landbohøjskole (The Royal Veterinary and Agricultural University, Denmark)

Lat/Lon Latitude/Longitude

Linux Open Source operating system widely used in academic circles.

MEML Meta Matrix Library

MMR Medium Mobile Robot

MTL The Matrix Template Library

OpenCV Open Computer Vision Library

orocos.org Real-time robotics and control homepage by Herman Bruyninckx, Katholieke Universiteit Leuven, Belgium - www.orocos.org

RTK-GPS Real Time Kinematic GPS

SBAS Satellite Based Augmentation System. The common term for EGNOS and WAAS. The project was originally for aviation purposes, where the satellite reception of the SBAS correction signal is better.

SF Sensor Fused

UTM Universal Transverse Mercator

WAAS Wide Area Augmentation System

XML eXtensible Markup Language

Filter

uk Control signal vector at time k

vm Measurement noise vector

vp State noise vector

xk State vector at time k

yk Output vector at time k

x̂k Estimate of the state vector

Ck Output matrix

Dk Input feed through matrix


Fk System dynamic matrix

Gk Input matrix

Gvm Measurement noise input matrix

Gvp State noise input matrix

Qk Error covariance matrix.

Qm Short hand notation for the measurement noise matrix - Qm = Gvm Rm Gvm^T

Rm Measurement noise covariance matrix

Rp State noise covariance matrix

Model

xk State vector of the vehicle at time k

yk Output vector of the vehicle at time k

θ Direction of the vehicle rad

d Driven distance of the vehicle m

dwl Left wheel diameter for the differential model. m

dwr Right wheel diameter for the differential model. m

L Distance between the axles of the Ackerman vehicle. m

x x-coordinate of the vehicle m

y y-coordinate of the vehicle m


Bibliography

[Andersen & Ravn, 2004] Andersen, N. & Ravn, O. (2004). SMR-CL, a real-time control language for mobile robots. In 2004 CIGR International Conference, Beijing, China.

[Bar-Shalom et al., 2001] Bar-Shalom, Y., Li, X.-R., & Kirubarajan, T. (2001). Estimation with Applications to Tracking and Navigation. John Wiley and Sons, Inc.

[Blanke, 2003] Blanke, M. (2003). Fault-tolerant and diagnostic methods for navigation. Paper at International Cooperation on Marine Engineering Systems.

[BLAS, 2006] BLAS (2006). BLAS Technical Forum. Internet, http://www.netlib.org/blas/blast-forum/.

[Blas & Riisgaard, 2005] Blas, M. R. & Riisgaard, S. (2005). Autonomous outdoor navigation. Master's thesis, Ørsted•DTU, Section of Automation, Technical University of Denmark, Elektrovej building 326, 2800 Kgs. Lyngby, Denmark.

[Borenstein et al., 1996] Borenstein, J., Everett, H. R., & Feng, L. (1996). Where am I? Sensors and methods for mobile robot positioning. Internet, http://www-personal.umich.edu/~johannb/shared/pos96rep.pdf.

[Breitling & Nielsen, 2004] Breitling, M. & Nielsen, A. (2004). Design og implementation af udendørs mobil robot (Design and implementation of an outdoor mobile robot). Master's thesis, Ørsted•DTU, Section of Automation, Technical University of Denmark, Elektrovej building 326, 2800 Kgs. Lyngby, Denmark.

[Caltabiano et al., 2004] Caltabiano, D., Muscato, G., & Russo, F. (2004). Localization and self calibration of a robot for volcano exploration. IEEE International Conference on Robotics and Automation, Vol. 1, 586–591.

[Dudek & Jenkin, 2000] Dudek, G. & Jenkin, M. (2000). Computational Principles of Mobile Robotics, volume 1. Cambridge University Press, 1st edition.


[ESA, 2006] ESA (2006). EGNOS for professionals. Internet, http://esamultimedia.esa.int/docs/egnos/estb/egnos_pro.htm.

[Frochte, 2006] Frochte, J. (2006). Documentation of the Meta Matrix Library. Internet, http://www.meml.smial.de/doc/index.html.

[GARMIN, 2000] GARMIN (2000). GPS guide for beginners. Internet, http://www.garmin.com/aboutGPS/manual.html.

[Gottschling, 2006] Gottschling, P. (2006). The Matrix Template Library. Internet, http://www.osl.iu.edu/research/mtl/.

[GSL, 2006] GSL (2006). GSL - GNU Scientific Library. Internet, http://www.gnu.org/software/gsl/.

[Hague et al., 2000] Hague, T., Marchant, J., & Tillett, N. (2000). Ground based sensing systems for autonomous agricultural vehicles. Computers and Electronics in Agriculture, 25, 11–28.

[Hansen & Monrad, 2005] Hansen, O. F. & Monrad, M. (2005). Cooperative mobile robots. Report, Ørsted•DTU, Section of Automation, Technical University of Denmark, Elektrovej building 326, 2800 Kgs. Lyngby, Denmark.

[Hendricks et al., 2003] Hendricks, E., Jannerup, O., & Sørensen, P. H. (2003). Linear Systems Control, Deterministic and Stochastic Methods. Ørsted•DTU, Section of Automation, Technical University of Denmark.

[Holmgaard, 2004] Holmgaard, N. (2004). Sensors and control for an unmanned aerial vehicle. Report, Ørsted•DTU, Section of Automation, Technical University of Denmark, Elektrovej building 326, 2800 Kgs. Lyngby, Denmark.

[Kohne & Woßner, 2005] Kohne, A. & Woßner, M. (2005). GPS explained. Internet, http://www.kowoma.de/en/gps/index.htm.

[Larsen, 2001] Larsen, M. B. (2001). Autonomous Navigation of Underwater Vehicle. PhD thesis, Ørsted•DTU, Section of Automation, Technical University of Denmark, Elektrovej building 326, 2800 Kgs. Lyngby, Denmark.

[Larsen, 1998] Larsen, T. D. (1998). Optimal Fusion of Sensors. PhD thesis, Ørsted•DTU, Section of Automation, Technical University of Denmark, Elektrovej building 326, 2800 Kgs. Lyngby, Denmark.

[Mogensen, 2006] Mogensen, L. (2006). Kalmtool v.4.2. Internal paper.


[Nielsen & Mejnertsen, 2004] Nielsen, A. R. & Mejnertsen, A. (2004). Ackermanstyret Robocup deltager (Ackerman-steered RoboCup participant). Special course report, Ørsted•DTU, Section of Automation, Technical University of Denmark, Elektrovej building 326, 2800 Kgs. Lyngby, Denmark.

[Nielsen & Mejnertsen, 2006] Nielsen, A. R. & Mejnertsen, A. (2006). Hako control system. Master's thesis, Ørsted•DTU, Section of Automation, Technical University of Denmark, Elektrovej building 326, 2800 Kgs. Lyngby, Denmark.

[O'Connor et al., 1996] O'Connor, M., Bell, T., & Elkaim, G. (1996). Automatic steering of farm vehicles using GPS. Stanford University.

[OpenCV, 2006] OpenCV (2006). OpenCV introduction. Internet, http://opencvlibrary.sourceforge.net/Welcome.

[Pathirana & Savkin, 2003] Pathirana, P. & Savkin, A. (2003). Sensor fusion based missile guidance. Proceedings of the Sixth International Conference of Information Fusion, 1, 253–260.

[Ravn et al., 2006] Ravn, O., Torp, S., Nørgaard, M., & Pjetursson, A. (2006). iau_mat. Internet, http://www.iau.dtu.dk/AG/Software/Matlibs/iau_mat.html.

[Sasiadek & Hartana, 2004] Sasiadek, J. & Hartana, P. (2004). Sensor fusion for navigation of an autonomous unmanned aerial vehicle. Proceedings of the 2004 IEEE International Conference on Robotics and Automation, (pp. 4029–4034).

[Sejerøe, 2004] Sejerøe, T. H. (2004). An estimation framework for nonlinear systems. Master's thesis, Ørsted•DTU, Section of Automation, Technical University of Denmark, Elektrovej building 326, 2800 Kgs. Lyngby, Denmark.

[Tur et al., 2005] Tur, J. M. M., Gordillo, J. L., & Borja, C. A. (2005). A closed-form expression for the uncertainty in odometry position estimate of an autonomous vehicle. IEEE Transactions on Robotics, 21(5), 1017–1022.

[Zogg, 2002] Zogg, J.-M. (2002). GPS basics, introduction to the system, application overview. Internet, www.u-blox.com.


List of Figures

1.1 RoboCleaner from Karcher. . . . . . . . . . . . . . . . . . . . 2

1.2 Straddle carrier from the port of Brisbane. . . . . . . . . . . . 2

2.1 Medium Mobile Robot. . . . . . . . . . . . . . . . . . . . . . 9

2.2 Hako tractor. . . . . . . . . . . . . . . . . . . . . . . . . . . . 9

2.3 Small Mobile robot. . . . . . . . . . . . . . . . . . . . . . . . 10

4.1 Structure of the connections between the project components. 18

4.2 XML example showing tags, attribute and nested structure. . 19

4.3 Doxygen documentation example for the IAUmat real time matrix library, matrix multiplication function. . . . . . . . . . 21

4.4 Screen shot of the on line documentation generated from the Doxygen code depicted in figure 4.3. . . . . . . . . . . . . . . 22

5.1 Test of GPS with EGNOS disabled. . . . . . . . . . . . . . . 26

5.2 Drifting odometry due to a 0.57% error in the odometric calibration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28

5.3 Absolute and relative coordinate system for a vehicle. . . . . 28

5.4 Ackerman model in discrete time. . . . . . . . . . . . . . . . . 30

5.5 Differential model in discrete time. . . . . . . . . . . . . . . . 32

6.1 Difference in the estimate of position, using linear and nonlinear prediction method. . . . . . . . . . . . . . . . . . . . . 40

6.2 Difference in the estimate of angle, using linear and nonlinear prediction method. . . . . . . . . . . . . . . . . . . . . . . . 40

7.1 Structure of the connections between the project components. 44

7.2 Structure of old and new SMRdemo program. . . . . . . . . . 45

7.3 Structure of sensor fusion library. . . . . . . . . . . . . . . . . 46

7.4 Configuration example for the odometric Kalman filter - HAKO tractor. . . . . . . . . . . . . . . . . . . . . . . . . . . 47

7.5 Structure of robot simulator. . . . . . . . . . . . . . . . . . . 49

7.6 Simulator configuration example for the GPS. . . . . . . . . . 49


7.7 Structure of GPS server and client. . . . . . . . . . . . . . . . 51

7.8 GPS server configuration example. . . . . . . . . . . . . . . . 52

7.9 GPS server configuration example. . . . . . . . . . . . . . . . 53

8.1 Simulation of HAKO running straight run to check for convergence of the filter - Position plot. . . . . . . . . . . . . . . 56

8.2 Simulation of HAKO running straight run to check for convergence of the filter - Heading plot. . . . . . . . . . . . . . . 57

8.3 Simulation of HAKO running fishtail maneuver - Position plot. The result is visually identical to the one found in [Nielsen & Mejnertsen, 2006, p. 68]. . . . . . . . . . . . . . . 58

8.4 Simulation of HAKO waiting between two runs. . . . . . . . . 59

8.5 Simulation of MMR running straight line to check for convergence of the filter - Position plot. . . . . . . . . . . . . . . 60

8.6 Simulation of MMR running straight line to check for convergence of the filter - Heading plot. . . . . . . . . . . . . . . 61

8.7 Simulation of MMR running straight line with no noise and best simulated filter weights - Position plot. . . . . . . . . . 62

8.8 Simulation of MMR running straight line with no noise and best simulated filter weights - Heading plot. . . . . . . . . . 63

8.9 Simulation of MMR running 10 m x 10 m square - Position plot. The figure shows GPS measurements, MMR internal position, compensated MMR position and Sensor Fused heading. The high value of the covariance matrix at start makes the Kalman estimate converge fast, as can be seen from the figure. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64

8.10 Simulation of MMR running 10 m x 10 m square - Angle plot. The figure shows MMR internal angle and Sensor Fused heading. The reason why the two are not identical is a π/2 difference in initial heading, and as can be seen it stays that way. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65

8.11 Real life test of MMR stationary in parking lot - Position plot. The figure shows the drift of the GPS. . . . . . . . . . 66

8.12 Real life test of MMR stationary in parking lot - Heading plot. The vehicle heading is drifting as expected, which is also picked up by the Kalman filter. . . . . . . . . . . . . . . 67

8.13 Real life test of MMR driving in parking lot - Position plot. . 68

8.14 Real life test of MMR driving in parking lot - Heading plot. . 69

8.15 MMR real life test in Dyrehaven - Position plot. . . . . . . . 70

8.16 MMR real life test in Dyrehaven - Heading plot. Final variance is σ = 0.1 rad . . . . . . . . . . . . . . . . . . . . . . . 71

8.17 MMR real life test with tuned parameters - Position plot. . . 72

8.18 MMR real life test with tuned parameters - Heading plot. Final variance of the angle is σ = 0.075 rad . . . . . . . . . . 73


9.1 Illustration of the heading information contained in the progression of the position. . . . . . . . . . . . . . . . . . . . . . 78

C.1 Simulation of HAKO running turn test maneuver - Position plot. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104

C.2 Simulation of HAKO running row skip maneuver - Position plot. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105

C.3 Real life test of MMR 90 m run in parking lot - Position plot. . . 106

C.4 Real life test of MMR 90 m run in parking lot - Heading plot. . . 107

C.5 Real life test of MMR 102 m run in parking lot - Position plot. . 108

C.6 Real life test of MMR 102 m run in parking lot - Heading plot. . 109

C.7 Real life test of MMR in Dyrehaven 1st run - Position plot. . . . 110

C.8 Real life test of MMR in Dyrehaven 1st run - Heading plot. . . . 111

C.9 Real life test of MMR in Dyrehaven 2nd run - Position plot. . . . 112

C.10 Real life test of MMR in Dyrehaven 2nd run - Heading plot. . . . 113

C.11 Real life test of MMR in Dyrehaven 3rd run - Position plot. . . . 114

C.12 Real life test of MMR in Dyrehaven 3rd run - Heading plot. . . . 115

C.13 Real life test of MMR in Dyrehaven all three runs - Heading plot. 116

D.1 Cartesian coordinate system used in describing the physical representation of the robot. The coordinate frame is used in both the description of the perimeter and sensor positions. . . 118

E.1 Robot main loop routine - New. . . . . . . . . . . . . . . . . 137

E.2 smrd controlled main loop routine. . . . . . . . . . . . . . . 138

E.3 HAKO main loop routine. . . . . . . . . . . . . . . . . . . . 138

E.4 SMRdemo start routine. . . . . . . . . . . . . . . . . . . . . 139

E.5 SMRdemo init routine. . . . . . . . . . . . . . . . . . . . . . 140

E.6 SMRdemo main loop. . . . . . . . . . . . . . . . . . . . . . 141

F.1 Ackerman model in continuous time. . . . . . . . . . . . . . 143

F.2 Ackerman model in discrete time. . . . . . . . . . . . . . . . 145

G.1 Differential model in continuous time. . . . . . . . . . . . . 151

G.2 Differential model in discrete time. . . . . . . . . . . . . . . 153

H.1 GPS client for debug and GPS measurement collection. . . . 166

H.2 Test of GPS with EGNOS disabled. . . . . . . . . . . . . . . 167

H.3 Test of GPS with EGNOS enabled. . . . . . . . . . . . . . . 167

H.4 Test of GPS, comparison of the results of the tests. . . . . . 168


List of Tables

5.1 GPS precision . . . . . . . . . . . . . . . . . . . . . . . . . . 26

5.2 GPS precision in simulator. . . . . . . . . . . . . . . . . . . 27

5.3 Input noise - Ackerman model. . . . . . . . . . . . . . . . . 31

5.4 Measurement noise - Ackerman model. . . . . . . . . . . . . 31

5.5 Input noise from quantization - differential model. . . . . . . 34

5.6 Input noise used in calculations - differential model. . . . . . 35

5.7 Measurement noise - differential model. . . . . . . . . . . . . 35

7.1 Matrix library study - candidates . . . . . . . . . . . . . . . . 47

8.1 Tuned filter design parameters. . . . . . . . . . . . . . . . . . 73

9.1 Doxygen documented software, developed / edited in this project. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77

D.1 Robots available in the simulator. . . . . . . . . . . . . . . . . 119

E.1 Matrix library study - candidates . . . . . . . . . . . . . . . 123

E.2 Matrix library study - technical perspective . . . . . . . . . 123

E.3 Matrix library study - subjective evaluation . . . . . . . . . 123

E.4 Flowchart color code . . . . . . . . . . . . . . . . . . . . . . 136

G.1 IR distance sensor position . . . . . . . . . . . . . . . . . . . 155

H.1 GPS noise recorded in the initial phase of the project. . . . . 166


Appendix A

Contributions by this Project

This appendix is a short introduction to the contributions made by this project.

• Sensor Fusion tool for both differentially steered and Ackerman steered vehicles, based on open form Kalman filtering. The filter noise initialization and execution handling is done via an XML file.

• Reconstruction of the robot control program main loop, to give a better structure for implementing the sensor fusion algorithms. Configuration of robot parameters ported to valid XML format.

• Simulator testing and development of a GPS device for general use. The differentially steered vehicles have been thoroughly tested and configuration errors corrected. Communication bugs have been documented, but a solution has not been found. A workaround, based on flushing the communication buffers, is currently deployed.

• Testing and documenting IAUmat, the real time matrix library developed at Automation. Not all documented shortcomings have been corrected; see Related Pages - Todo List in the documentation on the CD and the bug list in appendix B for more details. Documentation of the library is very thorough, making the library more user friendly.

• Real life tests show the same result as the simulation. The performance of the consumer grade GPS is not sufficient for stable and non-drifting positioning. The performance of the simple filter can be tuned for either stationary or mobile operation, but cannot take the GPS drift into account. Stable and reproducible positioning for mobile robots depends on the use of RTK-GPS.


Appendix B

Bug List

The list shows the most critical errors detected.

B.1 SMRdemo

• The use of several calibration files with no comments or documentation should be avoided. The initialization routine reinitializes values set in the default configuration file robot.conf with a hard coded value that is not correct if the extra calibration file is faulty. The faults stem from different versions of the robot control software. This bug has not been corrected, but a warning is now given.

• The communication library between the GPS client/server is faulty. Odometry errors have been reported by the robot control software, possibly originating from the extra communication. The error does not affect the operation of the SMR, but when the laser scanner or GPS is used, communication errors occur. The tests revealing this error have primarily been carried out with the simulator, but the same tendency has been seen in real life tests.

B.2 IAUmat

• The toolbox is not completely real time safe. The implemented matrix inversion routines use allocation and deallocation of memory. A solution to this problem would be much appreciated.

• Some file read/write functions are not correct. Reading the functions and variables used does not give a solution, so solving this will require some reverse engineering.
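One way to make the inversion real time safe is to replace the dynamic allocation with a statically sized workspace, fixed at the largest matrix dimension the filter needs. The sketch below is illustrative only, not the IAUmat implementation; note that the static buffer makes it non-reentrant, which is acceptable in a single-threaded control loop but not in general:

```c
#include <math.h>

#define MAXN 8  /* worst-case matrix size, fixed at compile time */

/* Invert an n x n matrix in place with Gauss-Jordan elimination,
 * using only a statically sized augmented buffer - no malloc/free,
 * so the routine is safe to call from a real-time loop.
 * Returns 0 on success, -1 if the matrix is numerically singular. */
int inv_static(double a[][MAXN], int n)
{
    static double aug[MAXN][2 * MAXN];  /* [A | I] workspace */
    int i, j, k;

    for (i = 0; i < n; i++)
        for (j = 0; j < n; j++) {
            aug[i][j] = a[i][j];
            aug[i][n + j] = (i == j) ? 1.0 : 0.0;
        }

    for (k = 0; k < n; k++) {
        /* partial pivoting for numerical robustness */
        int p = k;
        for (i = k + 1; i < n; i++)
            if (fabs(aug[i][k]) > fabs(aug[p][k])) p = i;
        if (fabs(aug[p][k]) < 1e-12) return -1;
        if (p != k)
            for (j = 0; j < 2 * n; j++) {
                double t = aug[k][j]; aug[k][j] = aug[p][j]; aug[p][j] = t;
            }

        /* normalize pivot row, then eliminate the column elsewhere */
        double piv = aug[k][k];
        for (j = 0; j < 2 * n; j++) aug[k][j] /= piv;
        for (i = 0; i < n; i++) {
            if (i == k) continue;
            double f = aug[i][k];
            for (j = 0; j < 2 * n; j++) aug[i][j] -= f * aug[k][j];
        }
    }

    /* the right half of [A | I] now holds the inverse */
    for (i = 0; i < n; i++)
        for (j = 0; j < n; j++) a[i][j] = aug[i][n + j];
    return 0;
}
```

A reentrant variant would instead take a caller-supplied workspace pointer, keeping the no-allocation property.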


B.3 Kalman Filter

• Currently an offset parameter must be set in the filter configuration to reduce the numerical value of the GPS measurement. If it is not used, the filter will diverge due to overflow in the internal representation. The offset could be set automatically from the first absolute position received, but when testing the algorithm and changing between different environments, such as the simulator and real life, having the offset visible and under control is more important than ease of use.
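The overflow arises because raw UTM coordinates (roughly 7×10^5 m easting and 6×10^6 m northing in the test area) are squared in the covariance arithmetic. The offset step can be sketched as below; the constants are illustrative, taken from the axis offsets of the parking lot test plots, not from the actual configuration files:

```c
/* Illustrative fixed offsets in metres (UTM zone 32); in the real
 * setup the offset comes from the filter's XML configuration. */
static const double EAST_OFFSET  = 720400.0;
static const double NORTH_OFFSET = 6187500.0;

/* Shift a raw UTM fix into a small local frame before it enters the
 * filter, so that squaring coordinates in the covariance arithmetic
 * cannot overflow the internal representation. */
void gps_to_local(double easting, double northing, double *x, double *y)
{
    *x = easting  - EAST_OFFSET;
    *y = northing - NORTH_OFFSET;
}
```

The same offset must of course be added back when reporting the estimated position in world coordinates.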

B.4 Simulator

• The calibration value for the difference between the wheel diameters is read and stored, but not used for calculations.

• The asynchronous socket communication between the simulator and SMRdemo does not work on the server Nyquist. Running the same setup on Jensen works fine. This is not optimal, since Nyquist is the development server and the newest installation. The other servers do not contain the correct libraries to run the compilation of the robot control software.

• The asynchronous communication demands that both the asynchronous socket and the normal socket are cleared of additional messages every time a new message is received. If this is not done, the buffers overflow and cause the communication to hang. It has not been investigated how this affects the messages sent, but the whole idea of throwing away information is of no use to a serious simulator. This means a new way of connecting the asynchronous sensors must be developed.

• The GPS offset relative to the simulator's internal representation is better placed in the map definition than in the GPS receiver definition.

B.5 GPS client/server

• Inconsistencies in program execution have been detected when moving programs between different platforms. Programs running on a laptop with Linux kernel version 2.6 and on servers with Linux kernel version 2.6 need modifications before running on the department robots with Linux kernel version 2.4. Upgrading the robots and making sure the Linux system is compatible on all computers would make development easier and faster.


Appendix C

Test Results

This appendix is dedicated to the tests and results that are relevant for the evaluation of the work, but not crucial for showing the performance of the toolbox.

The plots of the results obtained in real life have been edited to give more insight into the results.

The internal position and heading of the vehicles have been compensated to have the same starting point and heading as the absolute position observed when doing the tests.

Not having an absolute true position of the vehicle in the test makes assessment of the performance problematic. An absolute true position, based on visual impressions, has been added to the plots.

The reason for doing this is to give a better illustration of the conditions the estimator is working under.


C.1 HAKO Simulation Results

Extra simulation of a turn test to show that the algorithm works very well with the HAKO tractor. The simulation can be compared to [Nielsen & Mejnertsen, 2006, p. 66].

[Plot: HAKO with RTK-GPS and odometric filter - turn test. Axes: World Easting [m] +708090 (UTM32) vs. World Northing [m] +6174170 (UTM32). Legend: GPS measurements; SF GPS and odometry.]

Figure C.1: Simulation of HAKO running turn test maneuver - Position plot.


Extra simulation of a row skip maneuver to show that the algorithm works very well with the HAKO tractor. The simulation can be compared to [Nielsen & Mejnertsen, 2006, p. 66].

[Plot: HAKO with RTK-GPS and odometric filter - row skip. Axes: World Easting [m] +708090 (UTM32) vs. World Northing [m] +6174170 (UTM32). Legend: GPS measurements; SF GPS and odometry.]

Figure C.2: Simulation of HAKO running row skip maneuver - Position plot.


C.2 MMR Real Life Results

The extra MMR test results that are useful for the evaluation of the filter, but not necessary in the report for an understanding of the performance.

C.2.1 Parking lot

Several tests have been carried out, but only the three most important and illustrative have been chosen for this appendix.

[Plot: MMR with GPS and odometric filter long run, parking lot. Axes: World Easting [m] +720400 (UTM32) vs. World Northing [m] +6187500 (UTM32). Legend: GPS measurements; Sensor fused position; MMR compensated position; MMR observed position; kms.dk parking lot.]

Figure C.3: Real life test of MMR 90 m run in parking lot - Position plot.

This plot shows that if the GPS position drifts too fast, the filter is not capable of keeping up. The vehicle should not try this hard to keep up with the GPS. From the true position it is obvious that the filter should put more trust in the odometry to estimate a better heading. However, this is not so simple, since the odometry has rather poor performance in the tests done. Improvement of the results will depend on better odometry.


[Plot: MMR with GPS and odometric filter long run, parking lot. Axes: Time [s] vs. Vehicle heading [rad]. Legend: Sensor fused angle; MMR compensated angle; MMR observed angle.]

Figure C.4: Real life test of MMR 90 m run in parking lot - Heading plot.

The heading is good at first, but it starts to oscillate when the GPS measurement starts to drift. The filter does converge as the simulation indicates, but it is too slow. The plot indicates a convergence to the true observed heading, but in this case it is more likely to be a coincidence. The true heading has a tendency to drift to the right. A 30 deg/min drift has been observed, but it was not identified early enough in the process to investigate further.


[Plot: MMR with GPS and odometric filter fwd 102 m, parking lot. Axes: World Easting [m] +720400 (UTM32) vs. World Northing [m] +6187500 (UTM32). Legend: GPS measurements; Sensor fused position; MMR compensated position; MMR observed position; kms.dk parking lot.]

Figure C.5: Real life test of MMR 102 m run in parking lot - Position plot.

The test run is started with a heading in the opposite direction. This throws the filter totally off balance, and it cannot recover. Tuning of the filter or extra sensory information is needed to correct this poor performance. Note the large 15 m jump of the GPS position at the very start of the test run. This is common when running the MMR in the parking lot, making the test runs very difficult to repeat with the same outcome. The GPS position does, however, have the same trajectory shape as the true position, but only after running 20 m and with an offset.


[Figure omitted in this extraction. Title: "MMR with GPS and odometric filter fwd 102 m, parking lot"; axes: Vehicle heading [rad] vs. Time [s]; curves: Sensor fused angle, MMR internal angle, MMR observed angle.]

Figure C.6: Real life test of MMR 102 m run in parking lot - Heading plot.

The direction is initialized in the opposite direction, and as can be seen it does not recover. Again, a drift on the odometry is visible, which has not been investigated further. Not much more information can be extracted from this plot.


C.2.2 Dyrehaven

The test in Dyrehaven consists of three separate runs, each after a restart of the robot control software. The initial stationary period shows a noise

[Figure omitted in this extraction. Title: "MMR with GPS and odometric filter 1st run, 283m"; axes: World Northing [m] +6189300 (UTM32) vs. World Easting [m] +722150 (UTM32); curves: GPS measurements, Sensor fused position, MMR compensated position, kms.dk road 6m.]

Figure C.7: Real life test of MMR in Dyrehaven 1st run - Position plot.

on the GPS of σ_GPS,northing = 2 m and σ_GPS,easting = 1.2 m. These figures are much better than the results from the parking lot tests. The position converges nicely from 50 to 150 m world easting, but still oscillates. From 150 to 320 m world easting the position estimate oscillations get bigger even though the GPS position characteristic does not change. The deterioration of the estimate is due to changes in the internal position. The loop made by the odometry starts out with a diameter of approximately 130 m, but during the run it narrows to 75 m. This inconsistency with the model, which does not account for model inaccuracy, is the main reason for the deterioration of the estimate.


[Figure omitted in this extraction. Title: "MMR with GPS and odometric filter 1st run, 283m"; axes: Vehicle heading [rad] vs. Time [s]; curves: Sensor fused angle, MMR internal angle, kms.dk road angle.]

Figure C.8: Real life test of MMR in Dyrehaven 1st run - Heading plot.

The heading estimate shows the same tendency as the position plot. The estimate converges until about 1/3 into the run, showing oscillations on the estimate, but no angle offset. The estimate gets worse in the last 2/3 of the run, oscillating with a bigger variance and now with a most undesirable offset. The drift in relative heading is also visible in this run, at a rate of 72 deg/min. This is far from optimal, and a problem with the gyro is the only thing that can account for a drift that large.


[Figure omitted in this extraction. Title: "MMR with GPS and odometric filter 2nd run, 472m"; axes: World Northing [m] +6189200 (UTM32) vs. World Easting [m] +722400 (UTM32); curves: GPS measurements, Sensor fused position, MMR compensated position, kms.dk road 6m.]

Figure C.9: Real life test of MMR in Dyrehaven 2nd run - Position plot.

The position convergence is very satisfactory while the vehicle is stationary at the beginning. The position estimate does however oscillate heavily due to drift of the heading when stationary, see figure C.10. The filter does not recover from the heavy oscillation. The main reason is the choice of variance in the design of the Kalman filter, but the odometry imperfection does not help either. The odometry shows an internal position loop with a diameter of 120 m, which makes the final oscillation on the position much smaller than in the first run, where the diameter was as small as 75 m.


[Figure omitted in this extraction. Title: "MMR with GPS and odometric filter 2nd run, 472m"; axes: Vehicle heading [rad] vs. Time [s]; curves: Sensor fused angle, MMR internal angle, kms.dk road angle.]

Figure C.10: Real life test of MMR in Dyrehaven 2nd run - Heading plot.

The drift in heading, due to stationary estimation at the start of the run (100 s to 300 s), does not make the test run ideal. The estimate of the heading oscillates heavily at the beginning, but converges to the true heading, showing low variance. Again, when the vehicle is stopped at the end, the estimate of the heading diverges due to estimation while the vehicle is stationary (780 s to 930 s). The stationary heading shows a small offset (650 s to 780 s), but it will not be commented upon, since the estimated position is so far off.


[Figure omitted in this extraction. Title: "MMR with GPS and odometric filter 3rd run, 303m"; axes: World Northing [m] +6189200 (UTM32) vs. World Easting [m] +722900 (UTM32); curves: GPS measurements, Sensor fused position, MMR compensated position, kms.dk road 6m.]

Figure C.11: Real life test of MMR in Dyrehaven 3rd run - Position plot.

This plot has been commented on in the main report chapter 8. The test run is very similar to the first test run, page 111, showing the same degradation in the estimate when the internal position based on odometry degrades.


[Figure omitted in this extraction. Title: "MMR with GPS and odometric filter 3rd run, 303m"; axes: Vehicle heading [rad] vs. Time [s]; curves: Sensor fused angle, MMR internal angle, kms.dk road angle.]

Figure C.12: Real life test of MMR in Dyrehaven 3rd run - Heading plot.

This plot has been commented on in the main report chapter 8. Notice the problem with following the road correctly at 330 s. The problem occurred several times when driving, but this is the most visible instance.


[Figure omitted in this extraction. Title: "MMR with GPS and odometric filter Combined run"; axes: World Northing [m] +6189200 (UTM32) vs. World Easting [m] +722400 (UTM32); curves: GPS measurements, Sensor fused position, MMR compensated position.]

Figure C.13: Real life test of MMR in Dyrehaven, all three runs - Position plot.

Combination plot to give an idea of the test runs from the gate at 'Hjortekær' to 'Eremitage Slottet'.


Appendix D

Simulator

This chapter is a summary of the changes made to the simulator in order to track the changes. The secondary objective is to document the implementation and functionality of the simulator. The reader is assumed to have read the original implementation report on the simulator, [Nielsen & Mejnertsen, 2006].

D.1 Vehicles

The changes in the files are shortly described here. The robots are defined as points in a Cartesian coordinate frame. The center of the coordinate system is the odometric center of the robot, when looking at the robots implemented in the simulator, see figure D.1.

SMR The model of the robot has been updated and the IR sensors are now correctly placed; changes in SMR.xml.

MMR The simulator configuration of the MMR is collected from the real robot, placed in MMR.xml, and a test directory is created in the simulator. As with the other robots, the configuration files and calibration of the robot are placed in a directory named after the robot, /MMR.

RangeFinder.java Changed the enumeration enum to enums, because enum has become a reserved word in Java 1.5.

CollisionDetection.java Changed the enumeration enum to enums, because enum has become a reserved word in Java 1.5.

World.java Changed the enumeration enum to enums, because enum has become a reserved word in Java 1.5.

SensorServer.java Removed a print statement.


D. Simulator D.1. Vehicles

[Diagram omitted in this extraction: robot pose (pos, θ) in an x-y frame with numbered sensor positions 0-5.]

Figure D.1: Cartesian coordinate system used in describing the physical representation of the robot. The coordinate frame is used in both the description of the perimeter and sensor positions.


D. Simulator D.2. GpsMouse class

User Interface The current sample time used has been added to the display. This is important since the robots run at different sample times, and if not set correctly the simulation will not show the correct situation.

The connection to the Simulator is what tells it the robot type. Currently the following robots are available in server.xml:

Robot       Port    Sample time
SMR         0 + 1   10 ms
Ackerbot    30      10 ms
MMR         50      10 ms
HAKO        60      100 ms
Lasertest   70      10 ms
gpstest     80      10 ms

Table D.1: Robots available in the simulator.

The robots verified and tested in the simulation are the SMR and MMR. The simulations for the sensor fusion project have been carried out with the HAKO and gpstest, where the gpstest robot is a modified MMR with the added consumer grade GPS.

D.2 GpsMouse class

To make simulations possible a new GPS sensor has been added to the simulator. The class is built upon the UTMgps and LaserScanner classes. The reason for doing this is the reuse of code, which should save time in the development process, leaving more time for the main topic of the thesis, sensor fusion and sensor evaluation.

This also means that the GpsMouse has inherited some less intuitive features, e.g. that the GPS UTM offset relative to the map is defined in the GPS sensor and not in the map definition.

The noise implementation is inherited from the UTMGps sensor, which means that the noise is a Gaussian ("normally") distributed random double with zero mean and a standard deviation of 1. This means that the information from the GPS vendors has to be converted into a standard deviation and not a 95% confidence interval.

At the moment the angle calculated from the GPS is not implemented in the answer given by the GPS sensor.


D. Simulator D.3. Expansions

D.3 Expansions

The simulator has problems running in real time. The entire structure is based on timing from the Linux operating system using time calls through Java. This has proven to be very unreliable, showing different sample times and communication problems when running on different systems, in this case a laptop and a server.

The data placement is currently not as well structured as it should be. The offset that tells the simulator where the GPS unit is situated relative to the left bottom corner is defined in the GPS device and not in the definition of the environment, where it would be more logical. Currently, changing from one environment to another requires changes in different files, one for the simulator environments and one for each robot.

Robots other than the ones used with the smrd-structure are currently implemented. This poses new interesting problems like different sample times, which are not supported by the current simulator. The sample time is not defined for the individual robots, and because of that only one type can be simulated at a time. Again, the change from one robot to another is not done simply by logging on to the server as originally intended, but by closing the server down and changing the sampling time in the configuration file before restarting the simulator.


Appendix E

SMRdemo

This appendix collects the information gathered during the project that concerns the SMRdemo robot control program. One section is dedicated to the real time matrix library study.

Robot Control Software

The robot control program consists of methods to handle sensory input and different algorithms to maneuver the vehicles. On top of this an interpreter is added that interprets SMR-CL [Andersen & Ravn, 2004] commands. These commands enable easy programming of tactical maneuvers directly, or making strategic decisions on a higher level and then feeding the information to the program via a socket connection. All sensory information can be collected from the platform via commands, which enables specialized applications to be constructed on top. Maneuvering the vehicles is made simple by a wide selection of different commands like virtual line1 following and odometric maneuvers. These commands are available for all robots, but specialized features also exist. For the SMR, wall following and following lines of tape on the floor are two examples.

E.1 Matrix Library Study

The objective of this section is to find the fastest, leanest and most real time compatible matrix library for the matrix operations in the Kalman filter. If Kalman filter routines are implemented, they are also checked.

The study of available libraries is done using Google.dk and a real time robotics / real time control homepage, orocos.org.

1Imaginary line in the robot's own impression of the world.


E. SMRdemo E.1. Matrix Library Study

Since this survey is aimed at bringing matrix and vector calculations to the institute toolbox, it is important that it lives up to the standards of real time computing. The process of choosing the right library runs in two stages. First the technical part is dealt with, and secondly the subjective evaluation of the user friendliness. In collaboration with the supervisors the following criteria are set up.

• The size must be at a minimum, so as not to cause too much hassle when compiling and implementing solutions.

• The matrix solution should contain add, multiply, divide, scale, transpose and invert.

• If expansions are supported it is most welcome. At the moment the use is limited to filtering and control, but multi body simulation and computer vision are in the areas of interest.

• The toolbox should at the moment support C, since it is the platform for the institute robots' software. New additions have been made in C++, and this is also important for the toolbox selection.

• The main operating system is Linux, but support for Windows is also important for keeping possibilities open.

• The library must have external support and be a mature product. By mature is meant that there must be several developers and multiple users.

• User friendly interface to the system, in order for new users to accept the solution and not use another alternative.

Five alternatives have been chosen for comparison based on recommendations, rank on search engines (indicating use) and technical specifications. Part of some of the solutions is a numerical solver and linear algebra package named BLAS. This package is interesting for future use and will also be a welcome addition to the future matrix library.

The result of the survey is listed in table E.2 and table E.3 to ease the comparison. The most interesting candidates in the two tables are commented and the best solution is recommended at the end.

Looking at table E.2, OpenCV and GSL are the most interesting. OpenCV has the matrix functions and a Kalman filter, but lacks expandability by other modules. GSL has more possibilities when it comes to expansions, and the Kalman filter has been implemented by others. The reason why most of the libraries fail (Yes in parentheses) the RT criterion is the fact that they use temporary variables in the calculation of the inverse.


Name                         Reference
OpenCV                       [OpenCV, 2006]
GNU Scientific Library       [GSL, 2006]
Meta Matrix Library          [Frochte, 2006]
The Matrix Template Library  [Gottschling, 2006]
IAUmat                       [Ravn et al., 2006]
(BLAS)                       [BLAS, 2006]

Table E.1: Matrix library study - candidates

Name     C/C++  size   RT     KALM   Ext.   LIN/WIN
OpenCV   both   big    (Yes)  Yes    No     Yes
GSL      both   med.   (Yes)  (Yes)  BLAS   Yes
MEML     C      small  No     No     BLAS   (Yes)
MTL      C++    small  (Yes)  No     No     Yes
IAUmat   C      small  Yes    Yes    No     (Yes)

Table E.2: Matrix library study - technical perspective

If the inverse is not needed, the basic matrix operations are soft real time compatible. The only library that promises to be hard real time capable is the IAUmat library.

Name     User friendly  doc     Maturity
OpenCV   no 1           good    good
GSL      yes 3          good    very good
MEML     no 2           bad     medium
MTL      yes 3          medium  bad
IAUmat   yes 3          medium  medium

Table E.3: Matrix library study - subjective evaluation

Looking at the user feel of the libraries, the three parameters User friendliness, Documentation and Maturity are reviewed. Looking at table E.3, OpenCV and GSL are the winners. OpenCV is not meant as a real time control toolbox, and that shows. It is very good for the computer vision for which it was designed, but it is very bulky and not easy to use for control. GSL is a widely used platform for scientific calculations, and the basic matrix calculations are deemed safe for soft real time calculations. There are a lot of other users of this product, both in scientific computing and real time control. The notation of the GSL basic matrix operations is shorter and more intuitive than OpenCV's.

OpenCV, GSL and IAUmat are the final candidates for the new matrix


library. OpenCV has its problems in the computer vision background and it is not as intuitive; on the other hand it is proven to work with the present Kalman filter. GSL is very comprehensive, but it is possible to select the appropriate parts of the library. It has several users in the control segment and offers many new features such as ODE solvers. IAUmat contains all the basic matrix operations and it promises to be real time capable. The library has a short notation, and due to its small size it is interesting. The library supports embedded processors and is capable of running under both Windows and Linux.

To show the user friendliness of the libraries, the demomat program has been implemented in all three libraries, see section E.7.

The GSL library contains the following possibilities, which are not all real time safe. If a very comprehensive tool is needed, GSL is most favorable.

Mathematical Functions

Complex Numbers

Polynomials

Special Functions

Vectors and Matrices

Permutations

Combinations

Sorting

BLAS Support

Linear Algebra

Eigensystems

Fast Fourier Transforms

Numerical Integration

Random Number Generation

Quasi-Random Sequences

Random Number Distributions

Statistics

Histograms

N-tuples

Monte Carlo Integration

Simulated Annealing

Ordinary Differential Equations

Interpolation

Numerical Differentiation

Chebyshev Approximations

Series Acceleration

Wavelet Transforms


E. SMRdemo E.2. Kalman Filter

Discrete Hankel Transforms

One dimensional Root-Finding

One dimensional Minimization

Multidimensional Root-Finding

Multidimensional Minimization

Least-Squares Fitting

Nonlinear Least-Squares Fitting

Physical Constants

IEEE floating-point arithmetic

Debugging Numerical Programs

Contributors to GSL

Autoconf Macros

GSL CBLAS Library

Looking at the requirements, GSL is interesting because it would give a lot of new functionality to the IAU toolbox. The library is however not developed with real time computation in mind, and that is a major drawback. IAUmat is the leanest and the only library that promises real time capabilities. On that background the IAUmat library is chosen for this project.

E.2 Kalman Filter

The implementations of the OpenCV and IAUmat libraries have been visually inspected. As can be seen, both invert a matrix in order to calculate the Kalman gain. OpenCV uses an SVD solver and IAUmat uses Gauss-Jordan elimination to compute the inverse. The two solutions use the textbook solution to finding the Kalman gain. Both of these run in C and C++.
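For reference, the textbook gain both libraries compute requires inverting the innovation covariance; in standard notation (the symbols follow the usual Kalman filter literature, not the report's source code):

```latex
K_k = P_{k|k-1} H_k^{\mathsf{T}} \left( H_k P_{k|k-1} H_k^{\mathsf{T}} + R_k \right)^{-1}
```

It is this matrix inverse that forces the temporary allocations discussed in section E.1.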

If only C++ is used, the Bayesian Filtering Library is very interesting as an expansion to GSL. It is built for mobile robotics and supports both Kalman and particle filters in several versions. The support does not seem extensive, but it is recent and the Ph.D. was done under Herman Bruyninckx's2 supervision. The implementation is based on GSL and uses inversion of matrices, as far as I can see. The library does however support the Extended Kalman Filter, Iterated Extended Kalman Filter, Non Minimal Kalman Filter, Square Root Iterated Kalman Filter (Limited), ASIR Particle Filter, Bootstrap Particle Filter and Optimal Importance Particle Filter. Like the Bayesian Filtering Library, most of the extensions written for GSL filtering purposes are written in C++, which does not conform with the specifications.

2The head of orocos.org


E. SMRdemo E.3. Configuration of SMRdemo by XML

E.3 Configuration of SMRdemo by XML

The idea of using XML for configuration of the SMRdemo robot server is based on the more intuitive data format, with the possibility of easy naming of parameters, exclusion of parameters not known by the calling program, and comments in the configuration file. Solutions for parsing XML files exist, which leaves structuring the data in a sensible way to the programmer.

It should be discussed whether it is a good idea to use the Simulator configuration files for SMRdemo. This would guarantee consistency in the files used, and the XML format supports discarding information that is not needed.

A widely used XML parser is Expat, which is used in programs like the Mozilla web browser and other proven products. Expat will therefore be used in this project.

To make sure the configuration is done properly, a debug tag has been developed.

<debug />

The result is a file output debug.conf in the calling directory, containing the parsed information used in the initial configuration of the robot. Beware that faulty calibration files from the SMRdemo program can ruin the initial configuration.
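As an illustration, a configuration file using the debug tag could look like the following; the surrounding element and attribute names here are invented for the example and are not the actual SMRdemo schema:

```xml
<!-- Hypothetical robot configuration fragment -->
<robot name="MMR">
  <!-- dump the parsed configuration to debug.conf in the calling directory -->
  <debug />
</robot>
```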

E.4 Operating the Kalman Filter

The filter execution can be managed using two SMRdemo commands: one for running the Kalman filter and one for using the output.

set "kalmanon" 1 [Enable Kalman filter]

set "kalmanon" 0 [Disable Kalman filter]

set "usekalmanodo" 1 [Use estimate in the

motion control]

set "usekalmanodo" 0 [Use internal representation

in motion control]

Configuration of the filter is carried out in the robot configuration file. The section of code used is described in the report and in the example robot configuration files included at the back of appendix F and appendix G.


E. SMRdemo E.5. Dynamic Libraries for SMRdemo

E.5 Dynamic Libraries for SMRdemo

New expansions to the SMRdemo program are to be made as dynamic libraries. The reason for not linking the modules into SMRdemo at compile time is that compile time linking opens up for problems with people editing the main program.

E.6 SMRdemo Compile

To compile the SMRdemo program, follow these instructions. Remember to set the library path first:

export LD_LIBRARY_PATH=/usr/local/lib
cd gps
make
cd robotinc
make
cd smrdemo
make


E. SMRdemo E.7. Matrix Operation Implementation

E.7 Matrix Operation Implementation

This section shows the implementation of the demomat program using three different libraries: iau_mat, OpenCV and GSL.

IAUmat implementation

/** \file demomat.c
 * \brief Demonstration of the basic matrix operations in iau_mat.
 * The idea is to use this as a benchmark for the other matrix library
 * candidates. Original demomat.c written by Ole Ravn.
 *
 * \author Lars Valdemar Mogensen
 * \author Ole Ravn
 *
 * \date 05/15-2006
 * \version 1.0
 */

#include <stdio.h>
#include <stdlib.h>
#include <iau_mat.h>

/** \fn main(int argc, char **argv)
 * Test and demo routine for the matrix library.
 * Simple operations are carried out and displayed.
 *
 * \param[in] argc Not used
 * \param[in] **argv Not used
 */
int main(int argc, char **argv)
{
    matrix *a, *b, *c;

    /* Allocate matrices */
    a = mmake(3,3);
    b = mmake(3,3);
    c = mmake(3,3);

    /* Initialise and do calculations */
    minitx(a, 5.5);   /* Initialize all elements to 5.5 */
    mdiag(b, 2.0);    /* Initialize to identity and multiply by 2.0 */
    smul(b, b, 5);    /* Scale b by 5.0 */
    madd(c, a, b);    /* Perform c = a + b */

    /* Output results */
    printf("Matrix a = \n");
    mprint(a);
    printf("Matrix b = \n");
    mprint(b);
    printf("Matrix c = a + b \n");
    mprint(c);

    /* Free memory */
    mfree(a);
    mfree(b);
    mfree(c);

    return 0;
}

Output from the program demomat:

Matrix a =
 5.500000  5.500000  5.500000
 5.500000  5.500000  5.500000
 5.500000  5.500000  5.500000
Matrix b =
10.000000  0.000000  0.000000
 0.000000 10.000000  0.000000
 0.000000  0.000000 10.000000
Matrix c = a + b
15.500000  5.500000  5.500000
 5.500000 15.500000  5.500000
 5.500000  5.500000 15.500000


OpenCV implementation

/** \file demoopencv.c
 * Demonstration of the basic matrix operations in OpenCV.
 * The idea is to implement the essentials of the demomat.c
 * file in the iau_mat library. The main reason is to see
 * if the OpenCV library is sufficiently user friendly.
 * \author Lars Valdemar Mogensen
 *
 * \date 05/14-2006
 */

#include <stdio.h>
#include <stdlib.h>
#include <cv.h>

/* Auxiliary function for printing matrices */
void cvmPrint(CvMat *M)
{
    int i = 0, j = 0;

    for (i = 0; i < M->rows; i++) {
        for (j = 0; j < M->cols; j++) {
            printf("%f ", cvmGet(M, i, j));
        }
        printf("\n");
    }
    printf("\n");
}

int main(int argc, char **argv)
{
    CvMat *a, *b, *c;
    CvScalar alpha;

    /* Allocate matrices - double precision floating point real values */
    a = cvCreateMat(3, 3, CV_64FC1);
    b = cvCreateMat(3, 3, CV_64FC1);
    c = cvCreateMat(3, 3, CV_64FC1);

    /* Initialise and do calculations */
    alpha = cvRealScalar(5.5);
    cvSet(a, alpha, 0);        /* Initialize all elements to 5.5 */

    alpha = cvRealScalar(2.0);
    cvSetIdentity(b, alpha);   /* Initialize to identity and multiply by 2.0 */

    /* Use the matrix-scale-add function to multiply, no scale function found.
       Functions cannot accept empty arguments like NULL or 0.
       There is a cvmScale() function, but it does not compile. */
    {
        CvMat *tmp;
        tmp = cvCloneMat(b);
        cvZero(tmp);

        alpha = cvRealScalar(5.0);
        cvScaleAdd(b, alpha, tmp, b);  /* Perform b = scale*b, via c = scale*a + b
                                          with a zeroed addend */
        cvReleaseMat(&tmp);
    }

    /* Use the matrix-scale-add function to add matrices, no add function found. */
    alpha = cvRealScalar(1.0);
    cvScaleAdd(a, alpha, b, c);  /* Perform c = scale*a + b, with scale = 1.0 */

    /* Output results */
    printf("Matrix a = \n");
    cvmPrint(a);   /* Not enclosed in original package */
    printf("Matrix b = \n");
    cvmPrint(b);
    printf("Matrix c = a + b \n");
    cvmPrint(c);

    /* Free memory */
    cvReleaseMat(&a);
    cvReleaseMat(&b);
    cvReleaseMat(&c);

    return 0;
}


Output from the program demoopencv:

Matrix a =
 5.500000  5.500000  5.500000
 5.500000  5.500000  5.500000
 5.500000  5.500000  5.500000
Matrix b =
10.000000  0.000000  0.000000
 0.000000 10.000000  0.000000
 0.000000  0.000000 10.000000
Matrix c = a + b
15.500000  5.500000  5.500000
 5.500000 15.500000  5.500000
 5.500000  5.500000 15.500000


GSL implementation

/** \file demogsl.c
 * Demonstration of the basic matrix operations in GSL.
 * The idea is to implement the essentials of the demomat.c
 * file in the iau_mat library. The main reason is to see
 * if the GSL library is sufficiently user friendly.
 * \author Lars Valdemar Mogensen
 *
 * \date 05/15-2006
 */

#include <stdio.h>
#include <stdlib.h>
#include <gsl_matrix.h>

int main(int argc, char **argv)
{
    gsl_matrix *a, *b, *c;

    /* Allocate matrices - double precision floating point real values */
    a = gsl_matrix_alloc(3,3);
    b = gsl_matrix_alloc(3,3);
    c = gsl_matrix_alloc(3,3);

    /* Initialise and do calculations */
    gsl_matrix_set_all(a, 5.5);     /* Initialize all elements to 5.5 */

    gsl_matrix_set_identity(b);
    gsl_matrix_scale(b, 2.0);       /* Initialize to identity and multiply by 2.0 */
    gsl_matrix_scale(b, 5.0);       /* Scale b by 5.0 */

    gsl_matrix_memcpy(c, a);        /* Perform c = a + b */
    gsl_matrix_add(c, b);

    /* Output results */
    printf("Matrix a = \n");
    gsl_matrix_fprintf(stdout, a, "\t%f");
    printf("Matrix b = \n");
    gsl_matrix_fprintf(stdout, b, "\t%f");
    printf("Matrix c = a + b \n");
    gsl_matrix_fprintf(stdout, c, "\t%f");

    /* Free memory */
    gsl_matrix_free(a);
    gsl_matrix_free(b);
    gsl_matrix_free(c);

    return 0;
}


Output from the program demogsl:

Matrix a =

5.500000

5.500000

5.500000

5.500000

5.500000

5.500000

5.500000

5.500000

5.500000

Matrix b =

10.000000

0.000000

0.000000

0.000000

10.000000

0.000000

0.000000

0.000000

10.000000

Matrix c = a + b

15.500000

5.500000

5.500000

5.500000

15.500000

5.500000

5.500000

5.500000

15.500000


E. SMRdemo E.8. SMRdemo Flowcharts

E.8 SMRdemo Flowcharts

The flowcharts need a little explanation for the reader to understand the color codes used. As can be seen from figure E.1, the new main loop is a combination of figure E.2 and figure E.3. The last three flowcharts describe the operation of the rest of the code.

Color         Action
Purple        Reading sensors
Green         Transforming data
Light blue    Estimation
Orange        Motion control
Blue          Actuator update
Red           Signal treatment for motion control
Black dotted  Sub block with own flowchart
Black         Not categorized

Table E.4: Flowchart color code


Figure E.1: Robot main loop routine - New.


Figure E.2: smrd controlled main loop routine.

Figure E.3: HAKO main loop routine.


Figure E.4: SMRdemo start routine.


Figure E.5: SMRdemo init routine.


Figure E.6: SMRdemo main loop.


Appendix F

Ackerman Steered Vehicles

This appendix collects the information about the HAKO tractor that one needs to know, but which is nowhere else to be found.

F.1 Modeling

This model is not unique for the HAKO tractor, but holds for all Ackermansteered vehicles when simplified to the uni-cycle model. The continuousmodel of the Ackerman steered vehicle is depicted in figure F.1 to presentthe variables and geometry.

Figure F.1: Ackerman model in continuous time (showing the coordinates x, y, the heading θ, the steering angle φ, the angle α, the turn radius R and the vehicle length L).


F. Ackerman Steered Vehicles F.1. Modeling

F.1.1 Model Dependent Parameters

Length of the vehicle from the axle of the non-steering wheels to the steering axle:

L = 1.22m (F.1)

Continuous Time

The general nonlinear model for systems of this type is usually given as:

ẋ = f(x, u, v_p),   v_p ∈ N(0, R_p)   (F.2)

ȳ = g(x, u, v_m),   v_m ∈ N(0, R_m)   (F.3)

but in the case of the Ackerman steered vehicle it is possible to formulatethe odometric model for the vehicle in continuous time.

The equations that describe the nonlinear continuous system are sum-marized below. x and y are the coordinates for the center point between theback wheels, θ is the heading of the vehicle, φ is the steering angle and v isthe forward speed for the vehicle.

x = [x  y  θ]ᵀ                                    (F.4)

v = robot speed                                   (F.5)

θ̇ = (v / L) · tan(φ)                             (F.6)

ẋ = [cos(θ)  sin(θ)  0]ᵀ · v + [0  0  1]ᵀ · θ̇   (F.7)

F.1.2 Discrete non Linear System Equations

The definitions used in the system equations are presented here, together with the transformations from control signals to system states. The change in vehicle heading δθ_k is calculated from the steer angle φ_k:

δθ_k = (δd_k / L) · tan(φ_k)   (F.8)

The system states are represented as xk and contains the vehicles positionalrepresentation as coordinates x, y and vehicle heading θ.

x_k = [x  y  θ]ᵀ_k   (F.9)


Figure F.2: Ackerman model in discrete time (showing the poses (x_k, y_k) and (x_{k+1}, y_{k+1}), the increments ∆x, ∆y, ∆θ, the steering angle φ_k, the turn radius R and the vehicle length L).

The system control input is represented as uk and contains the covereddistance in one sample δdk and the steer angle φk.

u_k = [δd_k  φ_k]ᵀ   (F.10)

Combining all this into an odometric model of the vehicle gives the followingmodel.

x_{k+1} = x_k + δd_k · cos(θ_k + δθ_k/2)   (F.11)

y_{k+1} = y_k + δd_k · sin(θ_k + δθ_k/2)   (F.12)

θ_{k+1} = θ_k + δθ_k                       (F.13)

F.1.3 Linear System Matrices

In the derivation of the Jacobians a simplification of the model has been made: the half-angle term in the time update for the position has been omitted in the differentiation.

F_k = ∂f/∂x evaluated at x_k = x_k⁺, v_p = 0:

      [ 1   0   −δd_k·sin(θ_k) ]
F_k = [ 0   1    δd_k·cos(θ_k) ]   (F.14)
      [ 0   0         1        ]

G_k = ∂f/∂u evaluated at x_k = x_k⁺, v_p = 0:

      [ cos(θ_k)         0           ]
G_k = [ sin(θ_k)         0           ]   (F.15)
      [ tan(φ)/L   δd_k/(L·cos²(φ))  ]


G_vp = ∂f/∂v_p evaluated at x_k = x_k⁺:

       [ cos(θ_k)         0           ]
G_vp = [ sin(θ_k)         0           ]   (F.16)
       [ tan(φ)/L   δd_k/(L·cos²(φ))  ]

C_k = ∂g/∂x evaluated at x_k = x_k⁺, v_m = 0:

      [ 1  0  0 ]
C_k = [ 0  1  0 ]   (F.17)

D_k = ∂g/∂u evaluated at x_k = x_k⁺, v_m = 0:

      [ 0  0 ]
D_k = [ 0  0 ]   (F.18)

G_vm = ∂g/∂v_m evaluated at x_k = x_k⁺:

       [ 1  0 ]
G_vm = [ 0  1 ]   (F.19)


F. Ackerman Steered Vehicles F.2. HAKO Commands

F.2 HAKO Commands

The HAKO tractor has some commands that are not described in the available SMR-CL manual. The commands are implemented to make the HAKO capable of driving several runs without restarting the SMRdemo server.

F.3 Driving the HAKO

For running the HAKO tractor, the physical HAKO or the simulator must be running. The software control daemon is then started, and after that the web page is loaded.

html/hakoctl/bin

set ”hakomanual” 0 defines whether the tractor is in manual mode (1) or in automatic mode (0). To drive the tractor in the simulator the value must be automatic mode, 0.

F.4 Logging Data

These commands are for the data logging feature. The reason for implementing this is the need for several log files, and for being able to drive the HAKO around between tests without logging being performed.

log ”$var” ”..” is the normal definition of the variables to log. This isused to initialize the log.

control ”startlog” is used to start logging the data.

control ”savelog” is used to stop logging the data.

control ”resetlog” is used to clear the present log.

control ”stoplog” is used to stop logging data, but not delete the presentlog.

control ”removelogvars” frees the memory used for storing the data forthe log.

After this routine the log can be initialized again and another batch of data collected. The current implementation only supports these features for the HAKO tractor.
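A typical use of these commands in an SMR-CL script might look as follows. This is only a sketch: the logged variable names and the comment syntax are illustrative, so check the SMR-CL manual for the exact forms.

```text
log "$odox" "$odoy"      % declare the variables to log (names are examples)
control "startlog"       % start collecting samples
...                      % driving commands for this test run
control "savelog"        % stop logging and save the data
control "removelogvars"  % free the log memory before defining a new log
```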


F. Ackerman Steered Vehicles F.5. XML Configuration HAKO Tractor

F.5 XML Configuration HAKO Tractor

<?xml version="1.0" ?>
<!--
Configuration file for SMRdemo
This file is for the HAKO tractor

Robot types
  differential, ackerman

Names
  smr, mmr, ackerbot, hako
-->
<robotinfo
  type="ackerman"
  name="hako"
>
<debug/>
<odometry
  cl          ="0.07"
  cr          ="0.000"
  w           ="0.26"
  robotlength ="1.22"
  ts          ="0.1"
  maxtick     ="1000"
  control     ="0"
  enctype     ="1"
/>

<!--
Information to connect to the Sensor Fusion kalman filter.
  type: Specifies the filter type, but is currently not used.
  run:  Flag to indicate if the filter should run.
  use:  Flag to indicate if the estimate should be used.

  measurementnoise_std_x:  Standard deviation on GPS noise easting
  measurementnoise_std_y:  Standard deviation on GPS noise northing
  process_noise_std_angle: Standard deviation on angle noise
  process_noise_std_dist:  Standard deviation on driven distance
  gps_offset_easting:  Parameter to be used for indicating approximate position of GPS
  gps_offset_northing: Parameter to be used for indicating approximate position of GPS
-->
<filter
  type="EKF"
  run ="1"
  use ="1"
  measurementnoise_std_x ="0.1"
  measurementnoise_std_y ="0.11"
  process_noise_std_stearing_angle ="0.01"
  process_noise_std_dist ="0.02"
  gps_offset_easting  ="700000"
  gps_offset_northing ="6100000"
></filter>

<motioncontrol
  ts           ="0.1"
  line_gain    ="0.1"
  line_tau     ="0"
  line_alfa    ="0.04"
  wall_gain    ="1"
  wall_tau     ="0.7"
  wall_alfa    ="0.2"
  drive_kdist  ="1.0"
  drive_kangle ="2.0"
  gain         ="0"
  tau          ="0.16"
  alfa         ="0.2"
  w            ="0.26"
  robotlength  ="1.22"
  lim          ="0.38"
  stopdist     ="0.18"
  alarmdist    ="0.1"
  velcmd       ="0.5"
  acccmd       ="0.5"
  nolinedist   ="0.2"
/>

<motorcontrol
  velscalel ="1"
  velscaler ="1"
  kp        ="66"
  ki        ="5"
/>

</robotinfo>


Appendix G

Differentially SteeredVehicles

This section is dedicated to the information needed for operating the SMRs and MMRs that is otherwise difficult to acquire.

G.1 Modeling

This model is not unique for the SMR vehicle, but holds for all differentiallysteered vehicles. The continuous model of the differentially steered vehicleis depicted in figure G.1 to present the variables and geometry.

Figure G.1: Differential model in continuous time (showing the coordinates x, y, the heading θ, the wheel distance B and the wheel diameters d_wl and d_wr).

The equations used in the continuous model are given in equations G.2


G. Differentially Steered Vehicles G.1. Modeling

to G.4. The states in the model x and y are the coordinates for the center point between the back wheels, and θ is the heading of the vehicle. Input signals to the model are θ̇, the change in angle, and v, the forward speed of the vehicle. The following model dependent parameters are used: v_wl and v_wr are the left and right wheel speeds, calculated from the wheel encoder movement scaled by the wheel diameters d_wl and d_wr. B is the wheel distance, used for calculating the turn rate of the vehicle.

x = [x  y  θ]ᵀ                                    (G.1)

v = (v_wr + v_wl) / 2                             (G.2)

θ̇ = (v_wr − v_wl) / B                            (G.3)

ẋ = [cos(θ)  sin(θ)  0]ᵀ · v + [0  0  1]ᵀ · θ̇   (G.4)

G.1.1 Model Dependent Parameters

In the vehicles used, the system dependent parameters have already been applied by the SMRdemo subprogram and are therefore not needed in the filter calculations.

G.1.2 Discrete non Linear System Equations

The definitions used in the system equations are presented here, together with the transformations from control signals to system states. The values used as input to the filter are already available in the internal robot struct, and do not need to be calculated as is the case with the Ackerman steered vehicle, see appendix F.

The system states are represented as xk and contains the vehicle odometricrepresentation as coordinates x, y and vehicle heading θ.

x_k = [x  y  θ]ᵀ_k   (G.5)

The system control input is represented as uk and contains the covereddistance δdk and the change in rotation of the robot δθk.

u_k = [δd_k  δθ_k]ᵀ   (G.6)


Figure G.2: Differential model in discrete time (showing the poses (x_k, y_k) and (x_{k+1}, y_{k+1}), the increments ∆x, ∆y, ∆θ, the heading θ, the wheel distance B and the wheel diameters d_wl and d_wr).

Combining all this into an odometric model of the vehicle gives the followingmodel.

x_{k+1} = x_k + δd_k · cos(θ_k + δθ_k/2)   (G.7)

y_{k+1} = y_k + δd_k · sin(θ_k + δθ_k/2)   (G.8)

θ_{k+1} = θ_k + δθ_k                       (G.9)

This is the same odometric model as for the Ackerman steered vehicle, but since the input vector to the model is not the same, the linear model used in the filter algorithms will not be entirely the same.

G.1.3 Linear System Matrices

In the derivation of the Jacobians a simplification of the model has been made: the half-angle term in the time update for the position has been omitted in the differentiation.

      [ 1   0   −δd_k·sin(θ_k) ]
F_k = [ 0   1    δd_k·cos(θ_k) ] = ∂f/∂x   (G.10)
      [ 0   0         1        ]

      [ cos(θ_k)   0 ]
G_k = [ sin(θ_k)   0 ] = ∂f/∂u             (G.11)
      [    0       1 ]


       [ cos(θ_k)   0 ]
G_vp = [ sin(θ_k)   0 ] = ∂f/∂v_p   (G.12)
       [    0       1 ]

      [ 1  0  0 ]
C_k = [ 0  1  0 ] = ∂g/∂x           (G.13)

      [ 0  0 ]
D_k = [ 0  0 ] = ∂g/∂u              (G.14)

       [ 1  0 ]
G_vm = [ 0  1 ] = ∂g/∂v_m           (G.15)

G.2 SMR Simulator

The tests of the SMRs in the simulator were not optimal in the first runs. To get around this problem the following standard values can be used, to make the simulation and the internal representation alike.

G.2.1 hostname demo odo calib.dat

The format of the file is.

(wheelbase [m]) (relative difference in wheel radii)...

(encoder resolution in [m/tick])

The default values used in this project are.

0.260000 1.000000 0.00010245

G.2.2 hostname demo ir calib.dat

The format in this file is.

offset scaling


Sensor number  Position
0              Left side
1              Front left side
2              Front middle
3              Front right side
4              Right side
5              Rear

Table G.1: IR distance sensor position

This is repeated for all the IR distance sensors used on the robot. The calibration file supports 6 sensors. The default values used in this project are:

15.00 0.00

15.00 0.00

15.00 0.00

15.00 0.00

15.00 0.00

15.00 0.00

The sensor position is defined as listed in table G.1.


G.2.3 hostname demo ls calib.dat

The format of this file is.

calibration value color and sensor number

The values used in the simulations are.

70.00 ;max_white(1)

70.00 ;max_white(2)

70.00 ;max_white(3)

70.00 ;max_white(4)

70.00 ;max_white(5)

70.00 ;max_white(6)

70.00 ;max_white(7)

70.00 ;max_white(8)

50.00 ;max_black(1)

50.00 ;max_black(2)

50.00 ;max_black(3)

50.00 ;max_black(4)

50.00 ;max_black(5)

50.00 ;max_black(6)

50.00 ;max_black(7)

50.00 ;max_black(8)


G.3 XML Configuration

G.3.1 Small Mobile Robot

Example configuration file for the SMR, which does not contain a sensorfusion Kalman filter.

<?xml version="1.0" ?>
<!--
Configuration file for SMRdemo
This file is for the SMR

Robot types
  differential, ackerman

Names
  smr, mmr, ackerbot, hako
-->
<robotinfo
  type="differential"
  name="smr"
>
<debug/>

<odometry
  cl      ="0.00010245"
  cr      ="0.00010245"
  w       ="0.26"
  ts      ="0.01"
  maxtick ="1300"
  control ="0"
/>

<motioncontrol
  ts           ="0.01"
  line_gain    ="0.05"
  line_tau     ="20"
  line_alfa    ="0.3"
  wall_gain    ="1"
  wall_tau     ="0.7"
  wall_alfa    ="0.2"
  drive_kdist  ="3.2"
  drive_kangle ="1.5"
  gain         ="1"
  tau          ="0.16"
  alfa         ="0.2"
  w            ="0.26"
  lim          ="0.2"
  stopdist     ="0.18"
  alarmdist    ="0.1"
  velcmd       ="0.3"
  acccmd       ="0.3"
  nolinedist   ="0.2"
/>

<motorcontrol
  velscalel ="100"
  velscaler ="100"
  kp        ="66"
  ki        ="5"
/>

<linesensor
  size  ="8"
  kfilt ="0.8"
/>

<irsensor
/>

</robotinfo>


G.3.2 Medium Mobile Robot

Example configuration file for the MMR, which contains a sensor fusionKalman filter tuned for the MMR.

<?xml version="1.0" ?>
<!--
Configuration file for SMRdemo
This is for the MMR robot

Robot types
  differential, ackerman

Names
  smr, mmr, ackerbot, hako
-->
<robotinfo
  type="differential"
  name="mmr">

<debug/>

<odometry
  cl      ="0.00011505"
  cr      ="0.00011505"
  w       ="0.46"
  ts      ="0.01"
  maxtick ="256"
  control ="0"
/>

<motioncontrol
  ts           ="0.01"
  line_gain    ="0.05"
  line_tau     ="20"
  line_alfa    ="0.3"
  wall_gain    ="1"
  wall_tau     ="0.7"
  wall_alfa    ="0.2"
  gain         ="1"
  tau          ="0.16"
  alfa         ="0.2"
  w            ="0.46"
  lim          ="0.5"
  stopdist     ="0.4"
  alarmdist    ="0.1"
  velcmd       ="0.3"
  acccmd       ="0.3"
  nolinedist   ="0.2"
  drive_kdist  ="3.2"
  drive_kangle ="1.5"
/>

<motorcontrol
  velscalel ="27.4"
  velscaler ="27.4"
  kp        ="66"
  ki        ="5"
/>

<!--
Information to connect to the Sensor Fusion kalman filter.
  type: Specifies the filter type, but is currently not used.
  run:  Flag to indicate if the filter should run.
  use:  Flag to indicate if the estimate should be used.

  measurementnoise_std_x:  Standard deviation on GPS noise easting
  measurementnoise_std_y:  Standard deviation on GPS noise northing
  process_noise_std_angle: Standard deviation on angle noise
  process_noise_std_dist:  Standard deviation on driven distance
  gps_offset_easting:  Parameter to be used for indicating approximate position of GPS
  gps_offset_northing: Parameter to be used for indicating approximate position of GPS
-->
<filter
  type="EKF"
  run ="1"
  use ="0"
  measurementnoise_std_x  ="3"
  measurementnoise_std_y  ="3"
  process_noise_std_angle ="0.001"
  process_noise_std_dist  ="0.002"
  gps_offset_easting  ="700000"
  gps_offset_northing ="6100000"
></filter>

<!--
Information to connect to the UTMgpsd server.
  hostname: specifies the hostname of the server where the UTMgpsd server is running
  port:     Port number
  SBAS:     Define if the WAAS/EGNOS augmentation system is supposed to be turned on/off [0/1]
-->
<gpsmouse
  hostname ="localhost"
  port     ="9500"
  SBAS     ="0"
  run      ="1"
  use      ="1"
/>

</robotinfo>


Appendix H

GPS under Linux

Efforts have been made to find a GPS client/server with the following properties:

• Support of the NMEA-0183 protocol which is the standard protocolfor cheap GPS receivers.

• Socket interface. This is to keep the SMRdemo interfacing simple.

• UTM coordinate system as output.

H.1 Linux Distribution GPS Server

There are several systems available, but they do not support UTM as output. The primary candidate to serve as a GPS server is gpsd, and the client used to test the system is xgps. The program works very well, but it outputs the Lat/Lon coordinate system, which is not directly compatible with the normal Cartesian coordinate system in meters usually used in robotics. UTM is such a system, and a server supporting it would be more desirable.

The test of gpsd and xgps was performed in the parking lot at building 326 on DTU. The programs were downloaded with apt, and a GPS mouse model GM-210-UGR-1 from HOLUX was used as receiver.

Operation:

dmesg (to find the port to which the GPS is connected)

gpsd -N -n (-D2) -p /dev/ttyUSB0 (/dev/ttyUSB0 is the port)

xgps


The -D2 option shows debug output from the server, in this case the NMEAoutput which is very good for monitoring the status of the GPS.

The server is interesting, but it is not chosen because of the lack ofUTM output.

H.2 In-house GPS Server

One server supporting UTM coordinates has been written in Niels Holmgaard's master's thesis [Holmgaard, 2004]. The structure is not in compliance with the SMRdemo idea, since the server works by broadcast and is limited to the robot subnet, according to Ph.D. Jens Christian Andersen. The major problem in using this server is that it does not support EGNOS, and several other pieces of information, such as the satellites used, are interesting but not available. Time has been spent looking for a way to use the existing code, but it is very entangled, and new functionality is not easy to implement.

Inspection of log files recorded by IAU Ph.D. Jens Christian Andersen has shown that there is an error in the Lat/Lon to UTM transformation, resulting in an error of roughly 1.8 km, mainly in the northing coordinate.

The result of this evaluation of the existing server is that a new server will be proposed. The server is to be based on a structure that is in compliance with the robot control software, uses a correct Lat/Lon to UTM conversion, and is capable of using the EGNOS feature of the receiver.

H.3 UTMgpsd

To get the right facilities in the GPS software, with the best documented and easiest-to-use interface, it was decided to write a new server. The objective is to write it in compliance with the SMRdemo standard and to make the use of the GPS invisible to the normal user.

H.3.1 Socket Server

The framework for the socket server is taken from the master project [Nielsen & Mejnertsen, 2006], but modified with a new protocol. There are some stability problems when a client is disconnected, but this has not been addressed, since the GPS server is started separately in every sample. Handling of this problem is currently done by manually making sure the server is running when a client is to be connected.


H.3.2 GPS Communication

The GPS communication is done by opening a serial connection to the GPSreceiver. Parsing the text strings received from the GPS is based on aprotocol description from [Zogg, 2002], but a more detailed description ofthe parsing is available in the source code.

H.3.3 EGNOS

One of the reasons for writing the new GPS server was the absence of SBAS capability in the accessible servers. Enabling the European SBAS system EGNOS in the receiver is done by sending the receiver a text string. There are different enabling strings for different GPS receivers, so this solution is for the HOLUX GM-210 and receivers running the same protocol.

Command

$PSRF108,01*03\r\n "Enabling the EGNOS feature"

$PSRF108,00*02\r\n "Disabling the EGNOS feature"

Reply

$PSRF151,01*0F\r\n "EGNOS feature enabled"

$PSRF151,00*0E\r\n "EGNOS feature disabled"

The EGNOS SIS is broadcast by the ESA satellite ARTEMIS (PRN 124/ GID37), Inmarsat AOR-E (PRN120 / GID23) and IOR-W (PRN126 /GID39) [ESA, 2006], [Kohne & Woßner, 2005] and [GARMIN, 2000].
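The two-digit hexadecimal value after the '*' in these strings is the standard NMEA-0183 checksum: an XOR over all characters between '$' and '*'. A small sketch, useful when adapting the enable string for other receivers (the function name is illustrative):

```c
/* Compute the NMEA-0183 checksum: XOR of all characters between
 * '$' and '*'. The result is printed as two hex digits after '*'. */
unsigned char nmea_checksum(const char *sentence)
{
    unsigned char sum = 0;
    if (*sentence == '$')
        sentence++;                     /* skip the leading '$' */
    while (*sentence && *sentence != '*')
        sum ^= (unsigned char)*sentence++;
    return sum;
}
```

For example, nmea_checksum("$PSRF108,01") returns 0x03 and nmea_checksum("$PSRF108,00") returns 0x02, matching the enable and disable commands listed above.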

H.4 Libgps

Libgps is the interface library needed to make use of UTMgpsd server aseasy as possible. The connection to SMRdemo is explained in the on-linedocumentation for the program.

H.5 Gpsclient

To debug the UTMgpsd server and to make the server data accessible, a GPS client has been written. The source code for the GPS client should also serve as an example of how to interface to the UTMgpsd server by using the routines in libgps.a. The user interface is made using the Linux ncurses library, and can be seen in figure H.1.

H.6 Test Results

Figure H.1: GPS client for debug and GPS measurement collection.

The system has been tested in real life, and the result is not entirely as expected. The testing period was approximately 5 minutes for the non-EGNOS test and 8 minutes for the EGNOS test.

The idea was that EGNOS should give a better result, but this was not the case. The test is preliminary because of its short time horizon. The EGNOS feature is implemented in the client/server system, but activating it is the user's responsibility.

EGNOS   σ_easting   σ_northing
on      1.48 m      1.26 m
off     1.28 m      0.98 m

Table H.1: GPS noise recorded in the initial phase of the project.


Figure H.2: Test of GPS with EGNOS disabled. (Scatter plot of World Easting [m] +720100 (UTM32) versus World Northing [m] +6187510 (UTM32), GPS test 24/04-06.)

Figure H.3: Test of GPS with EGNOS enabled. (Scatter plot of World Easting [m] +720100 (UTM32) versus World Northing [m] +6187510 (UTM32), GPS test 24/04-06.)


Figure H.4: Test of GPS, comparing the results of the two tests, with and without EGNOS. (Scatter plot of World Easting [m] +720100 (UTM32) versus World Northing [m] +6187510 (UTM32), GPS test 24/04-06.)


H.7 Lat/Lon to UTM Algorithm

int latlon2UTM(gpstype* gps, int zone)
{
    // Internal variables
    double lat_mm=0, lat_dd=0, lat_ss=0, lon_mm=0, lon_dd=0, lon_ss=0;
    double lat_deg=0, lon_deg=0, lat=0, lon=0, zone_CM=0, delta_lon=0;

    // Datum constants (WGS84)
    double a = 6378137.0;                 // Semi-major axis [m]
    double b = 6356752.314;               // Semi-minor axis [m]
    double k0 = 0.9996;                   // Scale factor
    double e = sqrt(1-pow((b/a),2));      // Eccentricity
    double e_2 = e*e/(1-e*e);             // Second eccentricity squared
    double n = (a-b)/(a+b);
    double nu = 0;

    // Coefficients for the meridional arc length
    double A0 = a*(1-n + (5*n*n/4)*(1-n) + (81*pow(n,4)/64)*(1-n));
    double B0 = (3*a*n/2)*(1 - n - (7*n*n/8)*(1-n) + 55*pow(n,4)/64);
    double C0 = (15*a*n*n/16)*(1 - n + (3*n*n/4)*(1-n));
    double D0 = (35*a*pow(n,3)/48)*(1 - n + 11*n*n/16);
    double E0 = (315*a*pow(n,4)/512)*(1-n); // 315/512 per the standard series
    double S = 0;

    // Calculation constants
    double p = 0, sin_1 = 0;

    // Coefficients for UTM coordinates
    double Ki=0, Kii=0, Kiii=0, Kiv=0, Kv=0;

    if (gps->valid == 1) {
        // Split the NMEA ddmm.mmmm format into degrees and minutes
        lat_mm = fmod(gps->latlon.lat, 100);
        lat_dd = (gps->latlon.lat - lat_mm)/100;
        lon_mm = fmod(gps->latlon.lon, 100);
        lon_dd = (gps->latlon.lon - lon_mm)/100;

        if (gps->latlon.north_lat == 'N')
            lat_deg = lat_dd + lat_mm/60 + lat_ss/3600;
        else
            lat_deg = -(lat_dd + lat_mm/60 + lat_ss/3600);

        if (gps->latlon.east_lon == 'E')
            lon_deg = lon_dd + lon_mm/60 + lon_ss/3600;
        else
            lon_deg = -(lon_dd + lon_mm/60 + lon_ss/3600);

        lat = lat_deg*M_PI/180;
        lon = lon_deg*M_PI/180;

        zone_CM = 6*zone - 183;     // Central meridian of the zone [deg]
        delta_lon = lon_deg - zone_CM;
        nu = a/sqrt(1-pow((e*sin(lat)),2));

        // Calculate meridional arc length
        S = A0*lat - B0*sin(2*lat) + C0*sin(4*lat) -
            D0*sin(6*lat) + E0*sin(8*lat);

        // Calculate constants
        p = delta_lon*3600/10000;
        sin_1 = M_PI/(180*3600);

        // Coefficients for UTM coordinates
        Ki = S*k0;
        Kii = nu*sin(lat)*cos(lat)*pow(sin_1,2)*k0*1e8/2;
        Kiii = ((pow(sin_1,4)*nu*sin(lat)*pow(cos(lat),3))/24)*
               (5 - pow(tan(lat),2) + 9*e_2*pow(cos(lat),2) +
                4*e_2*e_2*pow(cos(lat),4))*k0*1e16;
        Kiv = nu*cos(lat)*sin_1*k0*1e4;
        Kv = pow(sin_1*cos(lat),3)*(nu/6)*(1 - pow(tan(lat),2) +
             e_2*pow(cos(lat),2))*k0*1e12;

        // Transfer the data from latlon struct to UTM struct
        gps->UTM.northing = Ki + Kii*p*p + Kiii*pow(p,4);
        gps->UTM.easting = 500000 + (Kiv*p + Kv*pow(p,3));
        gps->UTM.valid = gps->latlon.valid;
        gps->UTM.quality = gps->latlon.quality;
        gps->UTM.satellites = gps->latlon.satellites;
        gps->UTM.dop = gps->latlon.dop;
        gps->UTM.time_h = gps->latlon.time_h;
        gps->UTM.time_m = gps->latlon.time_m;
        gps->UTM.time_s = gps->latlon.time_s;
        gps->UTM.time_cs = gps->latlon.time_cs;
        gps->UTM.date_d = gps->latlon.date_d;
        gps->UTM.date_m = gps->latlon.date_m;
        gps->UTM.date_y = gps->latlon.date_y;
        gps->UTM.height = gps->latlon.height - gps->latlon.height2;

        if (PRINT_INFO) {
            print_UTM(&gps->UTM);
            print_status(&gps->status);
        }
        return 0;
    }
    else
        return -1;
}


Appendix I

Kalmtool v.4.2

Status report on the test, documentation and evaluation of Kalmtool for use in the project.


Appendix J

CD

J.1 Contents

The CD contains the following directories:

thesis    See section J.1.1.
models    See section J.1.2.
data      See section J.1.3.
src       See section J.1.4.

In the root directory of the CD a file called doxygen.html is situated. It provides easy access to the Doxygen documentation generated for the developed software, which gives an easy overview of the source files.

J.1.1 Thesis

This directory contains the thesis.

Files          Description
thesis.pdf     Thesis in PDF format.
kalmtool.pdf   Kalmtool appendix in PDF format.

J.1.2 Models

This directory contains the initial verification of the Ackerman and differential models against the existing models.

Files           Description
ackerman.m      Simulation of the Ackerman model.
differential.m  Simulation of the differential model.


J. CD J.1. Contents

J.1.3 Data

The real-life test results are listed at the top. Simulations for the different robots are in separate directories. MATLAB files are included to make viewing possible.

Directory      Description
06042-gps      Initial GPS test.
28062006       Measurement of parking lot and lawn for test and simulation.
15072006 1st   Initial test with the MMR in the parking lot.
15072006 2st   Second test with the MMR in the parking lot.
15072006 3st   Third test with the MMR in the parking lot.
18072006       Test run in Dyrehaven.
23072006       Driving with tuned parameters.
SMR            Simulated data for the Small Mobile Robot.
MMR            Simulated data for the Medium Mobile Robot.
HAKO           Simulated data for the HAKO tractor.
christian      Real-life data from Jens Christian Andersen.

J.1.4 Source

All of the source code for the programs is contained in subdirectories. The on-line documentation is placed in the source directory on the CD.

Directory     Description
smrdemo       Source files for the Robot Control Program, the Sensor Fusion
              library and the GPS library libgps. The code is situated in
              /src/smrdemo/robot2005/src/ and under the following
              subdirectories: smrdemo, filter and gps. Documentation is
              available in the source directories.
simulator     Source files and documentation for the Multi Robot Simulator.
gpsclient     Source files and documentation for the GPS client.
UTMgpsd       Source files and documentation for the GPS server.
iau mat       Source files and documentation for the real-time matrix library.
              The library is the one used for the Kalman filter.
opencv demo   Source files for the test program used in the matrix library
              evaluation. Open Computer Vision implementation.
gsl demo      Source files for the test program used in the matrix library
              evaluation. Gnu Scientific Library implementation.


J. CD J.2. CD

J.2 CD
