
Tracking Drone Orientation with Multiple GPS Receivers

Mahanth Gowda, University of Illinois (UIUC), Urbana, IL, USA, [email protected]
Justin Manweiler, IBM Research, Yorktown Heights, NY, USA, [email protected]
Ashutosh Dhekne, University of Illinois (UIUC), Urbana, IL, USA, [email protected]
Romit Roy Choudhury, University of Illinois (UIUC), Urbana, IL, USA, [email protected]
Justin D. Weisz, IBM Research, Yorktown Heights, NY, USA, [email protected]

ABSTRACT
Inertial sensors continuously track the 3D orientation of a flying drone, serving as the bedrock for maneuvers and stabilization. However, even the best inertial measurement units (IMU) are prone to various types of correlated failures. We consider using multiple GPS receivers on the drone as a failsafe mechanism for IMU failures. The core challenge is in accurately computing the relative locations between each receiver pair, and translating these measurements into the drone's 3D orientation. Achieving IMU-like orientation requires the relative GPS distances to be accurate to a few centimeters – a difficult task given that GPS today is only accurate to around 1-4 meters. Moreover, GPS-based orientation needs to be precise even under sharp drone maneuvers, GPS signal blockage, and sudden bouts of missing data. This paper designs SafetyNet, an off-the-shelf GPS-only system that addresses these challenges through a series of techniques, culminating in a novel particle filter framework running over multi-GNSS systems (GPS, GLONASS, and SBAS). Results from 11 sessions of 5-7 minute flights report median orientation accuracies of 2° even under overcast weather conditions. Of course, these improvements come at the cost of multiple GPS receivers; however, when safety is at stake, we believe the tradeoff is worthwhile.

CCS Concepts
• Computer systems organization → Embedded and cyber-physical systems;

Keywords
Differential GPS; Carrier Phases; UAV; Drones; IMU

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].
MobiCom 2016, October 03-07, 2016, New York City, NY, USA
© 2016 ACM. ISBN 978-1-4503-4226-1/16/10...$15.00
http://dx.doi.org/10.1145/2973750.2973768

1. INTRODUCTION
Despite excitement, we are not yet ready for a world of drones flying among people. Challenges of liability, regulation, and limited societal acceptance all arise because drones are still too dangerous for responsible use in public places. A capstone obstacle is the untrustworthiness of inertial sensors [1–5].

A drone's inertial sensors, collectively called the Inertial Measurement Unit (IMU), are akin to a human's inner ear. Without a good awareness of orientation and balance, a drone cannot effectively modulate current to its rotors – even hovering becomes impossible. In the best case, the drone may not take off, or may fall vertically from the sky. The worst case is when the drone tries to correct for its poor sensory inputs, and accelerates uncontrollably in an unintended direction with even greater force. For a package delivery drone measured in 10s of kilograms, this might be deadly.

A natural response is to install redundant IMUs (i.e., accelerometer, gyroscope, and compass). Given that each sensor is relatively small, light, and inexpensive, this is an easy fix – some commercial drones have already adopted it [6, 7]. Unfortunately, redundancy only addresses unreliable hardware. Correlated noise sources, including motor vibration, electromagnetic interference, and ambient ferromagnetic influences, remain serious concerns [8–12].

To provide stronger IMU failover guarantees, we consider a GPS-based approach completely orthogonal to the fundamental nature of IMU sensing. We install 4 GPS receivers at the corners of a single drone – upon IMU failure, we utilize these GPS receivers to estimate the drone's 3D orientation. Put differently, we apply GPS to estimate pitch, roll, and yaw, without any inertial or magnetometer assistance.

The drone's pitch, roll, and yaw can be expressed as a function of the pairwise 3D vectors between the GPS receivers – as the drone flies, these vectors change with respect to the Earth's reference frame. SafetyNet's core task is to precisely estimate these 3D vectors. To achieve IMU-like accuracy (≈ 1°), the vectors need to be estimated at centimeter-scale precision. This precision needs to be upheld constantly over time, across agile flight patterns, poor satellite SNR, and even short periods of missing data.


Recent research [13, 14] has improved on Differential GPS (DGPS) to achieve ≈ 15 cm error for relative localization. This is adequate for many on-road vehicular applications, such as lane change detection and peer-to-peer car coordination. However, the accuracy requirement for drone orientation is much higher. Multi-GPS orientation estimation techniques have been proposed for ships, planes, and low-dynamic drone flights [15–17]. However, extending them to commercial drones of small form factor and rapidly changing orientation is non-trivial; the demands are far stricter, both in terms of accuracy and time-granularity of tracking. Finally, the estimation techniques must be lightweight enough to serve as a real-time IMU replacement during flight.

SafetyNet adopts ideas and techniques from the mature GPS literature, and builds on them to mitigate these challenges. The core additions, although closely interspersed with existing techniques, can be distilled as follows:

(1) Manipulating measurements across pairs of GPS receivers, satellites, and consecutive time points, to ultimately capture the 3D orientation of the drone from 2 different perspectives; then using these 2 perspectives to formulate an estimation problem amenable to Kalman Filtering (KF). While the GPS literature is replete with KF and measurement manipulations (called "double differentials"), SafetyNet combines the double differentials in a way that is novel to the best of our knowledge.

(2) GPS phase measurements contain an unknown component, called "integer ambiguity". In attempting to resolve this ambiguity, past work has adopted techniques akin to "hard decoding", where the most likely state estimate is propagated across time. We design the equivalent of "soft decoding", whereby the top-K possibilities of the ambiguity are propagated, each associated with an inferred probability. A particle filter is used to execute this idea – the particle filter degenerates back into the Kalman Filter when the ambiguity is resolved confidently.

(3) We break away from the classical particle filter approach and "adjust" the state of the particles based on available measurements. This speeds up convergence of the system, while requiring fewer particles (considerably reducing the computational complexity). As a result, the overall SafetyNet system lends itself to real-time operation on today's drone hardware.

(4) Finally, SafetyNet is a complete system borne out of significant engineering effort, including carrier-phase outlier detection, integration of multi-satellite systems for robustness, automatic baseline calibration, etc.

Our evaluation platform is composed of a 3DR octocopter (8-rotor) mounted with 4 off-the-shelf NEO-M8T GPS receivers, all connected to a Raspberry Pi via USB. A GoPro camera is mounted in the underbelly of the drone, facing vertically downward – vision data serves as the ground truth for both IMU and GPS. We present extensive experimental data from 11 aggressive flight sessions, performed under a wide range of weather conditions. Critically, our results demonstrate accuracy comparable to the IMU even under the most aggressive aerial maneuvers within the capability of our drone.

Some Natural Questions
Is power consumption excessive with 4 GPS receivers? While GPS receivers are indeed considered power hungry on smartphones [18, 19], we note that the drone's physical flight requires 1000x more power, drawing on a separate LiPO battery [20] weighing 804 grams. Hence the marginal increase in power from GPS is negligible.

Do GPS receivers add to the cost of drones? SafetyNet is mostly needed for bigger drones carrying heavier payloads (costly cameras, delivery packages). The cost of such drones and their payloads easily justifies the additional GPS cost. Moreover, commercial drones already provide multiple GPS receivers for redundancy [21] – we believe the additional GPS cost should not be an issue.

What is the on-board IMU's orientation accuracy? How does it compare to SafetyNet? Results in this paper will demonstrate comparable accuracy distributions between SafetyNet and the on-board IMU, even at the 99th percentile. To be specific, pitch and roll are slightly worse with SafetyNet, but yaw is slightly better. This level of accuracy is adequate as a fallback mechanism – upon IMU failure, the drone could avoid aggressive maneuvers and fly back safely to its base.

What if GPS also fails? Crash reports typically indicate IMU failures [4]. This is because engine vibrations cause drift in IMU sensors, accumulating error over time [8, 11, 12]. Drone electronics might interfere with magnetometers [9, 10, 22]. However, none of these error sources affect GPS. While GPS could still fail due to deliberate jamming or extreme weather conditions, the possibility of both failing is naturally lower.

The rest of the paper expands on these techniques, experimental findings, and contributions. We begin with an abridged technical primer on GPS processing (mostly derived from Differential GPS and related techniques), followed by system design, error mitigation, and an end-to-end system evaluation.

2. GPS FOUNDATIONS
We present GPS foundations from first principles and end with discussions on modern techniques. As a result, this section is long. However, given that this paper builds over core GPS algorithms, the material is necessary. We also believe the material is easy to follow.

2.1 Global Navigation Satellite System (GNSS)
GNSS is the generic name given to satellite systems that provide localization services to receivers on Earth. The Global Positioning System (GPS) [23] is one example of a GNSS, developed by the US Government during the 1970-80s. GPS consists of a constellation of 31 satellites orbiting the Earth at a height of 20,000 km. The satellites simultaneously and continuously transmit unique pseudo-random noise (PRN) sequences using CDMA at 2 different frequencies – 1575.42 MHz (L1) and 1227.60 MHz (L2). They also broadcast ephemeris data from which the satellite's position and time of transmission can be calculated. A GPS receiver on the ground localizes itself by trilateration, i.e., measuring and combining the time-of-flight (ToF) of PRNs from different satellites. Velocity is computed from the Doppler shifts of the signals from each of the satellites.

GLObal NAvigation Satellite System (GLONASS) is another GNSS launched by Russia in the 1980s [24]. Similarly, GALILEO [25] is a European GNSS currently under development. Many GNSS receivers are capable of decoding signals from multiple GNSS systems, providing increased accuracy. This paper uses such receivers and exploits the advantages of satellite diversity.

2.2 GPS Localization and Error Sources
A GNSS receiver on the ground can compute its 3D location, time, and velocity. The key idea is to measure various attributes of the arriving signal (e.g., time of flight, phase, etc.), and then apply statistical algorithms to estimate the errors and ambiguities in the measurements. We describe below an overview of the techniques that underpin GPS; other GNSS systems rely on similar techniques.

Pseudorange
When a satellite transmits a signal, it includes the starting time of the transmission (obtained from its atomic clock). The ground receiver records the time of reception using its less accurate local clock. The time-of-flight (ToF) is the difference between these timestamps. When multiplied by the speed of light, the result gives the rough distance to the satellite, called the pseudorange.

Pseudorange = ToF × (speed of light)

Of course, the measured ToF is inaccurate because the clock of a typical GPS receiver is not synchronized to the GPS satellites. The resulting error can be up to 300 km. In addition, when the GPS signal enters the Earth's atmosphere, it can get delayed due to refractions in the Ionosphere and Troposphere. The signal also passes through a multipath channel, adding more error. Assuming the true range between a satellite s and receiver i is ρ_i^s, the measured pseudorange P_i^s, inclusive of all error sources, can be modeled as

P_i^s = ρ_i^s + c·t_i − c·t^s + A + M_i^s + ε_i^s    (1)

Here, t_i and t^s are the receiver and satellite clock biases, respectively, with respect to true time. A represents the range error due to refractions in the atmosphere, M_i^s denotes multipath, ε_i^s is the receiver's hardware noise, and c is the speed of light. In today's systems, the satellite clock error t^s is small¹, and c·t_i proves to be the major source of error. Hence, c·t_i is modeled as an unknown, and ρ_i^s is written as a function of the unknown 3D receiver location ρ_i and the known 3D satellite position ρ^s. This results in a total of 4 unknowns. Once we have a lock on 4 satellites, the time bias and the 3D location can be jointly estimated, resulting in a position fix. The ignored error sources, i.e., t^s, A, M, and ε, can contribute an error of 1-4 meters, depending upon the environmental conditions.

¹ Satellites estimate these errors themselves from mutually exchanged signals as well as from ground sources.
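To make this 4-unknown estimation concrete, the sketch below solves the position fix by iterative (Gauss-Newton) least squares. This is not the paper's code; the function names and the synthetic satellite geometry in the usage example are our own illustration under the simplified model of Equation 1 (atmosphere and multipath ignored).

```python
import numpy as np

C = 299_792_458.0  # speed of light (m/s)

def position_fix(sat_pos, pseudoranges, iters=10):
    """Jointly estimate receiver 3D position and clock bias from >= 4 pseudoranges.

    sat_pos:      (N, 3) known satellite positions in meters
    pseudoranges: (N,) measured pseudoranges in meters (Eq. 1 without A, M, eps)
    """
    x = np.zeros(4)                                            # [x, y, z, c*t_i]
    for _ in range(iters):
        rng = np.linalg.norm(sat_pos - x[:3], axis=1)          # geometric range rho_i^s
        pred = rng + x[3]                                      # predicted pseudorange
        H = np.hstack([(x[:3] - sat_pos) / rng[:, None],       # Jacobian wrt position
                       np.ones((len(rng), 1))])                # Jacobian wrt c*t_i
        dx, *_ = np.linalg.lstsq(H, pseudoranges - pred, rcond=None)
        x += dx
    return x[:3], x[3] / C                                     # position (m), clock bias (s)

if __name__ == "__main__":
    gen = np.random.default_rng(0)
    sats = gen.normal(size=(6, 3))
    sats = 2.0e7 * sats / np.linalg.norm(sats, axis=1, keepdims=True)   # ~20,000 km away
    true_pos, true_bias = np.array([100.0, -50.0, 30.0]), 1e-3
    pr = np.linalg.norm(sats - true_pos, axis=1) + C * true_bias
    print(position_fix(sats, pr))
```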

GPS receivers on phones and car dashboards use the above techniques. However, higher accuracy applications such as 3D orientation tracking require better performance. To this end, modern GPS research has leveraged the phase of the arriving signals, as detailed next.

Carrier Phase
Once a satellite lock is acquired, the phase of the arriving signal, φ_i^s, is constantly tracked by a phase lock loop (PLL). The true range between the satellite and the receiver, ρ_i^s, can be expressed as a multiple of the wavelength λ:

ρ_i^s = λN_i^s + λφ_i^s    (2)

N_i^s is an unknown integer, meaning that the PLL measurement of φ_i^s only captures the fractional part of the range. Accounting for atmospheric effects, multipath, and clock issues, the above equation becomes:

λφ_i^s = ρ_i^s + c·t_i − c·t^s + A + M_i + ε_i − λN_i^s    (3)

Estimating N_i^s is non-trivial and several algorithms have been proposed [26, 27]. We discuss this further in Sec. 5.1.

One advantage of carrier phase is that its changes over time can be tracked reliably by utilizing the Doppler shift in the signal [28]. Hence, φ_i^s(t2) takes the same mathematical form as Equation 3. Thus, if the initial value of N_i^s can somehow be estimated, the tracking thereafter can be good. We now explain how today's systems, such as Differential GPS (DGPS), cope with this integer ambiguity N_i^s and other error sources.

2.3 Computing Differentials
Environmental error sources in Equation 3 are correlated over short time periods and within small geographical areas (200 km). Thus, two GPS measurements across time can be subtracted (or differenced) to eliminate some of these factors. Similarly, simultaneous measurements from multiple GPS receivers can also be differenced. Differential GPS (DGPS) [29] performs such operations on pseudoranges, while Real-Time Kinematic (RTK) [30] applies differentials to carrier phase. For our purpose of precise orientation tracking, the latter is more relevant. To this end, we outline 4 kinds of carrier phase differentials.

(1) Single Differentials across Receivers (SD_ij)
Consider the carrier phase equations for two GPS receivers i and j from the same satellite s (let us ignore multipath and noise):

λφ_i^s = ρ_i^s + c·t_i − c·t^s + A^s − λN_i^s    (4)
λφ_j^s = ρ_j^s + c·t_j − c·t^s + A^s − λN_j^s    (5)

Differencing the above two equations yields the relative position between i and j with fewer error terms. The correlated error sources of atmospheric delay and satellite clock bias disappear:

λΔφ_ij^s = Δρ_ij^s + c·Δt_ij − λΔN_ij^s    (6)

Figure 1 illustrates the scenario. Assuming that the satellite is far away, Δρ_ij^s can be approximated as ρ_ij·cos θ, where ρ_ij is the true relative position between i and j (called the baseline vector). Rewriting ρ_ij·cos θ as the vectorial projection of ρ_ij onto the line-of-sight unit vector l̂^s of satellite s, we have:

λΔφ_ij^s = ρ_ij · l̂^s + c·Δt_ij − λΔN_ij^s    (7)


While some errors have disappeared², a function of the clock bias errors, c·Δt_ij, still remains, motivating the need for double differentials.

² We are aware that differentials can amplify noise and multipath; however, since carrier phase noise is in the granularity of a few mm [31], we ignore noise in the rest of the paper. We also assume multipath is not excessive, e.g., the drone is not flying low in Manhattan-like areas.

Figure 1: ρ_i^s − ρ_j^s = Δρ_ij^s = ρ_ij · l̂^s (satellite s at angle θ from the baseline ρ_ij between receivers i and j).

(2) Double Differentials across Receivers and Satellites (DD_ij^sk)
Given multiple satellites in range, the GPS ground receivers i and j can perform the same measurements with satellite k. Equation 7 can then be rewritten as:

λΔφ_ij^k = ρ_ij · l̂^k + c·Δt_ij − λΔN_ij^k    (8)

Subtracting the single differential equations 7 and 8, we have a double differential (DD) as follows:

λ∇Δφ_ij^sk = ρ_ij · (l̂^s − l̂^k) − λ∇ΔN_ij^sk    (9)

The double differential (DD) eliminates the clock biases and the residue contains only the integer ambiguity terms. We will discuss the resolution of integer ambiguity in Section 5.1, but assuming that the ambiguities are magically fixed, this provides us a reasonably precise estimate of the relative positions (called baselines) between receiver pairs. Ignored factors, including multipath, noise, and antenna phase center errors, add up to a few centimeters of error. Thus, if one receiver's absolute position is known at millimeter granularity, the absolute position of the other receiver can be estimated precisely as well. Real Time Kinematics (RTK) technology operates exactly as above – it uses the accurately known location of a reference receiver to calculate the location of the other. Figure 2 shows the relative distance between two receivers placed roughly 45 cm apart. Differencing techniques convincingly outperform naïve subtraction of 3D GPS positions.
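For concreteness, the following sketch solves Equation 9 for the baseline vector by least squares, assuming the integer ambiguities ∇ΔN_ij^sk have already been resolved. It is our own illustration (the names and the SNR-agnostic equal weighting are assumptions), not the paper's implementation.

```python
import numpy as np

L1_WAVELENGTH = 299_792_458.0 / 1575.42e6   # ~0.19 m

def baseline_from_dd(dd_phase_cycles, dd_ambiguity, los_s, los_k, wavelength=L1_WAVELENGTH):
    """Least-squares baseline rho_ij from double differentials (Eq. 9).

    dd_phase_cycles: (M,) double-differenced carrier phases, in cycles
    dd_ambiguity:    (M,) resolved integer ambiguities, sign convention of Eq. 9
    los_s, los_k:    (M, 3) line-of-sight unit vectors to satellites s and k
    """
    A = np.asarray(los_s) - np.asarray(los_k)             # rows: (l_s - l_k)
    b = wavelength * (np.asarray(dd_phase_cycles) + np.asarray(dd_ambiguity))
    rho_ij, *_ = np.linalg.lstsq(A, b, rcond=None)        # solve rho_ij . (l_s - l_k) = b
    return rho_ij                                         # baseline vector in Earth frame (m)
```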

(3) Single Differentials across Time (SD^t12)
Similar to differentials across receivers and satellites, we can also perform differentials across time for the same receiver. This can eliminate the integer ambiguity as follows:

λφ_i^s(t1) = ρ_i^s(t1) + c·t_i(t1) − c·t^s(t1) + A^s − λN_i^s(t1)    (10)
λφ_i^s(t2) = ρ_i^s(t2) + c·t_i(t2) − c·t^s(t2) + A^s − λN_i^s(t2)    (11)

Figure 2: Relative positioning using carrier phase and double differentials is an order of magnitude more accurate than naïve location differencing. Here, a 45 cm baseline is correct within a few cm. (Axes: Time (s) vs. Distance (cm); traces: Carrier Phase, Location Difference, 45 cm reference.)

When no cycle slips occur (i.e., N_i^s(t1) = N_i^s(t2)), and given that the satellite clock bias is well known and hardly changes over the small time interval [t1, t2], we have

λΔφ_i^s(t12) = Δρ_i^s(t12) + c·t_i(t12)    (12)

Since the distance to the satellite is very large compared to the displacement of the receiver during [t1, t2], the satellite location can be assumed fixed for this interval, incurring negligible error with this approximation. The motion of the receiver is then represented in Figure 3. As described earlier, Δρ_i^s(t12) can now be expressed as the projection of the relative motion vector ρ_i(t12) onto the line-of-sight unit vector l̂^s of the satellite. Thus, Equation 12 becomes:

λΔφ_i^s(t12) = ρ_i(t12) · l̂^s + c·t_i(t12)    (13)

Figure 3: ρ_i^s(t1) − ρ_i^s(t2) = Δρ_i^s(t12) = ρ_i(t12) · l̂^s (the receiver moves from t1 to t2 while satellite s is effectively static, at angle θ from the motion vector).

Observe that ρ_i(t12) (the relative displacement of the receiver during [t1, t2]) and the receiver's clock bias t_i(t12) add up to 4 unknowns. Measurements across 4 satellites can thus jointly estimate the relative displacement and clock bias. This estimated relative displacement is quite accurate since the integer ambiguity N_i^s does not pollute it. We verified this with a static receiver placed on the ground; the estimation resulted in about 1 cm/s of apparent motion, far more accurate than velocity estimates from Doppler shifts.
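A minimal sketch of this 4-unknown estimation, under our own naming and assuming the per-satellite phase changes have already been corrected for the known satellite clock drift:

```python
import numpy as np

def displacement_from_time_diff(dphi_cycles, los, wavelength=0.1903):
    """Receiver displacement over [t1, t2] from time-differenced carrier phase (Eq. 13).

    dphi_cycles: (N,) per-satellite phase change over [t1, t2] in cycles, N >= 4
    los:         (N, 3) line-of-sight unit vectors (treated as constant over [t1, t2])
    Returns (displacement vector in m, receiver clock term c*t_i(t12) in m).
    """
    dphi = np.asarray(dphi_cycles, dtype=float)
    A = np.hstack([np.asarray(los, dtype=float), np.ones((len(dphi), 1))])   # [l_s | 1]
    sol, *_ = np.linalg.lstsq(A, wavelength * dphi, rcond=None)
    return sol[:3], sol[3]
```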


(4) Double Differentials across Receivers and Time (DD_ij^t12)
Our final double differential combines receivers and time. We compute the single differential between receivers i and j from Equation 7, but write it for two consecutive time points t1 and t2:

λΔφ_ij^s(t1) = ρ_ij(t1) · l̂^s + c·Δt_ij(t1) + λΔN_ij^s(t1)    (14)
λΔφ_ij^s(t2) = ρ_ij(t2) · l̂^s + c·Δt_ij(t2) + λΔN_ij^s(t2)    (15)

Assuming no cycle slips (we will relax this assumption later), ΔN_ij^s(t2) = ΔN_ij^s(t1); hence subtracting the above two equations eliminates the integer ambiguity:

λ∇Δφ_ij^s(t12) = (ρ_ij(t1) − ρ_ij(t2)) · l̂^s + c·Δt_ij(t12)    (16)

One may physically interpret this equation as the subtraction of two vectors, where the first vector is a drone baseline at time t1 and the second vector is the same baseline at t2. In other words, this captures the relative motion of the drone across time.

2.4 The Bigger Picture
We take away 2 key points from the discussion on differentials:

• The double differential across receivers and satellites (DD_ij^sk) yields the drone's baseline vectors at any given time point (Equation 9). However, this estimate is still polluted by integer ambiguity.

• The double differential across receivers and time (DD_ij^t12) yields the drone's relative change in the baseline vectors during flight (Equation 16). Importantly, this relative estimate is free of integer ambiguities.

Thus, we now have two separate estimates of the drone's 3D baseline vectors, each with different error properties. SafetyNet recognizes the opportunity of combining these two noisy estimates to precisely track the drone baselines, ultimately tracking orientation.

3. SYSTEM MODEL
SafetyNet will model orientation tracking as a state estimation problem, where the state is defined as the 3D orientation. We formally define "3D orientation" first and then design the model.

Figure 4 shows 4 GPS receivers placed on a drone – their locations are denoted ρ1, ρ2, ρ3 and ρ4. The baseline vectors joining one of the receivers (say ρ4) to the others can be defined as ρ_ij = ρ_i − ρ_j. When the baselines are aligned with the North-East reference axes, the baseline matrix B_o can be written as:

B_o = [ρ_41  ρ_42  ρ_43]    (17)

Assuming that the magnitudes of the baseline vectors are d1 and d2, we can expand B_o as:

B_o = [ d1   0    d1
        0    d2   d2
        0    0    0  ]    (18)

Figure 4: The drone baseline vectors ρ_41 and ρ_42 aligned with Earth's reference frame (schematic: GPS1-GPS4 at the arm tips, separations d1 and d2, the drone's heading direction, North-East axes, and the pitch/roll/yaw conventions).

Figure 4 also shows the rotation conventions of the 3 Euler angles – pitch, roll, and yaw. Applying these rotations to B_o yields the new baseline matrix B. For all our results, we will express rotations in terms of Euler angles (i.e., degrees), which are intuitive to understand. However, for mathematical efficiency, we will use quaternions, an alternative representation to Euler angles. Briefly, the baselines at an arbitrary orientation, described by quaternion q, can be expressed in terms of the initial orientation q_o (aligned with the reference axes) as below:

ρ_ij(q) = A(q)′ ρ_ij(q_o)    (19)

Here A(q) is the rotation matrix associated with the orientation quaternion q. Similarly, extending the effect of rotations to the entire baseline matrix, we have:

B(q) = A(q)′ B_o    (20)

Effectively, estimating orientation amounts to estimating the rotation matrix A(q) using projected measurements of B(q) along various satellite directions. Details on conversions between quaternions, Euler angles, and rotation matrices can be found in [32]. Regardless of these mathematical conversions, the core conceptual question remains: estimating the matrix B at any given time.
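As a small illustration of Equations 18-20, the sketch below builds B_o and rotates it by a quaternion. This is our own example; the Euler-angle convention and the direction in which A(q) maps frames are assumptions that depend on the chosen conventions, not something the paper specifies here.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def baseline_matrix(d1, d2):
    """B_o of Eq. 18: columns are rho_41, rho_42, rho_43 in the reference (North-East) frame."""
    return np.array([[d1, 0.0, d1],
                     [0.0, d2, d2],
                     [0.0, 0.0, 0.0]])

def rotate_baselines(quat_xyzw, B0):
    """B(q) = A(q)' B_o  (Eq. 20), with A(q) the rotation matrix of quaternion q."""
    A = R.from_quat(quat_xyzw).as_matrix()
    return A.T @ B0

if __name__ == "__main__":
    B0 = baseline_matrix(d1=0.6, d2=0.4)                    # illustrative arm separations (m)
    q = R.from_euler("ZYX", [30, 5, -3], degrees=True)      # yaw, pitch, roll (one convention)
    print(rotate_baselines(q.as_quat(), B0))                # rotated baseline vectors (columns)
    print(q.as_euler("ZYX", degrees=True))                  # back to Euler angles
```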

4. SYSTEM DESIGN: PHASE 1
We adopt a Bayesian filtering approach for tracking drone orientation. Figure 5 shows the model: (1) A state transition function, derived from the incremental changes in orientation (DD_ij^t12), models the next state of the drone. Recall that these changes are affected by hardware noise and multipath errors. (2) A measurement function (DD_ij^sk) reflects the absolute orientation of the drone at any given time. Of course, this measurement is polluted by integer ambiguity. We adopt a Kalman Filter to combine the uncertainties from the transition and measurement functions, and track the most likely state of the system through time. We describe this basic design first. Then we focus on resolving the error sources (such as integer ambiguity, cycle slips, and missing data), and redesign the framework to accommodate these optimizations. Our final design is an "adjusted" particle filter algorithm that tracks drone orientation with consistent accuracy.

4.1 State Transition Model

Figure 5: Bayesian filtering approach to tracking the orientation state [Pitch, Roll, Yaw] over time – the relative differential DD_ij^t12 drives the state transition, while the absolute differential DD_ij^sk serves as the measurement.

The relative baseline changes over time are directly obtained from Equation 16, copied here for convenience:

λ∇Δφ_ij^s(t12) = (ρ_ij(t1) − ρ_ij(t2)) · l̂^s + c·Δt_ij(t12)    (21)

Omitting details, we rewrite this with quaternions:

λ∇Δφ_ij^s(t12) = ρ_ij(q_o) · ⌊A(q1)·l̂^s ×⌋ δθ + c·Δt_ij(t12)    (22)

where q1 is the orientation quaternion at time t1, δθ is the rotation vector [33] associated with quaternion δq, and ⌊ ×⌋ is the vector cross-product operator [34]. We now solve Equation 22 for various satellites s and GPS receiver pairs ij using least-squares estimation. The result yields an estimate of the rotation vector δθ (hence δq) between the two time points t1 and t2. We can thus estimate the new orientation quaternion q2 as:

q2 = δq ⊗ q1    (23)

Here, ⊗ is the quaternion multiplication operator. When translated back to Euler angles, the result is the relative orientation change – pitch, yaw, and roll – from one state to the next. As mentioned earlier, this estimate is polluted by hardware noise and multipath.
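A sketch of this least-squares step and the quaternion update follows. It assumes the time-differenced single differentials and line-of-sight vectors are already formed, treats the per-pair clock term c·Δt_ij(t12) as an extra unknown, and uses our own names; it is illustrative, not the paper's implementation.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def skew(v):
    """Cross-product matrix: the  [v x]  operator used in Eq. 22."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def transition_update(q1_xyzw, baselines_q0, los, dphi, wavelength=0.1903):
    """Solve Eq. 22 for delta_theta over all (pair, satellite) rows, then apply Eq. 23.

    q1_xyzw:      current orientation quaternion (x, y, z, w)
    baselines_q0: dict {pair_name: rho_ij(q_o) as (3,) array}
    los:          (S, 3) line-of-sight unit vectors, one per satellite
    dphi:         dict {pair_name: (S,) time-differenced single differentials, cycles}
    """
    A1 = R.from_quat(q1_xyzw).as_matrix()
    pairs = list(baselines_q0)
    rows, rhs = [], []
    for p_idx, pair in enumerate(pairs):
        for s in range(los.shape[0]):
            row = np.zeros(3 + len(pairs))
            row[:3] = baselines_q0[pair] @ skew(A1 @ los[s])   # rho_ij(q_o) . [A(q1) l_s x]
            row[3 + p_idx] = 1.0                               # unknown c * delta_t_ij(t12)
            rows.append(row)
            rhs.append(wavelength * dphi[pair][s])
    sol, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    delta_theta = sol[:3]                                      # rotation vector between t1, t2
    q2 = R.from_rotvec(delta_theta) * R.from_quat(q1_xyzw)     # Eq. 23: q2 = delta_q (x) q1
    return q2.as_quat(), delta_theta
```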

4.2 Absolute Orientation Measurement
Equation 9 showed that double differentials across receivers and satellites (DD_ij^sk) are estimates of the absolute baseline vectors of the drone. While they cancel out clock bias errors, they leave behind the integer ambiguities:

λ∇Δφ_ij^sk = ρ_ij · (l̂^s − l̂^k) + λ∇ΔN_ij^sk    (24)

Translating to quaternions again, using the same conventions as described earlier, we have:

λ∇Δφ_ij^sk − ρ_ij(q_o) · A(q_n) · (l̂^s − l̂^k) = ρ_ij(q_o) · ⌊A(q_n)·(l̂^s − l̂^k) ×⌋ δθ + λ∇ΔN_ij^sk    (25)

We develop techniques for resolving integer ambiguities in Section 5.1. For now, let us assume they are resolved – then we are left with a set of linear equations over different satellite pairs sk and baselines ij. By solving them using standard Least Squares Estimation (LSE), the rotation vector δθ and associated quaternion δq can be obtained. Hence the orientation q is:

q = δq ⊗ q_n    (26)

Here q_n is an initial orientation estimate used to linearize Equation 25 (it usually comes from the transition model). Note that this estimate q is absolute in the Earth's reference frame, since the satellite positions (for both s and k) are known in that reference frame.

4.3 Applying Extended Kalman Filter (EKF)
The Kalman filter is an estimation algorithm that systematically combines erroneous observations of a system state. In our case, since the transition function is intrinsically non-linear (because of the conversion from rotation matrices to quaternions), we make linearized approximations using an Extended Kalman Filter (EKF) [35]. We briefly sketch the methodology here.

Transition: The state transition function in Equation 22 can be rewritten in a generic non-linear model as:

q̂_{k+1} = f(q_k) + w_k    (27)

Here, k and k+1 are two successive time points, f is of the (non-linear) form:

f = δq ⊗ q_k    (28)

and w_k is the estimated process noise derived from the GPS signal SNR. δq is the quaternion of δθ in Equation 22. Now, linearizing Equation 27 around the estimated rotation vector associated with quaternion q_k, we have:

δθ_{k+1} = F·δθ_k + w_k    (29)

where F is given by ∂f/∂θ evaluated at θ̂_{k+1}.

Measurement: As with the transition function above, the measurement function is Equation 25, written in the following form:

y − y_o = H·δθ_{k+1} + v_t    (30)

where the index k is used for both the satellite and the sample, and v_t is the measurement noise derived from the GPS SNR.

y = λ∇Δφ_ij^sk − λ∇ΔN_ij^sk    (31)
y_o = ρ_ij(q_o) · A(q̂_{k+1}) · (l̂^s − l̂^k)    (32)
H = ρ_ij(q_o) · ⌊A(q̂_{k+1}) · (l̂^s − l̂^k) ×⌋    (33)

Equations 29 and 30 can now be combined using an Extended Kalman Filter for a refined estimate of δθ_{k+1}. Its associated quaternion δq_{k+1} can then be used to estimate the most likely orientation quaternion q_{k+1}:

q_{k+1} = δq_{k+1} ⊗ q̂_{k+1}    (34)

While the noise terms of the Transition and Measurement models are not completely independent, we introduce Particle Filters later (Section 5.3) to lift the Gaussian assumption of Kalman Filters.
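For illustration, a generic error-state update consistent with Equations 29-34 is sketched below. The near-identity transition Jacobian and all names are our own simplifying assumptions; in SafetyNet the noise covariances would be derived from GPS SNR.

```python
import numpy as np

def ekf_orientation_update(P_prev, Q, H, R_meas, innovation):
    """One EKF cycle on the 3-dim rotation-vector error state (Eqs. 29-30).

    P_prev:     (3, 3) covariance of delta_theta from the previous step
    Q:          (3, 3) process noise covariance (transition model, Eq. 29)
    H:          (M, 3) stacked measurement matrix of Eq. 33 (rows over ij, sk)
    R_meas:     (M, M) measurement noise covariance
    innovation: (M,)  y - y_o from Eqs. 31-32
    Returns the refined delta_theta and its posterior covariance.
    """
    P_pred = P_prev + Q                               # predict, with F approximated as identity
    S = H @ P_pred @ H.T + R_meas                     # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)               # Kalman gain
    delta_theta = K @ innovation                      # refined rotation correction
    P_post = (np.eye(3) - K @ H) @ P_pred
    return delta_theta, P_post
```

The resulting δθ would then be mapped to δq and composed with q̂_{k+1} as in Equation 34.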

5. RESOLVING AMBIGUITIES: PHASE 2
The above EKF has been designed under the convenient assumption that integer ambiguity and cycle slips have been resolved. However, this resolution is challenging, as evident from the numerous papers written on this topic [36–39]. Since these error sources seriously affect drone orientation, we comprehensively address them next.


5.1 Resolving Integer Ambiguity
Recall from Equation 2 that the Phase Lock Loop (PLL) only measures the fractional part of the range, leaving an unknown of λN. Once the drone is flying, additional multiples of the wavelength can also accumulate, called cycle slips. We first describe ways to mitigate the initial λN and discuss cycle slips thereafter.

Note that we do not need to estimate N itself, since we are not computing the actual range (or location) of the GPS receiver. Since we only care about orientation, we have been operating in the space of differentials. Thus, when we compute N_ij^s = N_i^s − N_j^s, followed by N_ij^sk = N_ij^s − N_ij^k, we recognize that the N_ij^sk are also integers. Our goal is to estimate these N_ij^sk integers, for all combinations of ij and sk.

Our core intuition is to compute a cosine function on the value of 2πN_ij^sk; since N_ij^sk should be an integer, cos(2πN_ij^sk) should equal 1. If this cosine is summed over all the tuples <ij, sk>, we have Σ_{ij,sk} cos(2πN_ij^sk). If there are ψ possible tuples of <ij, sk>, then the summation should add up to ψ. This summation can be rewritten as a function of the orientation q as follows:

cos(2πN_ij^sk) = cos( 2π · [λφ_ij^sk − ρ_ij(q_o) · A(q) · (l̂^s − l̂^k)] / λ )    (35)

Here N_ij^sk is derived from a combination of Equations 9 and 19. Then,

M(q) = Σ_{ij,sk} cos(2πN_ij^sk)    (36)

In the ideal case, the correct q should make the RHS of Equation 36 equal ψ. However, given that the phase φ_ij^sk in Equation 35 has errors, for a given q the corresponding cosine terms are not all exactly 1, and when they are plugged into Equation 36, the value falls short of ψ. Hence, we search across all values of q at a granularity of 5 degrees and pick a q_coarse such that

M(q_coarse) = max_q M(q)    (37)

5 degrees is not an arbitrary design choice. Rather, it is a function of the GPS signal wavelength and the diagonal of the drone, together ensuring that the estimated values of N_ij^sk do not overshoot or undershoot an integer by more than 0.2 with high probability. So long as the offset is less than 0.5, the actual N_ij^sk values can be estimated by rounding. The equation below shows our final estimation of the integer ambiguity from q_coarse:

N̂_ij^sk = ⌊ [λφ_ij^sk − ρ_ij(q_o) · A(q_coarse) · (l̂^s − l̂^k)] / λ + 0.5 ⌋    (38)
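The coarse search and rounding can be sketched as below. This is our own illustration: the Euler-angle grid, search ranges, and data layout are assumptions, and a real implementation would restrict and weight the search (e.g., around the transition-model estimate) rather than scan blindly.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def float_ambiguities(euler_deg, phase_cycles, baselines_q0, dlos, wavelength=0.1903):
    """[lambda*phi - rho_ij(q_o).A(q).(l_s - l_k)] / lambda for every <ij, sk> tuple."""
    A = R.from_euler("ZYX", euler_deg, degrees=True).as_matrix()
    geom = np.einsum("md,md->m", baselines_q0, dlos @ A.T)     # rho_ij(q_o) . A(q) (l_s - l_k)
    return (wavelength * np.asarray(phase_cycles) - geom) / wavelength

def resolve_ambiguity(phase_cycles, baselines_q0, dlos, step_deg=5.0):
    """Coarse orientation search maximizing M(q) of Eq. 36, then rounding per Eq. 38.

    phase_cycles: (M,) double-differenced phases, one per <ij, sk> tuple (cycles)
    baselines_q0: (M, 3) rho_ij(q_o) for the receiver pair used in each tuple
    dlos:         (M, 3) (l_s - l_k) for each tuple
    """
    yaws = np.arange(-180.0, 180.0, step_deg)
    tilts = np.arange(-90.0, 90.0 + step_deg, step_deg)
    best_q, best_score = (0.0, 0.0, 0.0), -np.inf
    for yaw in yaws:
        for pitch in tilts:
            for roll in tilts:
                n = float_ambiguities((yaw, pitch, roll), phase_cycles, baselines_q0, dlos)
                score = np.sum(np.cos(2 * np.pi * n))          # M(q), Eq. 36
                if score > best_score:
                    best_q, best_score = (yaw, pitch, roll), score
    n_hat = np.floor(float_ambiguities(best_q, phase_cycles, baselines_q0, dlos) + 0.5)
    return best_q, n_hat.astype(int)                           # q_coarse and N_hat (Eq. 38)
```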

Our results show that when the drone is static and ready to take off, the estimates of N_ij^sk are correct (and the cosine metric is close to 1). However, once the drone starts flying, and especially after sharp maneuvers, integer ambiguity builds up again. This is because poor SNR, exacerbated by sudden phase changes in the BPSK GPS signal, causes the PLL to lose lock. The result is both full and half cycle slips³. Now, even if q_coarse correctly estimates a large fraction of the N_ij^sk, the small erroneous fraction influences subsequent estimation and the orientation diverges over time. Moreover, half cycles also increase the search space by 8x since the search is 3-dimensional (yaw, pitch, roll). Thus, the coarse-grained attitude q_coarse can no longer be used for consistent tracking.

³ Half cycle slips occur because some PLLs square the signal to remove the BPSK messages; squaring makes the correlator match the original signal twice within a signal period, indicating a phase change. However, this squaring effectively halves the wavelength, meaning that N_ij^sk is now a multiple of 0.5.

Figure 6 compares the cycle slip percentage for static and flying drones – we use the view from the drone camera as the ground truth [40] (described later in Section 6). Evidently, cycle slips occur much more frequently (almost once every second) when the drone is flying. This means that even our assumption in Equation 12 – that N_i^s(t2) = N_i^s(t1) – is not necessarily true, and can drastically pollute estimates. To this end, SafetyNet approximates q_coarse for Equation 38 from the estimate of the State Transition Model in Section 4.1. While this is not perfect, the outlier detection and Particle Filter techniques discussed next help absorb most errors.

Figure 6: Cycle slips occur more often during flight (cycle slips (%) vs. time (s), for a static drone and a flying drone, with take-off marked).

5.2 Cycle Slip Detection
Needless to say, our cycle slip detection scheme has to be quick (to be able to run every few seconds). Towards this goal, we form temporal differential measurements from multiple satellites as shown in Equation 13. After compensating for the known satellite biases t^s(t12), we are left with four unknowns – the 3D change in position ρ_i(t12) and the clock bias difference t_i(t12). Solving these using Robust Least Squares (RLS) [41] will typically isolate the outliers, with error residuals for the inliers being less than 2 cm. Our transition model (Section 4.1) incorporates corrections from this module. While there is a possibility that RLS may not converge, we use GLONASS satellites to boost the redundancy.

Unlike GPS satellites, GLONASS uses FDMA, i.e., satellites transmit on different frequencies. Since the wavelengths are now λ^s and λ^k for satellites s and k, the integer ambiguity terms do not group themselves into a single integer in the DD_ij^sk double differentials. Hence, we use GLONASS only for DD_ij^t12, since the signals from the same satellite cancel across time t12. However, when even GLONASS+GPS fails (perhaps because of too few satellites under poor weather), we resort to Particle Filters (Section 5.3).
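To make the Section 5.2 procedure concrete, here is a sketch of the outlier-isolation step using an iteratively reweighted least-squares stand-in for RLS (the weighting scheme, threshold handling, and names are our assumptions):

```python
import numpy as np

def detect_cycle_slips(dphi_cycles, los, wavelength=0.1903, thresh_m=0.02, iters=5):
    """Flag per-satellite cycle slips from time-differenced phases (Eq. 13).

    Solves for [3D displacement, receiver clock term] with iteratively
    reweighted least squares, then marks satellites whose residuals exceed
    thresh_m (the ~2 cm inlier bound mentioned in Sec. 5.2).
    """
    dphi = np.asarray(dphi_cycles, dtype=float)
    A = np.hstack([np.asarray(los, dtype=float), np.ones((len(dphi), 1))])
    b = wavelength * dphi
    w = np.ones(len(b))
    for _ in range(iters):
        Wsqrt = np.sqrt(w)[:, None]
        sol, *_ = np.linalg.lstsq(Wsqrt * A, Wsqrt[:, 0] * b, rcond=None)
        resid = b - A @ sol
        w = np.where(np.abs(resid) < thresh_m, 1.0,
                     thresh_m / (np.abs(resid) + 1e-12))   # down-weight suspected slips
    slipped = np.abs(resid) > thresh_m
    return slipped, sol[:3], resid                          # flags, displacement (m), residuals
```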

5.3 Unified Particle Filter Framework


The net result of all the above techniques can be summarized as follows: the orientation q, and hence the values of N_ij^sk, are mostly correct, and the Kalman filter can track them well. However, occasionally q is incorrect, and this error accumulates over time, causing complete divergence. Figure 7 illustrates an example of the drift with Kalman Filters (KF). We bring in Particle Filters to better handle these cases, since multiple particles (i.e., orientation states) can be propagated through time, while relaxing the linearity and Gaussian assumptions required for a KF. Thus, SafetyNet adopts a hybrid approach, i.e., the system uses Kalman Filters in general but invokes a particle filter when:

• Orientation estimates have poor confidence based on the summed cosine metric in Equation 36.

• Missing data in one or more of the receivers derails the state transition function.

Figure 7: KF-based orientation drifts away due to poor SNR or missing data, while APF stays close to the vision ground truth (roll (deg) vs. time (s); traces: Vision, KF, APF).

Figure 8 summarizes the flow of events. When the confidence in q (based on the cosine metric) is high, SafetyNet continues to run the Kalman filter. When it is low, we select K values of the orientation q – the values whose corresponding cosine metric ranks in the top-K. We initialize K particles at these orientation states and update each particle based on the transition and measurement functions. These particles are essentially "trackers" of different integer ambiguities, and we propagate them until the cosine metric for one of them becomes high. Now, if particles are too close to each other – within 3° – we merge them into one. If a particle exhibits extremely low weight, we remove it. If merging or removal leaves a single particle, SafetyNet switches back to Kalman filtering. Otherwise, the confidence function is computed again; if it is low, more particles are added and resampled – resampling is performed such that the resulting particles are an equal mix of the (highest weighted) current and new particles. If the confidence is high, adding and resampling is not necessary. The process repeats until the system converges to a single particle.

Figure 8: Flow chart of the unified particle filter.
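The control flow of Figure 8 can be sketched as follows. This is a structural illustration only: the callables stand in for SafetyNet's transition/measurement updates and cosine-metric confidence, the thresholds are placeholders, and the equal-mix resampling step is omitted.

```python
import numpy as np

def unified_filter_step(particles, weights, confidence, top_k_candidates,
                        propagate, conf_threshold=0.9, merge_deg=3.0, min_weight=1e-3):
    """One step of the hybrid KF / particle-filter control flow (Fig. 8), sketched.

    particles:        list of orientation states (e.g., Euler triples, degrees)
    weights:          list of particle weights (same length)
    confidence:       callable state -> normalized cosine-metric confidence (M(q)/psi)
    top_k_candidates: callable () -> list of top-K orientations by cosine metric
    propagate:        callable state -> state, applying transition + measurement update
    """
    if len(particles) == 1 and confidence(particles[0]) >= conf_threshold:
        return [propagate(particles[0])], [1.0]               # plain Kalman-filter mode

    if len(particles) == 1:                                   # confidence dropped: spawn trackers
        particles = [particles[0]] + list(top_k_candidates())
        weights = [1.0] * len(particles)

    particles = [propagate(p) for p in particles]             # update every ambiguity tracker
    weights = [w * confidence(p) for p, w in zip(particles, weights)]

    merged, merged_w = [], []
    for p, w in sorted(zip(particles, weights), key=lambda t: -t[1]):
        if w < min_weight:
            continue                                          # prune negligible particles
        if any(np.max(np.abs(np.subtract(p, m))) < merge_deg for m in merged):
            continue                                          # merge near-duplicates (< 3 deg)
        merged.append(p)
        merged_w.append(w)
    total = sum(merged_w) or 1.0
    return merged, [w / total for w in merged_w]              # back to KF once one remains
```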

5.4 Adjusted Particle Filter (APF) Design
We propose an improvement to the particle filter design. The opportunity arises from the observation that in a conventional particle filter, a particle's weight is typically proportional to its likelihood (i.e., its proximity to the true state). Thus, when the system resamples, the concentration of particles is proportionally greater around the higher-weight particles. Over time, the hope is that one of the particles converges to the correct state. However, we ask: why not move the particle to a state that maximizes the likelihood? In fact, given that the error variances of the transition and measurement functions are known, the best estimate can be computed as a combination of the two. We could move each particle to the "best" state in its neighborhood, and then perform the resampling step.

Figure 9 illustrates the idea for a case of three particles. The middle particle is propagated to time t1 using the transition model DD_ij^t12. Similarly, the measurement function DD_ij^sk produces an estimate of the new state. The transitioned and measured states, along with their error distributions, are depicted. The two estimates are combined using a Kalman-filter-like approach, and the particle is "adjusted" to this best state. This particle is then propagated further to time t2, now with a smaller error variance.

Figure 9: The Adjusted Particle Filter (APF) results in faster, reliable convergence even with few particles.

We note that each particle essentially tracks a particular N_ij^sk integer ambiguity vector. As a result, every carrier-phase measurement yields a distinct orientation q for each of these vectors. The benefits arise because: (1) For the correct integer ambiguity vector, the error properties of the measurement function become Gaussian. Hence, the combining process is indeed like a Kalman filter, leading to faster convergence. (2) For incorrect integer ambiguity vectors, the Gaussian error properties do not hold. Moreover, the transitioned particle will be adjusted towards a wrong state because of the incorrect integer ambiguity resolution. Thus, the net outcome is faster convergence for the correct ambiguities, and vanishing particles for the incorrect ones. Figure 7 shows the efficacy of the Adjusted Particle Filter (APF) even when the KF diverges. The next section presents more detailed results.
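As an illustration, the per-particle adjustment can be sketched as a standard inverse-variance combination (our own naming; SafetyNet derives the covariances from GPS SNR):

```python
import numpy as np

def adjust_particle(theta_pred, P_pred, theta_meas, R_meas):
    """'Adjust' a particle toward the maximum-likelihood state (Sec. 5.4).

    theta_pred: (3,) orientation estimate from the transition model
    P_pred:     (3, 3) its error covariance
    theta_meas: (3,) orientation implied by the measurement model for this
                particle's integer-ambiguity hypothesis
    R_meas:     (3, 3) measurement error covariance
    Returns the fused (adjusted) state and its covariance.
    """
    K = P_pred @ np.linalg.inv(P_pred + R_meas)        # gain: relative trust in the measurement
    theta_adj = theta_pred + K @ (theta_meas - theta_pred)
    P_adj = (np.eye(3) - K) @ P_pred                   # variance shrinks after adjustment
    return theta_adj, P_adj
```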

6. EVALUATION
Our evaluation will comprehensively compare SafetyNet's orientation accuracy against two reference points: (1) the real IMU of a professional-grade drone (using algorithms based on [42]) and (2) high-precision estimates of ground truth through computer vision, which is orthogonal to both inertial and GPS-based sensory modalities. All results are from 11 flight sessions of ≈ 6 minutes each, across a diverse set of weather and satellite conditions; we will also report 95th percentile performance to capture robustness. We aim to answer the following critical questions.

• What is SafetyNet's accuracy in estimating a drone's pitch, roll, and yaw during flight? (Figures 11, 12a)

• Is SafetyNet's accuracy comparable to the IMU, relative to ground truth from computer vision? (Figure 12(b, c))

• Is SafetyNet robust to weather impairing satellite visibility? (Figures 13, 14)

• Is SafetyNet robust under aggressive maneuvers? (Figures 11, 15)

• Is SafetyNet computationally practical for embedded, real-time applications? (Figure 16)

• Which modules of SafetyNet provide the largest performance gains? (Figure 17)

Our results show that SafetyNet demonstrates consistently dependable performance in each measure, and that all of SafetyNet's modules are critical to the net outcome. We begin by describing the platform and methodology.

6.1 Experimental Platform and Methodology
We conduct all experimentation using a 3DR X-8 octocopter (8-rotor), pictured in Figure 10. "X" implies 4 arms with unequal spacing – yielding greater stability of the roll angle versus pitch (recall that pitch is the dominant motion when an airplane takes off or lands, while roll is dominant when the plane makes a left/right turn). Two rotors attach to each arm: one above, one below. To the X-8, we mounted 4 u-Blox NEO-M8T multi-GNSS receivers, adjacent to the top motor of each arm. We also mounted a Raspberry Pi 2 near the drone's center of mass – the GPS receivers transfer data to the Pi via USB at 5 Hz. IMU data at 10 Hz is recovered as binary logs from the X-8's USB interface.⁴

Figure 10: Four off-the-shelf GPS modules, a Raspberry Pi, and a GoPro on an X-8 octocopter.

⁴ Experiments were conducted as per FAA regulations. The operator was trained by a general aviation pilot, flights were over relatively vacant fields, and flight height did not exceed 45 m.

A GoPro camera, affixed to the underbelly of the airframe, points vertically downwards. The GoPro captures video at 30 fps with a fast shutter (to minimize motion blur and rolling shutter effects). We use ffmpeg to sample the video into stills at 7.5 fps (every fourth frame). We post-process the images using Pix4D: advanced commercial photogrammetry software that uses structure from motion (SfM) to perform 3D registration of each frame. From Pix4D's outputs, we can recover a high-precision estimate of drone attitude (accuracy ≈ 0.05° [40, 43]). Note that structure from motion would not be practical as a realtime IMU replacement – each 5-7 minute flight requires several hours of processing on a server with 16 CPU cores, 32 GB RAM, and CUDA on a high-end NVIDIA GRID K2 GPU.

6.2 Data Alignment for Comparison
SafetyNet and IMU data have precise GPS timestamps but the GoPro does not – this makes precise comparison difficult. Therefore, at the beginning/end of each flight, we point the GoPro towards an Android phone displaying the current GPS time. Once the video frame is synchronized with the GPS time, we extrapolate and timestamp every frame – this is possible because we extensively verified that the inter-frame spacing of the GoPro is uniform (33.33 ms at 30 fps). Still, residual error remains from unpredictable OS delay in the Android application displaying the current time. We therefore align the yaw angle sequence between SafetyNet and vision. We also perform this synchronization procedure with the IMU to eliminate residual misalignment. Finally, we align the GoPro and GPS X-Y axes by finding the appropriate rotation matrix that eliminates offsets in yaw, pitch, and roll. We now present results.

Figure 11: Time trace of SafetyNet orientation compared with Vision: (a) Pitch, (b) Roll, and (c) Yaw (degrees vs. time (s)). SafetyNet can track highly aggressive maneuvers well.

Figure 12: (a) CDF of errors for an hour of aggressive flight data shows the effectiveness of SafetyNet. (b) Comparison of SafetyNet's Pitch/Roll against the IMU's. (c) Comparison of Yaw against the IMU. SafetyNet outperforms the IMU in Yaw by 4x while Pitch/Roll is 30% worse.

6.3 Performance Results
(1) Overall Tracking Accuracy: Figure 11 depicts SafetyNet closely tracking (a) pitch, (b) roll, and (c) yaw against Pix4D/vision. Figure 12(a) aggregates 60 minutes of flying data: median error of 2.07° for pitch, 1.38° for roll, and 0.61° for yaw. The higher accuracy for roll versus pitch is attributable to our X-shaped drone: the x-axis baseline is 46% longer than the y-axis. Accordingly, roll is 49% more accurate. As expected, yaw accuracy is higher than pitch or roll, as yaw is relatively less subject to GPS dilution of precision from satellite geometry.

(2) Comparison with IMU: Figure 12(b and c) compares the accuracy of SafetyNet with the IMU, treating Pix4D/vision results as ground truth. SafetyNet is relatively less accurate (≈ 30% worse) than the IMU in pitch and roll. However, SafetyNet comprehensively outperforms the IMU in yaw angle accuracy (≈ 300% more accurate / one-quarter the error). The inferior yaw accuracy of the IMU, relative to pitch and roll, is expected due to its reliance on the magnetometer versus the accelerometer gravity vector; again, SafetyNet's superior yaw performance is expected. We are encouraged that SafetyNet's pitch/roll accuracy is acceptable as a plausible IMU substitute.

(3) Robustness to Weather: Figure 13 plots median accuracy over flights/flying conditions. Error bars denote 5th and 95th percentiles. Flights 7-11 were conducted in fog. Despite fewer detectable satellites, SafetyNet performance remains consistent and robust. To emulate extreme weather scenarios, Figure 14 shows accuracy with varying fractions of satellites artificially removed from processing. The medians and 75th percentiles are mostly robust to satellite skipping. As evident from the 95th percentile line, worst-case errors start increasing at 40%. Having fully exploited the u-Blox NEO-M8T multi-GNSS capabilities, SafetyNet maintains a high satellite count, increasing reliability.

Figure 13: Accuracy in SafetyNet is consistent over different flights under diverse satellite visibility (error (deg) per flight number 1-11, for Pitch, Roll, and Yaw).

(4) Robustness to Maneuvers: Figures 11 and 15 show that SafetyNet tracks aggressive maneuvers. Median error, along with the 5th and 95th percentiles, is plotted as a function of the pitch/roll/yaw rates. Tracking extreme motions (120+° per second) becomes challenging because of satellite blockage, cycle slips, and missing samples. Yet, SafetyNet's particle filter framework is robust even to the most challenging motions.


Figure 14: The number of outliers can increase as fewer satellites are available, but SafetyNet maximally exploits multi-GNSS satellite availability (95th/75th percentile and median error (deg) vs. percentage of ignored satellites).

Figure 15: SafetyNet's accuracy degrades gracefully with aggressiveness (error (deg) vs. rotation rate (deg/s), for Pitch, Roll, and Yaw).

(6) Implementability: Figure 16 plots the 95th-percentile error in pitch, roll, and yaw as a function of the number of particles. There are two take-aways: (1) SafetyNet's particle filter dramatically reduces error, and (2) only a few particles are required – the introduction of just a second particle yields a 300% improvement for pitch/roll. We need not trade away the real-time implementability of Kalman filters to gain accuracy: SafetyNet can be implemented in real time on embedded hardware.

(7) Gain by Module: Figure 17 isolates the error reduction from each SafetyNet module: Integer Ambiguity resolution (IA), Outlier elimination (OT), Kalman Filter (KF), Simple Particle Filter (PF), Adjusted Particle Filter (APF), and multi-GNSS (GS). IA suffers from cycle slips that OT can eliminate (7x gain in median and 2.5x at the 95th percentile). After OT, some wrong integer ambiguities may remain, causing KF drift with slow convergence back to a correct state. APF enables rapid convergence (100% median accuracy gain over both PF and KF / error reduced by 50%). Integrating GLONASS and SBAS [44] improves median performance and dramatically reduces worst-case errors (33% median gain, 6x gain at the 95th percentile).

Figure 16: SafetyNet can eliminate outliers with very few particles (95th-percentile error (deg) vs. number of particles, for Pitch, Roll, and Yaw).

Figure 17: Errors vs. optimization modules – Integer Ambiguity Resolution (IA), Outlier Elimination (OT), Kalman Filter (KF), Simple Particle Filter (PF), Adjusted Particle Filter (APF), and Multi-GNSS (GS) together contribute to the performance of SafetyNet.

7. RELATED WORK

Relative GPS: Today's state of the art in GPS relative localization is represented by [13, 14], which outperform DGPS [29] and RTK [30]. Among these, APT [14] is the closer of the two to SafetyNet; it also uses carrier phases combined with double differentials and achieves sub-meter accuracy. SafetyNet differs in several ways: (1) A completely different mathematical model compared to the modeling strategy used for relative localization in APT. (2) A notion of "soft decoding" in which multiple uncertainties are propagated through SafetyNet's particle filter. SafetyNet also picks new particles from the angular domain, dramatically reducing the search space, whereas APT's search-cube-based recalibration is highly complex; this also helps SafetyNet meet its higher accuracy requirement under a more challenging cycle-slip rate. (3) SafetyNet incorporates GLONASS, which is non-trivial because its satellites operate on different frequencies (FDMA). (4) SafetyNet addresses half-cycle slips by resolving integer ambiguities in steps of 0.5 – critical here, given the higher accuracy requirement (1-2 cm) compared to APT.
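For context, the double-differenced carrier-phase observable referenced above takes the standard textbook form below (see, e.g., [23, 31]); the notation is generic and chosen for this sketch rather than taken from the paper's own derivation, and atmospheric terms are dropped since they largely cancel over the short baselines between a drone's receivers.

% Receivers i, j and satellites p, q on a common carrier frequency (wavelength \lambda):
% \phi = measured carrier phase (cycles), r = receiver-satellite range,
% N = integer ambiguity, \epsilon = residual noise; clock biases cancel.
\nabla\Delta\phi_{ij}^{pq}
  = \bigl(\phi_i^{p} - \phi_j^{p}\bigr) - \bigl(\phi_i^{q} - \phi_j^{q}\bigr)
  = \frac{1}{\lambda}\,\nabla\Delta r_{ij}^{pq} + \nabla\Delta N_{ij}^{pq} + \epsilon_{ij}^{pq}

Once the integer term is fixed (in half-cycle steps, per point (4) above), the remaining range term constrains the baseline between the two receivers at the centimeter scale, which is what the orientation estimate rests on.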

Orientation/Velocity from Multi-GPS: While several works have considered GPS for the orientation of spacecraft, full-scale aircraft, or ships [15, 16, 45], none address the dynamic errors for a small drone. While [17] controls a model helicopter using multi-GPS, the motions are highly constrained (slow changes of less than 10°). SafetyNet is unique in addressing dynamic integer ambiguities across an aggressive flight (angular changes up to 150°/s). Solutions using a single-receiver, multi-antenna GPS [46] would also benefit from SafetyNet's dynamic error handling. Work in [47] uses a moving antenna for orientation estimates; while interesting, the authors note that the motion must be slow enough not to break GPS lock, so such solutions cannot be adopted for highly dynamic flights. Angular velocity determination using multi-GPS is explored in [48, 49]; however, SafetyNet is unique in integrating angular velocity with double differentials (across time and satellites) into a particle filtering framework.

Multi-GNSS Fusion: Several works have attempted to fuse GPS with the GLONASS satellite system for enhanced relative positioning [50–52]. However, each uses pseudo-range data only or assumes a specialized multi-antenna GPS/GLONASS receiver with a single clock. SafetyNet exploits time-differenced carrier-phase measurements to leverage GLONASS data in double differentials with off-the-shelf multi-GNSS receivers.
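Why time differencing sidesteps GLONASS's FDMA complication can be seen from a standard simplification (again with generic notation, not the paper's own derivation): for a GLONASS satellite p with wavelength \lambda_p, the between-receiver single difference contains a wavelength-dependent clock term and an inter-frequency bias, but the terms that obstruct double differencing across satellites are constant over time.

% Between-receiver single difference for GLONASS satellite p (wavelength \lambda_p):
% \Delta\delta t = differential receiver clock bias, \Delta N = integer ambiguity,
% \Delta b_p = inter-frequency phase bias (nominally constant).
\Delta\phi_{ij}^{p}(t) = \tfrac{1}{\lambda_p}\,\Delta r_{ij}^{p}(t)
  + \tfrac{c}{\lambda_p}\,\Delta\delta t_{ij}(t) + \Delta N_{ij}^{p} + \Delta b_{ij,p}
% Differencing the same observable across epochs t_1 < t_2 cancels \Delta N and
% \Delta b (absent cycle slips), leaving only geometry change and clock drift:
\Delta\phi_{ij}^{p}(t_2) - \Delta\phi_{ij}^{p}(t_1)
  = \tfrac{1}{\lambda_p}\bigl[\Delta r_{ij}^{p}(t_2) - \Delta r_{ij}^{p}(t_1)\bigr]
  + \tfrac{c}{\lambda_p}\bigl[\Delta\delta t_{ij}(t_2) - \Delta\delta t_{ij}(t_1)\bigr]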

Reliability for IMU: Drift in gyroscopes [53–55] and interference to magnetic compasses [56–58] are well known in smartphone applications, and many researchers have proposed application-specific inferencing techniques to handle them. While the same error properties extend to drones, there the results can be catastrophic, making error handling critical. Work in [8] proposes inferencing techniques for correcting IMU errors caused by engine vibration on drones. Similarly, [9] models magnetic sources of interference to nullify their effect on the IMU. Work in [11] models similar noise sources affecting IMU updates within a sophisticated Kalman filter. In contrast to these works, SafetyNet offers a completely orthogonal solution relying only on GNSS receivers.

IMU/GPS fusion: Fusion of IMU sensors (accelerometers, gyroscopes, magnetometers) with GPS has been used extensively for orientation tracking [59]. Extended Kalman filters [35, 42, 60, 61] have been used to handle non-linearity; UKFs, sigma-point filters, and particle filters have also been used for better accuracy at higher complexity [62–65]. While complementary to SafetyNet in reducing orientation uncertainty, these approaches do not address IMU failure.

Orientation from Vision: Pitch/roll angles can be estimated by tracking the horizon [66, 67]. Optical flow [68], stereo vision [69], and feature tracking [70–73] have been proposed as vision-based IMU supplements. Vision techniques are complementary to SafetyNet. Offline, we leverage high-fidelity structure from motion to validate our accuracy.

8. FUTURE WORK

We briefly discuss a few points of future work.

• Fusion with IMU: SafetyNet is a completely orthogonal solution, but it is possible to combine GPS with the IMU for even better accuracy. We leave this to future work.

• GPS Sampling Rate: The sampling rate of our GPS modules was only 5 Hz, but all of our techniques are fundamentally applicable to higher-rate GPS receivers.

• Latency: The processing power on drones continues to evolve – various drones offer diverse computing capabilities. Our future work will need to carefully profile latency to characterize the kinds of drones that could execute SafetyNet in real time.

• Vehicles: Orientation estimation for self-driving cars can also benefit from SafetyNet, adding an extra layer of reliability.

9. CONCLUSION

Grand visions abound for applications of drones. While Amazon, Google, and others have made great strides toward package delivery and related services, drones remain an unacceptable hazard to persons and property. By comprehensively addressing IMU failure through GNSS – a failover completely orthogonal to inertial methods – SafetyNet advances the state of the art in drone safety.

10. ACKNOWLEDGMENTS

We sincerely thank the reviewers and the anonymous shepherd for their valuable feedback. We are also grateful to NSF (CNS-1423455) for partially funding the research.

11. REFERENCES

[1] Y. Zhang, A. Chamseddine, C. Rabbath, B. Gordon, C.-Y. Su, S. Rakheja, C. Fulford, J. Apkarian, and P. Gosselin, “Development of advanced fdd and ftc techniques with application to an unmanned quadrotor helicopter testbed,” Journal of the Franklin Institute, vol. 350, no. 9, pp. 2396–2422, 2013.

[2] “diydrones.” http://diydrones.com/forum/topics/problem-quad-copter-spins-around-itself.

[3] “Google forum.” https://groups.google.com/forum/#!msg/drones-discuss/fnWYM48pGys/kwPI_q_qv0QJ.

[4] “Drone crash survey.” https://docs.google.com/forms/d/1R4MaX8iZWRoyzKJ6rKr0U-p3x-6j0ZVm5a0p83hHty0/viewanalytics.

[5] G. De Pasquale and A. Somà, “Reliability testing procedure for mems imus applied to vibrating environments,” Sensors, vol. 10, no. 1, pp. 456–474, 2010.

[6] “3d robotics: Pixhawk.”https://3dr.com/pixhawk-is-ready-for-takeoff/.

[7] “Dji phantom 4.”https://www.dji.com/product/phantom-4/info.

[8] X. Liu and R. Randall, “Blind source separation of internal combustion engine piston slap from other measured vibration signals,” Mechanical Systems and Signal Processing, vol. 19, no. 6, pp. 1196–1208, 2005.

[9] M. Wells, Attenuating magnetic interference in a UAV system. PhD thesis, Carleton University Ottawa, 2008.

[10] V. Malyavej, P. Torteeka, S. Wongkharn, and T. Wiangtong, “Pose estimation of unmanned ground vehicle based on dead-reckoning/gps sensor fusion by unscented kalman filter,” in Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology, 2009. ECTI-CON 2009. 6th International Conference on, vol. 1, pp. 395–398, IEEE, 2009.

[11] Y. S. Suh, “Attitude estimation by multiple-mode kalman filters,” IEEE Transactions on Industrial Electronics, vol. 53, no. 4, pp. 1386–1389, 2006.

[12] Y.-C. Lai and S.-S. Jan, “Attitude estimation based on fusion of gyroscopes and single antenna gps for small uavs under the influence of vibration,” GPS Solutions, vol. 15, no. 1, pp. 67–77, 2011.

[13] W. Hedgecock et al., “Regtrack: A differential relative gps tracking solution,” in Proceedings of the 11th Annual International Conference on Mobile Systems, Applications, and Services – MOBISYS 2013, 2013.

[14] W. Hedgecock et al., “Accurate real-time relative localization using single-frequency gps,” in SENSYS 2014, ACM, 2014.

[15] C. E. Cohen, E. G. Lightsey, B. W. Parkinson, and W. A. Feess, “Space flight tests of attitude determination using gps,” International Journal of Satellite Communications, 1994.

[16] G. Lachapelle, M. Cannon, G. Lu, and B. Loncarevic, “Shipborne gps attitude determination during mmst-93,” Oceanic Engineering, IEEE Journal of, 1996.

[17] A. R. Conway, Autonomous control of an unstable model helicopter using carrier phase GPS only. PhD thesis, Stanford University, 1995.

[18] A. Thiagarajan, L. Ravindranath, K. LaCurts, S. Madden, H. Balakrishnan, S. Toledo, and J. Eriksson, “Vtrack: Accurate, energy-aware road traffic delay estimation using mobile phones,” in Proceedings of the 7th ACM Conference on Embedded Networked Sensor Systems, SenSys ’09, (New York, NY, USA), pp. 85–98, ACM, 2009.

[19] J. Paek, J. Kim, and R. Govindan, “Energy-efficient rate-adaptive gps-based positioning for smartphones,” in Proceedings of the 8th International Conference on Mobile Systems, Applications, and Services, MobiSys ’10, (New York, NY, USA), pp. 299–314, ACM, 2010.

[20] “Lipo 4s battery.” http://www.droneshop.biz/4s-10000mah-lipo-low-weight-high-efficiency.html.

[21] “Dji matrice 600.”http://www.dji.com/product/matrice600.

[22] “Magnetic interference causing crashes.” http://diydrones.com/forum/topics/crazy-magnetic-interference.

[23] P. Misra and P. Enge, Global Positioning System: Signals, Measurements and Performance, Second Edition. Lincoln, MA: Ganga-Jamuna Press, 2006.

[24] R. B. Langley, “Glonass: review and update,” GPSworld, vol. 8, no. 7, pp. 46–51, 1997.

[25] “Galileo.” https://en.wikipedia.org/wiki/Galileo_(satellite_navigation).

[26] B. Zhang, P. J. Teunissen, and D. Odijk, “A novel un-differenced ppp-rtk concept,” Journal of Navigation, vol. 64, no. S1, pp. S180–S191, 2011.

[27] D. Laurichesse, F. Mercier, J.-P. Berthias, P. Broca, and L. Cerri, “Integer ambiguity resolution on undifferenced gps phase measurements and its application to ppp and satellite precise orbit determination,” Navigation, vol. 56, no. 2, pp. 135–149, 2009.

[28] E. Kaplan and C. Hegarty, Understanding GPS: principlesand applications. Artech house, 2005.

[29] B. W. Parkinson and P. K. Enge, “Differential gps,” Global Positioning System: Theory and Applications, vol. 2, pp. 3–50, 1996.

[30] C. Mekik and M. Arslanoglu, “Investigation on accuracies of real time kinematic gps for gis applications,” Remote Sensing, vol. 1, no. 1, pp. 22–35, 2009.

[31] G. Blewitt, “Basics of the gps technique: Observation equations,” Geodetic Applications of GPS, pp. 10–54, 1997.

[32] J. Diebel, “Representing attitude: Euler angles, unit quaternions, and rotation vectors,” Matrix, vol. 58, no. 15-16, pp. 1–35, 2006.

[33] “Rotation vectors.” http://farside.ph.utexas.edu/teaching/301/lectures/node100.html.

[34] “Vector cross product.” http://soe.rutgers.edu/~meer/GRAD561/ADD/antisymm.pdf.

[35] E. J. Lefferts, F. L. Markley, and M. D. Shuster, “Kalman filtering for spacecraft attitude estimation,” Journal of Guidance, Control, and Dynamics, vol. 5, no. 5, pp. 417–429, 1982.

[36] C. E. Cohen and B. W. Parkinson, “Integer ambiguity resolution of the gps carrier for spacecraft attitude determination,” in Guidance and Control 1992, vol. 1, pp. 107–118, 1992.

[37] A. Conway, P. Montgomery, S. Rock, R. Cannon, and B. Parkinson, “A new motion-based algorithm for gps attitude integer resolution,” Navigation, vol. 43, no. 2, pp. 179–190, 1996.

[38] P. De Jonge and C. Tiberius, “The lambda method for integer ambiguity estimation: Implementation aspects,” Publications of the Delft Computing Centre, LGR-Series, vol. 12, no. 12, pp. 1–47, 1996.

[39] E. Sutton and R. Collins, “Calibration of differential phase map compensation using single axis rotation,” in Proceedings of ION GPS, vol. 11, pp. 1831–1842, Institute of Navigation, 1998.

[40] J. Vallet, F. Panissod, C. Strecha, and M. Tracol, “Photogrammetric performance of an ultra light weight swinglet uav,” in UAV-g, no. EPFL-CONF-169252, 2011.

[41] J. O. Street, R. J. Carroll, and D. Ruppert, “A note on computing robust regression estimates via iteratively reweighted least squares,” The American Statistician, vol. 42, no. 2, pp. 152–154, 1988.

[42] M. E. Pittelkau, “Rotation vector in attitude estimation,” Journal of Guidance, Control, and Dynamics, vol. 26, no. 6, pp. 855–860, 2003.

[43] S. Harwin and A. Lucieer, “Assessing the accuracy of georeferenced point clouds produced via multi-view stereopsis from unmanned aerial vehicle (uav) imagery,” Remote Sensing, vol. 4, no. 6, pp. 1573–1599, 2012.

[44] “Wide area augmentation system.” https://en.wikipedia.org/wiki/Wide_Area_Augmentation_System.

[45] Y. Li, K. Zhang, C. Roberts, and M. Murata, “On-the-fly gps-based attitude determination using single- and double-differenced carrier phase measurements,” GPS Solutions, vol. 8, no. 2, pp. 93–102, 2004.

[46] D. E. Kirkpatrick, “Design of a hardware platform for gps-based orientation sensing,” 2015.

[47] A. G. Evans, “Roll, pitch, and yaw determination using a global positioning system receiver and an antenna periodically moving in a plane,” Marine Geodesy, vol. 10, no. 1, pp. 43–52, 1986.

[48] M. Ueno, R. Santerre, and A. Kleusberg, “Direct determination of angular velocity using gps,” Journal of Navigation, vol. 53, no. 02, pp. 371–379, 2000.

[49] P. Montgomery, H. Uematsu, and B. Parkinson, “Analysis of angular velocity determination using gps,” in ION GPS-94, pp. 697–706, 1994.

[50] J. Wang, M. P. Stewart, and M. Tsakiri, “Modelling glonass measurements for precise positioning,” Survey Review, vol. 36, no. 280, pp. 110–120, 2001.

[51] S. Han, L. Dai, and C. Rizos, “A new data processing strategy for combined gps/glonass carrier phase-based positioning,” in Proc. ION GPS-99, pp. 1619–1627, 1999.

[52] T. Tsujii, M. Harigae, T. Inagaki, and T. Kanai, “Flight tests of gps/glonass precise positioning versus dual frequency kgps profile,” Earth, Planets and Space, vol. 52, no. 10, pp. 825–829, 2000.

[53] G. Shen, Z. Chen, P. Zhang, T. Moscibroda, and Y. Zhang, “Walkie-markie: Indoor pathway mapping made easy,” in Proceedings of the 10th USENIX Conference on Networked Systems Design and Implementation – NSDI 2013, pp. 85–98, USENIX Association, 2013.

[54] A. Rai, K. K. Chintalapudi, V. N. Padmanabhan, and R. Sen, “Zee: Zero-effort crowdsourcing for indoor localization,” in Proceedings of the 18th Annual International Conference on Mobile Computing and Networking – MOBICOM 2012, pp. 293–304, ACM, 2012.

[55] P. Zhou, M. Li, and G. Shen, “Use it free: Instantly knowing your phone attitude,” in Proceedings of the 20th Annual International Conference on Mobile Computing and Networking – MOBICOM 2014, pp. 605–616, ACM, 2014.

[56] H. Wang, S. Sen, A. Elgohary, M. Farid, M. Youssef, and R. R. Choudhury, “No need to war-drive: Unsupervised indoor localization,” in Proceedings of the 10th International Conference on Mobile Systems, Applications, and Services – MOBISYS 2012, pp. 197–210, ACM, 2012.

[57] J. G. Manweiler, P. Jain, and R. Roy Choudhury, “Satellites in our pockets: An object positioning system using smartphones,” in Proceedings of the 10th International Conference on Mobile Systems, Applications, and Services – MOBISYS 2012, pp. 211–224, ACM, 2012.

[58] A. T. Mariakakis, S. Sen, J. Lee, and K.-H. Kim, “Sail: Single access point-based indoor localization,” in Proceedings of the 12th Annual International Conference on Mobile Systems, Applications, and Services – MOBISYS 2014, (New York, NY, USA), pp. 315–328, ACM, 2014.

[59] J. L. Crassidis, F. L. Markley, and Y. Cheng, “Survey of nonlinear attitude estimation methods,” Journal of Guidance, Control, and Dynamics, vol. 30, no. 1, pp. 12–28, 2007.

[60] D. B. Kingston and R. W. Beard, “Real-time attitude and position estimation for small uavs using low-cost sensors,” in AIAA 3rd Unmanned Unlimited Technical Conference, Workshop and Exhibit, pp. 2004–6488, 2004.

[61] W. Y. Liang, W. T. Miao, L. J. Hong, X. C. Lei, and Z. Chen, “Attitude estimation for small helicopter using extended kalman filter,” in Robotics, Automation and Mechatronics, 2008 IEEE Conference on, pp. 577–581, IEEE, 2008.

[62] D. Marina, H. Garcia, F. J. Pereda, J. M. Giron-Sierra, and F. Espinosa, “Uav attitude estimation using unscented kalman filter and triad,” Industrial Electronics, IEEE Transactions on, vol. 59, no. 11, pp. 4465–4474, 2012.

[63] J. L. Crassidis and F. L. Markley, “Unscented filtering for spacecraft attitude estimation,” Journal of Guidance, Control, and Dynamics, vol. 26, no. 4, pp. 536–542, 2003.

[64] G. G. Rigatos, “Nonlinear kalman filters and particle filters for integrated navigation of unmanned aerial vehicles,” Robotics and Autonomous Systems, vol. 60, no. 7, pp. 978–995, 2012.

[65] R. Van Der Merwe and E. A. Wan, “Sigma-point kalman filters for integrated navigation,” in Proceedings of the 60th Annual Meeting of the Institute of Navigation (ION), pp. 641–654, 2004.

[66] S. Thurrowgood, D. Soccol, R. J. Moore, D. Bland, and M. V. Srinivasan, “A vision based system for attitude estimation of uavs,” in Intelligent Robots and Systems, 2009. IROS 2009. IEEE/RSJ International Conference on, pp. 5725–5730, IEEE, 2009.

[67] C. Demonceaux, P. Vasseur, and C. Pégard, “Omnidirectional vision on uav for attitude computation,” in Robotics and Automation, 2006. ICRA 2006. Proceedings 2006 IEEE International Conference on, pp. 2842–2847, IEEE, 2006.

[68] G. L. Barrows, J. S. Chahl, and Y. V. Srinivasan, “Biomimetic visual sensing and flight control,” in Proc. Bristol UAV Conf., pp. 159–168, 2002.

[69] R. J. Moore, S. Thurrowgood, D. Bland, D. Soccol, and M. V. Srinivasan, “A stereo vision system for uav guidance,” in Intelligent Robots and Systems, 2009. IROS 2009. IEEE/RSJ International Conference on, pp. 3386–3391, IEEE, 2009.

[70] C. Forster, M. Pizzoli, and D. Scaramuzza, “Svo: Fast semi-direct monocular visual odometry,” in Robotics and Automation (ICRA), 2014 IEEE International Conference on, pp. 15–22, IEEE, 2014.

[71] J. Engel, T. Schöps, and D. Cremers, “Lsd-slam: Large-scale direct monocular slam,” in Computer Vision – ECCV 2014, pp. 834–849, Springer, 2014.

[72] G. Klein and D. Murray, “Parallel tracking and mapping for small ar workspaces,” in Mixed and Augmented Reality, 2007. ISMAR 2007. 6th IEEE and ACM International Symposium on, pp. 225–234, IEEE, 2007.

[73] N. Karlsson, E. Di Bernardo, J. Ostrowski, L. Goncalves, P. Pirjanian, and M. E. Munich, “The vslam algorithm for robust localization and mapping,” in Robotics and Automation, 2005. ICRA 2005. Proceedings of the 2005 IEEE International Conference on, pp. 24–29, IEEE, 2005.