

ASPRS 2005 Annual Conference Baltimore, Maryland ♦ March 7-11, 2005

MULTI-SENSOR TRIANGULATION Eugene Rose, Kiril Fradkin

Sensor Systems, Incorporated 103A Carpenter Drive

Sterling, VA 20164 [email protected], [email protected]

ABSTRACT

Triangulation, sometimes called Bundle Block Adjustment, or simply “Block Adjustment”, is a standard operation (Slama, 1980) well known to photogrammetrists and cartographers. Any imagery used for making maps, charts, geospatial vector data, or other geospatial products has more than likely been block adjusted at some point in its processing history. At Sensor Systems, Inc., we are bringing this powerful technology into the hands of non-photogrammetrists. The challenge is to make this fairly complex process fast, easy to use, and effective at improving these users' end products.

In this paper, we present our solutions to these challenges. We discuss triangulation of imagery from multiple sensors, including the Ikonos, QuickBird, and OrbView commercial satellite systems. We describe various models of pushbroom sensors, including a comparison of the position, attitude, and rate models we have applied to these high-resolution satellite systems. We describe methods for the automatic generation of tiepoints using image correlation, as well as other features of a triangulation system designed for very rapid performance in an essentially "hands-off" environment. A high-level description of a software system designed to perform multi-sensor triangulation within a "plug-in" architecture is discussed, in the context of enabling collaborative sensor model development. Results of experimental triangulation of various block configurations are presented.

ISSUES

Our multi-sensor triangulation is based on the well-known collinearity equations (Wolf and Dewitt, 2000). In addition to triangulation, the same collinearity equations are used to implement “Space Resection” and “Space Intersection”. Space Resection solves for the sensor exterior orientation (EO), holding the other parameters fixed; it is used prior to creating an orthophoto and requires ground control points, the number of which depends on the complexity of the sensor model. Space Intersection is the process of obtaining a geolocation from multiple images, either in stereo or with more than two images. This is the process employed in precision geolocation systems.
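For reference, the collinearity condition underlying all three operations can be written in its standard form (cf. Wolf and Dewitt, 2000), where (X_L, Y_L, Z_L) is the exposure station, f the focal length, (x_0, y_0) the principal point, and m_ij the elements of the rotation matrix:

```latex
x = x_0 - f\,\frac{m_{11}(X - X_L) + m_{12}(Y - Y_L) + m_{13}(Z - Z_L)}
                  {m_{31}(X - X_L) + m_{32}(Y - Y_L) + m_{33}(Z - Z_L)},
\qquad
y = y_0 - f\,\frac{m_{21}(X - X_L) + m_{22}(Y - Y_L) + m_{23}(Z - Z_L)}
                  {m_{31}(X - X_L) + m_{32}(Y - Y_L) + m_{33}(Z - Z_L)}
```

Triangulation solves for sensor parameters and ground coordinates simultaneously; resection holds the ground points fixed, and intersection holds the sensor parameters fixed.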

In a traditional digital photogrammetric workstation (DPW) approach, the process of triangulation begins with the user specifying a sensor model by entering the focal length, principal point offset, and exterior orientation information, and then adjusting a large block of imagery collected from a single sensor. Sensor Systems' RemoteView product is an example of a class of high-performance image processing software known as the Electronic Light Table (ELT). In this type of software, the typical user is not a photogrammetrist or remote sensing specialist, but rather a trained image analyst, one whose primary function is the extraction of feature information from imagery in a highly dynamic, real-time environment. To this user, the block adjustment process is just the starting point in performing the primary job of extracting precise geographic coordinate information from imagery.

For block adjustment to be feasible in this environment, the software architecture must be able to automatically initialize sensor models directly from image metadata and pre-stored information, and to present an extremely simple wizard-based user interface, even if this architecture removes some of the more "advanced" adjustment opportunities found in the DPW.

In addition, the software architecture should allow for third-party development of sensor model libraries that can simply be installed in the software directory on the system and automatically used by the system for any of the standard photogrammetric operations: resection, intersection, or triangulation.

This environment requires a multi-sensor approach, that is, the ability to mix imagery collected from different sensors in the same triangulation. These sensors might include standard frame camera, pushbroom, or whiskbroom models. Mixed-mode imagery (radar, electro-optical, thermal, etc.) is another critical requirement.


In the ELT environment, the process of collecting tiepoints needs to be over 90% automated. The process of automatically collecting tiepoints has been addressed in numerous papers and textbooks. An approach that shows promise for the specific sensor models and types of imagery investigated to date is explained below.

Our solution to these issues is described in the following sections.

Software Architecture

As shown in Figure 1, a “plug-in” sensor model architecture achieves several important objectives.

Figure 1. Software Architecture.

First, it provides a standard interface through which sensor-specific information is passed from the sensor models to the applications. This gives third-party software developers a means to incorporate all of the photogrammetric operations (resection, intersection, and triangulation) for a specific sensor by producing a separately compiled software library (a “dll” on Windows, or a “shared object library” on Unix). The sensor model developer concentrates on the relatively few methods that the specific model is responsible for, and does not need to be concerned with the details of the photogrammetric operations. This relatively small set of methods is outlined in Table 1.

[Figure 1 diagram: sensor model plug-ins (Ikonos, QuickBird, and third-party sensor models) connect through a common sensor interface API to the multi-sensor triangulation, rigorous ortho, multi-image positioning, and user interface applications.]


Sensor model plug-in methods and their purposes:

Ground to Image: Calculates image coordinates from ground coordinates; this is the solution of the collinearity equations.

Image to Ground (with elevation): Calculates the approximate ground position of tiepoints, given their image coordinates.

Set/Get Adjustable Parameters: Allows an iterative triangulation solution, in which intermediate results are passed to the sensor model to compute the unknowns for the next iteration.

Partials of the ground-to-image function with respect to image parameters: Allows the iterative adjustment of sensor parameters in triangulation and space resection.

Partials of the ground-to-image function with respect to ground parameters: Allows the iterative adjustment of ground parameters in triangulation and space intersection.

Sensor parameter initial values: Allows initialization of the iterative least-squares process.

Sensor parameter variance/covariance parameters: Determines how much adjustment is applied to each parameter. Well-known parameters, such as satellite position determined by INS with GPS, can be given small variances (5 m).

Table 1. Sensor Model API.
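As an illustration only (the actual plug-ins are separately compiled libraries, and the method names below are hypothetical renderings of Table 1), the interface could be expressed as an abstract base class; each sensor model implements only these methods, while the triangulation engine supplies the least-squares machinery.

```python
from abc import ABC, abstractmethod
import numpy as np

class SensorModel(ABC):
    """Illustrative sketch of the plug-in API summarized in Table 1."""

    @abstractmethod
    def ground_to_image(self, ground_xyz: np.ndarray) -> np.ndarray:
        """Solve the collinearity equations: ground (X, Y, Z) -> image (line, sample)."""

    @abstractmethod
    def image_to_ground(self, line_sample: np.ndarray, elevation: float) -> np.ndarray:
        """Approximate ground position of a tiepoint at a given elevation."""

    @abstractmethod
    def get_adjustable_parameters(self) -> np.ndarray: ...

    @abstractmethod
    def set_adjustable_parameters(self, values: np.ndarray) -> None: ...

    @abstractmethod
    def partials_wrt_image_parameters(self, ground_xyz: np.ndarray) -> np.ndarray:
        """d(line, sample)/d(sensor parameters), used in triangulation and resection."""

    @abstractmethod
    def partials_wrt_ground(self, ground_xyz: np.ndarray) -> np.ndarray:
        """d(line, sample)/d(X, Y, Z), used in triangulation and intersection."""

    @abstractmethod
    def initial_parameter_values(self) -> np.ndarray: ...

    @abstractmethod
    def parameter_covariance(self) -> np.ndarray:
        """A priori variances/covariances that weight how far each parameter may adjust."""
```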

The plug-in software architecture allows for the mixing of sensor models from different sensors in the same photogrammetric process. For example, for block adjustment, the main processing loop will query each involved sensor model in turn for its portion of the normal equation matrices: the partials, parameter values, and weights for its "portion" of the normal equations. The main loop inserts the values into the correct sub-matrices. As long as a sensor model supports the API in Table 1, it can be combined with any other sensor that also supports the API.

Commercial Satellite Sensor Models

Commercial high-resolution satellite sensor models currently include Space Imaging's IKONOS, DigitalGlobe's QuickBird, ORBIMAGE's OrbView, and several other planned systems.

Pushbroom Sensor Models

The standard pushbroom sensor model represents the dynamic behavior of the sensor as a series of discrete frame camera models, one for each line of the image.
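A minimal sketch of that idea is shown below; the linear interpolation and the argument names are assumptions for illustration, and actual vendor metadata formats differ. Each image line is assigned its own exposure time, and the platform position and attitude are interpolated for that instant.

```python
import numpy as np

def line_exterior_orientation(line, t0, line_rate, eph_times, eph_positions, att_angles):
    """Interpolate the sensor position and attitude for a single image line.
    line        : image line number (0-based)
    t0          : acquisition time of line 0 (seconds)
    line_rate   : pushbroom line rate (lines per second)
    eph_times   : sample times of the ephemeris/attitude records
    eph_positions, att_angles : arrays of shape (n_samples, 3)
    """
    t = t0 + line / line_rate                       # exposure time of this line
    eph_positions = np.asarray(eph_positions)
    att_angles = np.asarray(att_angles)
    # Linear interpolation of each component at time t; each line thus gets
    # its own "frame camera" exterior orientation.
    position = np.array([np.interp(t, eph_times, eph_positions[:, k]) for k in range(3)])
    attitude = np.array([np.interp(t, eph_times, att_angles[:, k]) for k in range(3)])
    return position, attitude
```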

As pointed out by Dial and Grodecki (2003), pushbroom sensor models are frequently over-parameterized. Without large quantities of ground control, it is nearly impossible to separate the effects of the different parameters from one another, a difficulty sometimes referred to as "recovering" parameters. Along-track and cross-track satellite position errors, for example, are highly correlated with pitch and roll angles when those angles are described in platform space.

Ikonos

Space Imaging's IKONOS system is unique in that the data required to model the system as a standard pushbroom or whiskbroom sensor is not provided. Rather, Space Imaging promotes the use of a simple polynomial sensor model in which the typical 6 or more parameters of exterior orientation are modeled by 2 or 4 image-space parameters (Dial and Grodecki, 2003). Others (Toutin and Cheng, 2002) have suggested that a semi-rigorous model instantiated from the basic IKONOS metadata, including the look azimuth and elevation, is more accurate.


Unfortunately, the Toutin method is not well publicized and is implemented only in a proprietary software solution. In our application, we have employed the Dial/Grodecki model with good results.
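As a rough sketch of that model (the `rpc.ground_to_image` helper that evaluates the vendor rational polynomials is assumed, and the coefficient names are illustrative), the adjustment is applied entirely in image space on top of the unadjusted RPC projection:

```python
def adjusted_rpc_projection(lon, lat, height, rpc, a, b):
    """Project a ground point with the vendor RPC model, then apply an
    image-space correction in the spirit of the Dial/Grodecki adjustable RPC.
    rpc  : assumed helper object evaluating the vendor rational polynomials
    a, b : line and sample correction coefficients (a0, a1, a2), (b0, b1, b2);
           with only a0 and b0 free this reduces to a two-parameter bias model.
    """
    line, sample = rpc.ground_to_image(lon, lat, height)    # unadjusted RPC projection
    line_corr = a[0] + a[1] * sample + a[2] * line          # image-space correction terms
    sample_corr = b[0] + b[1] * sample + b[2] * line
    return line + line_corr, sample + sample_corr
```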

QuickBird-2

DigitalGlobe Incorporated's QuickBird-2 sensor can be rigorously modeled as a standard pushbroom sensor, provided the support data is available. This data is made available to selected DigitalGlobe development partners, including Sensor Systems. The data includes interior orientation (focal point and detector mounting), the average line rate of the pushbroom, sensor-to-platform coordinate transformation data, and ephemeris and attitude data.

OrbView-3

ORBIMAGE Corporation's OrbView-3 sensor is not a pushbroom sensor, but rather a whiskbroom sensor. In the OrbView model, attitude and ephemeris data are provided, as well as the interior orientation.

EROS 1A

Time did not permit the inclusion of a detailed evaluation of EROS 1A data for this study.

AUTOMATIC TIE POINT COLLECTION

Tie point collection is one of the most tedious and time-consuming steps in the triangulation process. To facilitate this step, RemoteView supports an automatic tie point collection mode. We have successfully tested the method on data from one sensor at the same scale, as well as on combinations of multispectral and panchromatic data and on data from all of the different sensors in various combinations.
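A minimal sketch of the correlation-based matching that underlies such an automated mode is shown below; the chip sizes, the acceptance threshold, and the brute-force search are illustrative only and do not represent the production algorithm.

```python
import numpy as np

def ncc(template: np.ndarray, window: np.ndarray) -> float:
    """Normalized cross-correlation of two equally sized image chips."""
    t = template - template.mean()
    w = window - window.mean()
    denom = np.sqrt((t * t).sum() * (w * w).sum())
    return float((t * w).sum() / denom) if denom > 0 else 0.0

def match_tiepoint(ref_chip, search_img, threshold=0.8):
    """Slide the reference chip over a search image and return the best match
    location and score, or None if the peak correlation is below threshold."""
    th, tw = ref_chip.shape
    best_loc, best_score = None, -1.0
    for r in range(search_img.shape[0] - th + 1):
        for c in range(search_img.shape[1] - tw + 1):
            score = ncc(ref_chip, search_img[r:r + th, c:c + tw])
            if score > best_score:
                best_loc, best_score = (r, c), score
    return (best_loc, best_score) if best_score >= threshold else None
```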

An example of the tie point collection process in a region of interest (ROI) is presented in Figure 2.

Figure 2. Tie point collection process in an ROI. Green crosses: matched points. Red crosses: rejected mismatches. Big green cross: selected tie point.

An example of the completed auto-tiepoint process is presented in Figure 3.


Figure 3. Auto-tiepoint process. Green crosses - matched points. (Screenshot of RemoteView Pro for Windows).

SELECTED RESULTS

Selected results of triangulation are presented here; there is more data than can be shown in this space. Interested parties can contact the authors for details.

At the time of writing this paper, we have tested triangulation with a limited set of Ikonos, QuickBird and OrbView data. We have mixed these sensors in various combinations, and included tests of mixed multi-spectral and panchromatic data.

Mixed Sensor Blocks

A block of images consisting of one Ikonos panchromatic image with 1.0-meter resolution, one Ikonos multispectral image with 4-meter resolution, and one QuickBird panchromatic image with approximately 0.62-meter resolution was adjusted. Thus, 3 different image scales and 2 different sensor models are involved. The auto-tiepoint process worked successfully, generating 18 tiepoints in approximately 3 minutes. There were no ground control points available to test the absolute accuracy of this block.

Ikonos Block

Space Imaging provided Sensor Systems with a set of images and ground control points for validation of the RemoteView triangulation with the adjustable RPC model. The kit includes results calculated by Space Imaging using the RPC sensor model, and a comparison of these values with the results computed at the Ground Station using the full rigorous projection model.


The kit includes 4 Ikonos images, with RPCs, of the San Diego area. For these 4 images, a total of 33 tiepoints was generated automatically in less than 5 minutes. In addition to the GCP, 5 control points were used as checkpoints. Prior to adjustment, the RMS error at the checkpoints was 3.16 meters in latitude, 4.71 meters in longitude, and 3.88 meters in height. After adjustment, these errors were reduced to 0.6 meters in latitude and 1.9 meters in longitude, with no improvement in height. Since the checkpoints were measured monoscopically, the height is based on the underlying USGS DEM data both before and after adjustment.
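For reference, the per-component checkpoint RMS values quoted above are simply the root-mean-square of the residuals at the withheld points; a small sketch is given below, assuming the residuals have already been converted to meters.

```python
import numpy as np

def checkpoint_rms(predicted, surveyed):
    """Per-component RMS error (e.g., latitude, longitude, height, in meters)
    over a set of checkpoints; rows are points, columns are components."""
    residuals = np.asarray(predicted) - np.asarray(surveyed)
    return np.sqrt((residuals ** 2).mean(axis=0))
```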

In general, the “non-rigorous” RPC model appears to be valid for the Ikonos imagery we have examined to date.

QuickBird Stereo Pair, no ground control

A set of 4 QuickBird images of the Phoenix, Arizona area, with support data and ground control points, was provided to Sensor Systems by DigitalGlobe for purposes of validating the RemoteView triangulation with the rigorous QuickBird sensor model. The block consists of 2 images from one pass and 2 images from another pass. Within each pass, the initial relative orientation of the images is quite good, within about 10 meters. Across the passes, however, there is a large misalignment of nearly 700 meters. Figure 4 shows one of the cross-track stereo pairs before adjustment.


Figure 4. QuickBird Stereo Pair before adjustment.

Note the large displacement in the north-south direction, as seen in the major highway. Figure 5 shows the same stereo pair after block adjustment. For this adjustment, no ground control points were used; 24 tiepoints were generated automatically in approximately 1 minute. The relatively large amount of time required to generate the tiepoints reflects the poor initial relative orientation of the images.


Figure 5. QuickBird stereo pair after adjustment.

The comparison of Figures 4 and 5 validates the tiepoint and adjustment processes and the QuickBird sensor model. This type of "free adjustment" (without ground control) can be used as a way to improve stereo viewing. The addition of ground control also ties the stereo pair to the earth.

Before and after block adjustment results for the entire 4-image block are shown in Table 2. For this adjustment, 1 GCP and 32 tiepoints were used. Two GCPs were used as checkpoints to compute the RMS errors. The adjustment required 4 iterations and completed in under 15 seconds, computing 24 sensor parameters and the coordinates of all tiepoints and checkpoints.

                      Latitude       Longitude      Height
Pre-Adjustment RMS    696.1 meters   36.6 meters    9.49 meters
Post-Adjustment RMS   7.8 meters     15.0 meters    6.3 meters

Table 2. QuickBird image block adjustment analysis.


OrbView Stereo Pair

A pair of OrbView images was obtained from ORBIMAGE for purposes of validating the RemoteView block adjustment process and the implementation of the OrbView rigorous sensor model. The imagery was located in the Phoenix area. Ground control points were provided in the form of a spreadsheet with associated graphics and TIFF images. The pre-adjustment RMS (measured at 3 checkpoints) is 5.26 meters in latitude and 10.2 meters in longitude. After adjustment, the RMS is 3.7 meters in latitude and 9.4 meters in longitude.

REFERENCES

Dial, G., and J. Grodecki (2003). Block Adjustment of High Resolution Satellite Images Described by Rational Polynomials. Photogrammetric Engineering and Remote Sensing, 69:59-68.

Slama, C. (Editor) (1980). Manual of Photogrammetry, Fourth Edition. Falls Church, VA, pp. 88-101.

Toutin, Th., and P. Cheng (2002). Demystification of IKONOS. Earth Observation Magazine, 9(7):17-21.

Wolf, P., and B. Dewitt (2000). Elements of Photogrammetry. USA, pp. 366-403.