
Remote Sensing

By Mujuni Terry

Overview of topics in this course

Introduction

EM energy and RS

Sensors and platforms, characteristics

Multi-spectral Scanners

Aerial Cameras, characteristics and missions

Radiometric aspects

Geometric aspects

Image enhancement

Visual interpretation

Digital Image Classification

Specials: Radar, Lidar


Lecture 1


Introduction

In principle, there are two main categories of spatial data acquisition:

• ground-based methods, such as making field observations, taking in situ measurements and performing land surveying. Using ground-based methods, you operate in the real-world environment.


• remote sensing methods, which are based on the use of image data acquired by a sensor such as an aerial camera, scanner or radar. Taking a remote sensing approach means that information is derived from the image data, which form a representation of the real world. Note, however, that the strict division between ground-based and remote sensing methods is blurring.


Page 4: Remote Sensing

11/4/2015

•Remote sensing is the science of acquiring, processing and interpreting images that record the interaction between electromagnetic energy and matter (Sabins, 1996).

•Remote sensing is the science and art of obtaining information about an object, area, or phenomenon through the analysis of data acquired by a device that is not in contact with the object, area, or phenomenon under investigation (Lillesand et al., 2004).

•Remote sensing is the instrumentation, techniques and methods to observe the Earth's surface at a distance and to interpret the images or numerical values obtained in order to acquire meaningful information of particular objects on Earth (Buiten and Clevers, 1993).


Applications of Remote Sensing

The image data acquired by remote sensing (RS) relate to electromagnetic properties of the Earth.

Under many conditions, the data can be related to real world parameters or features.

For example, measurements in the thermal infrared wavelength can be used directly to calculate the surface temperature.
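
As an illustration, the sketch below inverts Planck's law to turn a thermal-infrared radiance into a brightness temperature. The inversion formula is standard physics rather than something stated on the slides, and the wavelength and radiance values are illustrative assumptions.

```python
import math

# Physical constants (SI units)
H = 6.626e-34   # Planck's constant (J s)
C = 2.998e8     # speed of light (m/s)
K = 1.381e-23   # Boltzmann's constant (J/K)

def brightness_temperature(radiance, wavelength):
    """Invert Planck's law: spectral radiance (W m^-2 sr^-1 m^-1)
    at a given wavelength (m) -> brightness temperature (K)."""
    return (H * C / (wavelength * K)) / math.log(1.0 + 2.0 * H * C**2 / (wavelength**5 * radiance))

# Example: a radiance of ~8.2e6 W m^-2 sr^-1 m^-1 at 11 micrometres
# corresponds to a surface of roughly 290 K (about 17 degrees Celsius).
print(round(brightness_temperature(8.2e6, 11e-6), 1))
```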


Although remote sensing data can be interpreted and processed without other information, the best results are obtained by linking remote sensing measurements to ground (or surface) measurements and observations.

Highly accurate sea surface temperature (SST) maps can be derived from satellite image data when combined with in situ temperature measurements (by buoy).


The spatial pattern of SST can best be studied using map-like products. These can be obtained by interpolation from in situ measurements by buoys.

However, it is unlikely that the number and distribution of buoys is adequate to enable detection/identification of relevant surface features.


Using data of meteorological satellites, SST can be assessed at a level of detail up to 1km.

RS derived SST data can, for example, be used to determine an optimal density and location of buoys.


SST maps are not only of interest to researchers, but also to large fishing companies that want to guide their vessels to promising fishing grounds.

A data set, thus, can be of use for more than one organization and can also prove useful for historical studies.


Installation and maintenance of buoys costs a lot of money. Meteorological satellites have already been paid for and the data can be considered 'free'. Remote sensing would thus be a cheap technique.

However, as RS can only be used to measure surface characteristics, buoys still need to be placed to determine subsurface characteristics.


In principle, remote sensing provides information about the upper few millimetres of the Earth's surface. Some techniques, specifically in the microwave domain, relate to greater depths.

The fact that measurements only refer to the surface is a limitation of remote sensing. Additional 'models' or assumptions are required to estimate subsurface characteristics.


The Pacific Ocean is known for its unfavorable weather and ocean conditions. It is quite difficult to install and maintain a network of measuring buoys in this region.

A RS approach is specifically suited for areas that are difficult to access.


A related topic is that of acquiring global or continental data sets.

RS allows data to be acquired globally using the same or a similar sensor. This enables methods for monitoring and change detection.


Since an increasing number of issues are of global concern, such as climate change, environmental degradation, natural disasters and population growth, synoptic RS has become a vital tool, and for many applications, the only one suitable.


Lecture 2


EM energy and Remote Sensing


Electromagnetic energy is considered to propagate through space in the form of sine waves. These waves are characterized by electrical (E) and magnetic (M) fields, which are perpendicular to each other. For this reason, the term electromagnetic energy is used.

The vibration of both fields is perpendicular to the direction of travel of the wave. Both fields propagate through space at the speed of light c, which is approximately 3 × 10⁸ m/s.

One characteristic of electromagnetic waves is particularly important for understanding remote sensing.

This is the wavelength: it is the distance between successive wave crests. Wavelength is measured in metres (m), nanometres (1 nm = 10⁻⁹ m) or micrometres (1 μm = 10⁻⁶ m).


Frequency

If a wave of water goes by, while you are sitting in a boat, you go up and down at a given rate. This rate is the frequency of the wave.

The same analogy holds for light waves, but instead of metres we talk about microns (or nanometres).
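
Wavelength and frequency are linked through the speed of light, c = λ × ν. A minimal sketch of that relation (the example wavelengths are illustrative assumptions, not values from the slides):

```python
C = 2.998e8  # speed of light (m/s)

def frequency(wavelength_m):
    """Frequency (Hz) of an EM wave with the given wavelength (m), using c = lambda * nu."""
    return C / wavelength_m

# Illustrative wavelengths: blue light, red light, thermal infrared, radar microwaves
for name, wl in [("blue", 0.45e-6), ("red", 0.65e-6), ("thermal IR", 11e-6), ("microwave", 0.05)]:
    print(f"{name:10s} {wl:>8.2e} m  ->  {frequency(wl):.3e} Hz")
```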


Passive sensor: depends on an external source of energy (daylight, heat). Examples: aerial camera, thermal scanners.

Active sensor: generates its own energy (illuminates the target). Examples: radar, laser scanners.


Sunlight (white) is composite light!


Most characteristics of EM energy can be described using the 'wave' model outlined above.


For some purposes, however, EM energy is more conveniently modelled by the particle theory, in which EM energy is composed of discrete units called 'photons'.

The latter approach is taken when quantifying the amount of energy measured by a multispectral sensor. The amount of energy held by a photon of a specific wavelength is then given by

Q = h × ν

where Q is the energy of a photon (J), h is Planck's constant (6.626 × 10⁻³⁴ J s), and ν is the frequency (Hz).


All matter with a temperature above absolute zero (0 K) radiates electromagnetic waves of various wavelengths. The total range of wavelengths is commonly referred to as the electromagnetic spectrum. It extends from gamma rays to radio waves.


Remote sensing operates in several regions of the electromagnetic spectrum.

The optical part of the EM spectrum refers to that part of the EM spectrum in which optical phenomena of reflection and refraction can be used to focus the radiation.


The optical range extends from X-rays (0.02 μm) through the visible part of the EM spectrum up to and including far-infrared (1000 μm).

The ultraviolet (UV) portion of the spectrum has the shortest wavelengths that are of practical use for remote sensing.


This radiation is beyond the violet portion of the visible wavelengths. Some of the Earth's surface materials, in particular rocks and minerals, emit or fluoresce visible light when illuminated with UV radiation.

The microwave range covers wavelengths from 1 mm to 1 m.


The visible region of the spectrum is commonly called 'light'. It occupies a relatively small portion of the EM spectrum.

It is important to note that this is the only portion of the spectrum that we can associate with the concept of color.


The longer wavelengths used for remote sensing are in the thermal infrared and microwave regions.

Thermal infrared gives information about surface temperature. Surface temperature can be related, for example, to the mineral composition of rocks or the condition of vegetation.

Microwaves can provide information on surface roughness and on properties of the surface such as water content.


The Sky and its Colors


Rayleigh scattering predominates where electromagnetic radiation interacts with particles that are smaller than the wavelength of the incoming light.

Examples of these particles are tiny specks of dust, and nitrogen (N2) and oxygen (O2) molecules.

In the absence of particles and scattering, the sky would appear black. During daytime, the Sun's rays travel the shortest distance through the atmosphere. In that situation, Rayleigh scattering causes a clear sky to be observed as blue, because blue is the shortest wavelength the human eye can observe.

At sunrise and sunset, however, the Sun's rays travel a longer distance through the Earth's atmosphere before they reach the surface. All the shorter wavelengths are scattered away after some distance, and only the longer wavelengths reach the Earth's surface. As a result, the sky appears orange or red.
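
The reasoning above can be made quantitative with the standard Rayleigh approximation that scattering strength varies roughly as 1/λ⁴. That proportionality is not stated on the slide, so treat the sketch below as an added textbook assumption:

```python
def rayleigh_relative(wavelength_um, reference_um=0.65):
    """Rayleigh scattering strength relative to a reference wavelength,
    using the ~1/lambda^4 proportionality (a textbook approximation)."""
    return (reference_um / wavelength_um) ** 4

# Blue (0.45 um) versus red (0.65 um): blue is scattered roughly 4-5 times more strongly.
print(round(rayleigh_relative(0.45), 2))   # ~4.35
print(round(rayleigh_relative(0.65), 2))   # 1.0 (reference)
```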


Mie scattering occurs when the wavelength of the incoming radiation is similar in size to the atmospheric particles. The most important cause of Mie scattering is aerosols: a mixture of gases, water vapour and dust.

Mie scattering influences the entire spectral region from the near-ultraviolet up to and including the near-infrared, and has a greater effect on the larger wavelengths than Rayleigh scattering.


Non-selective scattering occurs when the particle size is much larger than the radiation wavelength. Typical particles responsible for this effect are water droplets and larger dust particles.

Non-selective scattering is independent of wavelength, with all wavelengths scattered about equally. The most prominent example of non-selective scattering is the effect of clouds (clouds consist of water droplets). Since all wavelengths are scattered equally, a cloud appears white.


Radiance and Emissivity


Consider a surface composed of a certain material. The energy reaching this surface is called irradiance.

The energy reflected by the surface is called radiance. Irradiance is expressed in watts per square metre (W m⁻²), and radiance in watts per square metre per steradian (W m⁻² sr⁻¹). For each material, a specific reflectance curve can be established.


Such curves show the fraction of the incident radiation that is reflected as a function of wavelength.

From such a curve, you can find the degree of reflection for each wavelength (e.g. at 0.40 μm, 0.41 μm, 0.42 μm, ...).


Most remote sensing sensors are sensitive to broader wavelength bands, for example from 0.4 to 0.8 μm, and the curve can be used to estimate the overall reflectance in such bands.
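
A minimal sketch of that estimate: given a sampled reflectance curve, the overall band reflectance can be approximated by averaging the curve between the band limits. The curve values below are made up for illustration:

```python
import numpy as np

# A made-up reflectance curve sampled every 0.01 um between 0.4 and 0.8 um
wavelengths = np.arange(0.40, 0.81, 0.01)          # micrometres
reflectance = 0.05 + 0.4 * (wavelengths - 0.4)     # fictitious, rises from 5% to ~21%

def band_reflectance(wl, refl, band_min, band_max):
    """Mean reflectance of the curve within [band_min, band_max] (um)."""
    mask = (wl >= band_min) & (wl <= band_max)
    return refl[mask].mean()

# Overall reflectance in a broad 0.4-0.8 um band and in a narrower 'red' band
print(round(band_reflectance(wavelengths, reflectance, 0.4, 0.8), 3))
print(round(band_reflectance(wavelengths, reflectance, 0.6, 0.7), 3))
```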

Reflectance measurements can be carried out in a laboratory or in the field using a field spectrometer.


Lecture 3


Sensors and Platforms


Depending on the surface characteristics, electromagnetic energy from the Sun or an active sensor is reflected, or energy may be emitted by the Earth itself. This energy is measured and recorded by the sensors. The resulting data can be used to derive information about surface characteristics.

The measurements of electromagnetic energy are made by sensors that are attached to a static or moving platform. Different types of sensors have been developed for different applications. Aircraft and satellites are generally used to carry one or more sensors.


Active and Passive Sensors

Passive sensors depend on an external source of energy, usually the Sun, and sometimes the Earth itself.

Current operational passive sensors cover the electromagnetic spectrum in the wavelength range from less than 1 picometer (gamma rays) to larger than 1 meter (microwaves). The oldest and most common type of passive sensor is the photographic camera.


Active sensors have their own source of energy. Measurements by active sensors are more controlled because they do not depend upon varying illumination conditions.

Active sensing methods include radar (radio detection and ranging), lidar (light detection and ranging) and sonar (sound navigation and ranging), all of which may be used for altimetry as well as imaging.


Radar imaging


Airborne remote sensing

Airborne remote sensing is carried out using different types of aircraft depending on the operational requirements and budget available. The speed of the aircraft can vary between 150 km/h and 750 km/h and must be carefully chosen in relation to the mounted sensor system.

The selected altitude influences the scale and the resolution of the recorded images.


Apart from the altitude, the aircraft's orientation also affects the geometric characteristics of the remote sensing data acquired.

The orientation of the aircraft is influenced by wind conditions and can be corrected for to some extent by the pilot.


The orientation can be expressed by three different rotation angles relative to a reference path, namely roll, pitch and yaw.

A satellite position system and an Inertial Navigation System (INS) can be installed in the aircraft to measure its position and the three rotation angles at regular intervals.

Subsequently, these measurements can be used to correct the sensor data for geometric distortions resulting from altitude and orientation errors.


Today, most aircraft are equipped with standard satellite navigation technology, which yields positional accuracies better than 10 m to 20 m (horizontal) or 20 m to 30 m (horizontal plus vertical).

More precise positioning and navigation (1 m to 5 m) is possible using a technique called differential correction, which involves the use of a second satellite receiver.


This second system, called the base station, is located at a fixed and precisely known position.

Even better positional accuracies (1cm) can be achieved using more advanced equipment.


Space-borne remote sensing

Space-borne remote sensing is carried out using sensors that are mounted on satellites, space vehicles and space stations.

The monitoring capabilities of the sensor are to a large extent determined by the parameters of the satellite's orbit.


In general, an orbit is a circular path described by the satellite in its revolution about the Earth.

Different types of orbits are required to achieve continuous monitoring (meteorology), global mapping (land cover mapping), or selective imaging (urban areas).


1 - Orbital altitude, which is the distance (in km) from the satellite to the surface of the Earth. Typically, remote sensing satellites orbit either at 150 km to 1000 km (low Earth orbit, or LEO) or at 36,000 km (geostationary orbit, or GEO) from the Earth's surface.

This altitude influences to a large extent the area that can be viewed (coverage) and the details that can be observed (resolution).


2 - Orbital inclination angle, which is the angle (in degrees) between the orbital plane and the equatorial plane.

The inclination angle of the orbit, together with the field of view of the sensor, determines the latitudes up to which the Earth can be observed. If the inclination is 60°, then the satellite flies over the Earth between the latitudes 60° north and 60° south.


If the satellite is in a low-earth orbit with an inclination of 60°, then it cannot observe parts of the Earth at latitudes above 60° north and below 60° south, which means it cannot be used for observations of the polar regions of the Earth.


3 - Orbital period, which is the time (in minutes) required to complete one full cycle.

For instance, if a polar satellite orbits at 806 km mean altitude, then it has an orbital period of 101 minutes and a ground speed of about 23,700 km/h, i.e. roughly 6.5 km/s.
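
These figures can be reproduced approximately from Kepler's third law, assuming a circular orbit and standard values for the Earth's radius and gravitational parameter (assumptions not given on the slide):

```python
import math

GM = 3.986e14       # Earth's gravitational parameter (m^3/s^2)
R_EARTH = 6371e3    # mean Earth radius (m)

def orbital_period_minutes(altitude_m):
    """Period of a circular orbit at the given altitude, T = 2*pi*sqrt(a^3 / GM)."""
    a = R_EARTH + altitude_m
    return 2 * math.pi * math.sqrt(a**3 / GM) / 60.0

period = orbital_period_minutes(806e3)
ground_speed_kmh = (2 * math.pi * R_EARTH / 1000.0) / (period / 60.0)

print(round(period, 1))            # ~100.8 minutes, matching the quoted 101 minutes
print(round(ground_speed_kmh))     # ~23,800 km/h, close to the quoted 23,700 km/h
```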


4 - Repeat cycle, which is the time (in days) between two successive identical orbits. The revisit time, the time between two subsequent images of the same area, is determined by the repeat cycle together with the pointing capability of the sensor.

Pointing capability refers to the possibility of the sensor-platform to look to the side, or fore and aft. The sensors that are mounted on SPOT, IRS and Ikonos have such a capability.


The following orbit types are most common for remote sensing missions:

1 - Polar orbit. An orbit with inclination angle between 80° and 100°. Such a polar orbit enables observation of the whole globe, also near the poles. The satellite is typically placed in orbit at 600 km to 1000 km altitude.


2 - Sun-synchronous orbit. This is a near-polar orbit chosen in such a way that the satellite always passes overhead at the same local solar time.

To achieve this, the inclination angle must be carefully chosen, typically between 98° and 99°.


In addition to day-time images, a sun-synchronous orbit also allows the satellite to record night-time images (thermal or radar) during the ascending phase of the orbit at the dark side of the Earth.

Examples of polar orbiting, sun-synchronous satellites are Landsat, SPOT and IRS.


3 - Geostationary orbit. This refers to orbits in which the satellite is placed above the equator (inclination angle: 0°) at an altitude of approximately 36,000 km.

Geostationary orbits are used for meteorological and telecommunication satellites.


Image data characteristics

Remote sensing image data are more than a picture: they are measurements of EM energy. Image data are stored in a regular grid format (rows and columns).


A single image element is called a pixel. For each pixel, the measurements are stored as Digital Numbers (DN).

Typically, for each measured wavelength range, a separate dataset is stored, which is called a band or a channel, and sometimes a layer.


The quality of image data is primarily determined by the characteristics of the sensor-platform system.

The image characteristics can be categorized into:

•Spatial characteristics, which refer to the area measured.


•Spectral characteristics, which refer to the spectral wavelengths that the sensor is sensitive to.

•Radiometric characteristics, which refer to the energy levels that are measured by the sensor.

•Temporal characteristics, which refer to the time of the acquisition.


Each of these characteristics can be further specified by the extremes that are observed (coverage) and the smallest units that can be distinguished (resolution):

- Spatial coverage: which refers to the total area covered by one image. With multispectral scanners, this is proportional to the total field of view (FOV) of the instrument, which determines the swath width on the ground.


- Spatial resolution: which refers to the smallest unit-area measured. This indicates the minimum detail of objects that can be distinguished.

- Spectral coverage: which is the total wavelength range observed by the sensor.

- Spectral resolution: which is related to the width of the spectral wavelength bands that the sensor is sensitive to.


- Dynamic range: which refers to the minimum and maximum energy levels that can be measured by the sensor.

- Radiometric resolution: which refers to the smallest differences in levels of energy that can be distinguished by the sensor.


- Temporal coverage: which is the span of time over which images are recorded and stored in image archives.

- Revisit time: which is the (minimum) time between two successive image acquisitions over the same location on Earth. This is sometimes referred to as temporal resolution.


Lecture 4


Spatio-temporal characteristics

For the selection of the appropriate data type, it is necessary to fully understand the information requirements for a specific application.

Therefore, you have to analyze the spatio-temporal characteristics of the phenomena under investigation.


For example, a different type of image data is required for topographic mapping of an urban area than for studying changing global weather patterns.

In the case of urban area mapping, because cities contain much spatial detail, a spatial resolution of 10 cm may be required. Aerial photography and airborne digital scanners can fulfill this requirement.


Another consideration in your data selection process is the third dimension, i.e. the height or elevation component. Stereo images or interferometric radar data can provide height information.


The time of image acquisition should also be given thought. In aerial photography, a low Sun angle causes long shadows cast by tall buildings, which may obscure features that we are interested in.

To avoid long shadows, the aerial photographs could be taken around noon.


The type and amount of cloud cover also play a role, and under certain conditions no images can be taken at all.

Persistent cloud cover may be the reason for lack of image data in areas of the tropical belt. The use of radar satellites may be a solution to this problem.


Yet another issue is that of seasonal cycles. In countries with a temperate climate, deciduous trees bear no leaves in the winter and early spring, and therefore images taken during this time of year give the best view of the infrastructure.

However, during the winter the Sun angle may be too low to take good images, because of long shadows.


A combination of a single detector plus a rotating mirror can be arranged in such a way that the detector beam sweeps in a straight line over the Earth, across the track of the satellite, at each rotation of the mirror.

In this way, the Earth's surface is scanned systematically, line by line, as the satellite moves forward.


Because of this sweeping motion, the whiskbroom scanner is also known as the across-track scanner.

Today, many scanners are still based on this principle (platform/sensor), such as NOAA/AVHRR and Landsat/TM.


The pushbroom sensor

The pushbroom sensor is based on the use of charge-coupled devices (CCDs) for measuring the electromagnetic energy (Figure below).

A CCD-array is a line of photo-sensitive, solid state detectors. A single element can be as small as 5 μm.


The CCD-arrays used in remote sensing are more sensitive and have larger dimensions.

The first satellite sensor to use this technology was SPOT-1 HRV. High resolution sensors such as IKONOS and OrbView-3 also apply the pushbroom principle.


The pushbroom sensor records one entire line at a time. The principal advantage over the whiskbroom scanner is that each position (pixel) in the line has its own detector.

This enables a longer period of measurement over a certain area, resulting in less noise and a relatively stable geometry.


Since the CCD-array builds up images by scanning entire lines along the direction of motion of the platform, the pushbroom sensor is also referred to as along-track scanner.


In pushbroom sensors, there is no need for a scanning mirror, and therefore they have a higher reliability and a longer life expectancy than whiskbroom scanners.

Because of this, together with their excellent geometrical properties, pushbroom sensors are used extensively in satellite remote sensing.


Lecture 5


Aerial Camera

A camera used for vertical aerial photography for mapping purposes is called an aerial survey camera. In this section the standard aerial camera is introduced.

At present, there are only two major manufacturers of aerial survey cameras, namely Leica-Helava Systems (LH Systems) and Z/I Imaging.


These two companies produce the RC-30 and the RMK-TOP, respectively. There are also digital cameras, and the number of producers of digital aerial survey cameras is growing.

Just like a typical hand-held camera, the aerial survey camera contains a number of common components, as well as a number of specialized ones necessary for its specific role.

The large size of the camera results from the need to acquire images of large areas with a high spatial resolution.

This is realized by using a very large film size. Modern aerial survey cameras produce negatives measuring 23 cm × 23 cm (9 by 9 inch).


Up to 600 photographs may be recorded on a single roll of film. For the digital camera to have the same kind of quality as the aerial film camera it would have to have about 200 million pixels.

Today's digital cameras do not have so many pixels, but they do have other advantages.


Note: modern digital aerial cameras combine the advantages of both.


Lecture 6


Scale


The relationship between the photo scale factor, s, flying height, H, and lens focal length, f, is given by

s = H / f

Hence, the same scale can be achieved with different combinations of focal length and flying height.

If the focal length of a lens is decreased, while the flying height remains constant, then:


a) The image scale factor will increase and the size of the individual details in the image becomes smaller.

Using a 150 mm and 300 mm lens at H =2000 m results in a scale factor of 13,333 and 6,666, respectively.


b) The ground coverage increases. A 23 cm negative covers a length (and width) of 3066 m and 1533 m using a 150 mm and a 300 mm lens, respectively.

This has implications for the number of photos required for the mapping of a certain area, which, in turn, affects the subsequent processing (in terms of labour) of the photos.


c) The angular field of view increases and the image perspective changes. The total field of view for the 150 mm and the 300 mm lens is 74° and 41°, respectively (these figures are checked in the sketch below).

When wide-angle photography is used for mapping, the measurement of height information in a stereoscopic model is more accurate than when long focal length lenses are used.
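
A minimal sketch that reproduces the numbers quoted in (a), (b) and (c) from s = H/f, the 23 cm negative size and the simple field-of-view formula FOV = 2·atan(d/2f); the small differences (about 75°/42° instead of the quoted 74°/41°) come from this simplification:

```python
import math

NEGATIVE_SIZE_M = 0.23   # 23 cm x 23 cm aerial film negative
H = 2000.0               # flying height above terrain (m)

for focal_length_m in (0.150, 0.300):
    s = H / focal_length_m                                   # photo scale factor s = H/f
    coverage = s * NEGATIVE_SIZE_M                           # ground length (and width) covered
    fov = 2 * math.degrees(math.atan(NEGATIVE_SIZE_M / (2 * focal_length_m)))
    print(f"f = {focal_length_m * 1000:.0f} mm: scale 1:{s:,.0f}, "
          f"coverage {coverage:,.0f} m, FOV {fov:.0f} deg")

# f = 150 mm: scale 1:13,333, coverage ~3,067 m, FOV ~75 deg
# f = 300 mm: scale 1:6,667,  coverage ~1,533 m, FOV ~42 deg
```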

The combination of a low flying height with a wide-angle lens can be problematic when there are large terrain height differences or high man-made objects in the scene. Some areas may become 'hidden' behind taller objects.

This phenomenon of occlusion is called the dead ground effect. This effect can be clearly seen in the figure.


The aerial camera is fitted with a system to record various items of relevant information onto the side of the negative: mission identifier, date and time, flying height and the frame number.

A vacuum plate is used for flattening the film at the instant of exposure. So called fiducial marks are recorded in all corners and/or on the sides of the film.

The fiducial marks are required to determine the optical centre of the photo, needed to align photos for stereo viewing.


The fiducials are also used to record the precise position of the film in relation to the optical system, which is required in photogrammetric processes.

Modern cameras also have an image motion compensation facility.


Relief displacement

A characteristic of most sensor systems is the distortion of the geometric relationship between the image data and the terrain, caused by relief differences on the ground.


This effect is most apparent in aerial photographs and airborne scanner data. The effect of relief displacement is illustrated in Figure below. Consider the situation on the left, in which a true vertical aerial photograph is taken of a flat terrain.

The relief displacement, Δr, is given by Δr = (r · h) / H, where r is the radial distance (mm) from the nadir, h (m) is the height of the terrain above the reference plane, and H (m) is the flying height above the reference plane (where the nadir intersects the terrain).

(Figure: relief displacement shown in an image.)


The distances (A − B) and (a − b) are proportional to the total width of the scene and its image size on the negative, respectively.

In the left hand situation, by using the scale factor, we can compute (A − B) from a measurement of (a −b) in the negative. In the right hand situation, there is significant terrain relief difference.


The equation shows that the amount of relief displacement is zero at nadir (r = 0), greatest at the corners of the photograph, and is inversely proportional to the flying height.
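
A worked example of the displacement relation above; the building height, radial distance and flying height are illustrative assumptions:

```python
def relief_displacement_mm(r_mm, h_m, flying_height_m):
    """Relief displacement in the photo (mm): delta_r = r * h / H."""
    return r_mm * h_m / flying_height_m

# A 40 m tall object imaged 80 mm from the nadir point, flying height 2000 m:
print(relief_displacement_mm(80, 40, 2000))   # 1.6 mm displacement on the negative
# At the nadir itself (r = 0) the displacement vanishes, as stated above.
print(relief_displacement_mm(0, 40, 2000))    # 0.0
```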


While scale is a generally understood and applied term, the use of 'spatial resolution' in aerial photography is quite difficult.

Spatial resolution refers to the ability to distinguish small adjacent objects in an image.


The spatial resolution of monochrome aerial photographs ranges from 40 to 800 line pairs per mm.

The better the resolution of a recording system, the more easily the structure of objects on the ground can be viewed in the image.


The spatial resolution of an aerial photograph depends on:

• the image scale factor: spatial resolution decreases as the scale factor increases,

•the quality of the optical system: expensive high quality aerial lenses give much better performance than the inexpensive lenses on amateur cameras,

•the grain structure of the photographic film: the larger the grains, the poorer the resolution,


•the contrast of the original objects: the higher the target contrast, the better the resolution,

•atmospheric scattering effects: these lead to loss of contrast and resolution,

•image motion: the relative motion between the camera and the ground causes blurring and loss of resolution.


From the above list it can be concluded that the physical value of resolution in an aerial photograph depends on a number of factors.

The most variable factor is the atmospheric condition, which can change from mission to mission, and even during a mission.


Stereoscopic vision

The impression of depth encountered in the real world can also be realized by images of the same object that are taken from different positions.

Such a pair of images (photographs or digital images) is separated and observed at the same time by both eyes.


These place images on the retinas of the viewer's eyes in which objects at different positions in space are projected onto relatively different positions.

We call this stereoscopic vision. Pairs of images that can be viewed stereoscopically are called stereograms.


Stereoscopic vision is explained here because the impression of height and height differences is important in the interpretation of both natural and man-made features from image data.


Under normal conditions, the human eye can focus on objects between 150mm distance and infinity. In doing so we direct both eyes to the object (point) of interest.

This is known as convergence, and is how humans normally see in three dimensions.


Human vision


We can see three-dimensionally partly because we have two eyes, which enable us to see a scene simultaneously from two viewpoints. The brain fuses the stereoscopic views into a three-dimensional impression (together with sensing what appears taller and what smaller, what is partially obscured, etc.).

Viewing a 3D object


(Figures: the left eye's view and the right eye's view of the object.)


The stereoscopic views give a 3D impression.

Stereoscopy: seeing of objects in three dimensions


To view the stereoscopic model formed by a pair of overlapping photographs, the two images have to be separated so that the left and right eyes see only the left and right photographs, respectively.

In addition one should not focus on the photo itself but at infinity.


Some experienced persons can experience 'stereo' by putting the two photos at a suitable distance from their eyes.

Most of us need some help and different methods have been developed.

Pocket and mirror stereoscopes, and also the photogrammetric plotters, use a system of lenses and mirrors to 'feed' one image into one eye.


Pocket and mirror stereoscopes (Figure below) are mainly applied in mapping applications related to vegetation, forest, soil and geomorphology.

(Figure: mirror stereoscope and pocket stereoscope.)

Photogrammetric plotters are used in topographic and large scale mapping activities.

Another way of achieving stereovision is to project the two images in two colors.


Most often, red and cyan colors are applied; the corresponding spectacles comprise one red and one cyan glass. This method is known as the anaglyph system and is particularly suited to viewing overlapping images on a computer screen.

An approach used in digital photogrammetric systems is to apply different polarizations for the left and right images.

Polarized spectacles make the left image visible to the left eye and the right image to the right eye.


Is indirect stereoscopic viewing possible?

Stereography was first described in 1832 by the English physicist Charles Wheatstone.

We can also get a 3D impression of a scene by first taking two photos and then viewing the stereograph.


A stereograph fuses the left and right image of a stereo pair.

We then need a device which helps us to see the left image with the left eye and the right one with the right eye. A pair of anaglyph glasses is one of the most widely used such devices.


Stereoscopic viewing

The differences between the images (parallax) create the impression of differences in distance.


Anaglyph viewing

The left and right images are printed in red and cyan, respectively.
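
A minimal sketch of how such an anaglyph can be composed digitally: the left image feeds the red channel and the right image the green and blue (cyan) channels. The array sizes and values are assumptions for illustration:

```python
import numpy as np

def make_anaglyph(left_gray, right_gray):
    """Combine two greyscale images (2-D uint8 arrays of equal size)
    into a red/cyan anaglyph: left -> red channel, right -> green + blue."""
    anaglyph = np.zeros(left_gray.shape + (3,), dtype=np.uint8)
    anaglyph[..., 0] = left_gray    # red channel: left image
    anaglyph[..., 1] = right_gray   # green channel: right image
    anaglyph[..., 2] = right_gray   # blue channel: right image
    return anaglyph

# Tiny synthetic example (real use would load two overlapping aerial photos)
left = np.full((4, 4), 120, dtype=np.uint8)
right = np.full((4, 4), 90, dtype=np.uint8)
print(make_anaglyph(left, right).shape)   # (4, 4, 3)
```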


Central projection: terrain to image

Take a terrain point, connect it with the projection centre, extend this line, and intersect it with the image plane: the image point is found.


Central projection: image to terrain

Take an image point, connect it with the projection centre, and extend this line. But to where? From a single image the terrain point cannot be found.


Stereo Pair

Take the image of the same terrain point in the left and in the right image, connect each one with its projection centre, extend these lines and intersect them: the terrain point is found.


Aerial Photograph Missions

Mission Planning

When a mapping project requires aerial photographs, one of the first tasks is to select the required photo scale factor, the type of lens to be used, the type of film to be used and the required percentage of overlap for stereo viewing.


Forward overlap usually is around 60%, while sideways overlap typically is around 20%.

Figure below shows a survey area that is covered by a number of flight lines.

Furthermore, the date and time of acquisition should be considered with respect to growing season, light conditions and shadowing effects.


If the required scale is defined, the following parameters can be determined:

•the flying height required above the terrain,

•the ground coverage of a single photograph,

•the number of photos required along a flight line,

•the number of flight lines required.

After completion of the necessary calculations, mission maps are prepared for use by the survey navigator in flight.
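
A minimal sketch of those calculations under simple assumptions (a rectangular block, 23 cm negatives, and the overlap percentages quoted above); the function name and example figures are illustrative, not part of the course material:

```python
import math

NEGATIVE_SIZE_M = 0.23   # 23 cm film format

def mission_parameters(scale_factor, focal_length_m, block_length_m, block_width_m,
                       forward_overlap=0.60, side_overlap=0.20):
    """Rough flight-planning numbers for a rectangular survey block."""
    flying_height = scale_factor * focal_length_m            # H = s * f
    coverage = scale_factor * NEGATIVE_SIZE_M                 # ground size of one photo
    photo_base = coverage * (1 - forward_overlap)             # distance between exposures
    line_spacing = coverage * (1 - side_overlap)               # distance between flight lines
    photos_per_line = math.ceil(block_length_m / photo_base) + 1
    flight_lines = math.ceil(block_width_m / line_spacing) + 1
    return flying_height, coverage, photos_per_line, flight_lines

# Example: scale factor 10,000 with a 150 mm lens over a 10 km x 6 km block
print(mission_parameters(10_000, 0.150, 10_000, 6_000))
# -> roughly (1500 m flying height, 2300 m coverage, 12 photos per line, 5 flight lines)
```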


Mission execution

The advent of Satellite Positioning Systems, especially GPS, has caused a tremendous change to survey flight missions.

Today, after a number of relevant mission parameters and the area of interest have been entered, a computer program determines the (3D) coordinates of all positions from which photographs are to be taken, and these are stored in a database.


On board, the crew can obtain all relevant information from that database, such as the project area, the camera and type of film to be used, the number of images, and constraints regarding time of day or sun angle, season, and atmospheric conditions.


The list of camera positions is also loaded into a guidance system.

The pilot is then guided along the flight lines such that the deviation from the ideal line and the time to the next exposure station is shown on a display (together with other relevant parameters).


If the airplane passes 'close enough' to the predetermined exposure station, the camera is fired automatically at the nearest position.

This allows having the data of several projects on board and making the choice of project (or project part) according to the local weather conditions.


If necessary, one can also abandon a project and resume it later.

The viewing system is still used to align the camera and for visual checks, such as of the downward visibility.


Radiometric aspects of RS data


Periodic line dropouts occur due to recording problems when one of the detectors of the sensor in question either gives wrong data or stops functioning.

The Landsat ETM, for example, has 16 detectors in all its bands, except the thermal band.


A loss of one of the detectors would result in every sixteenth scan line being a string of zeros that would plot as a black line on the image (see Figure).


The first step in the restoration process is to calculate the average DN-value per scan line for the entire scene.

The average DN-value for each scan line is then compared with this scene average.

Any scan line deviating from the average by more than a designated threshold value is identified as defective.


In regions of very diverse land cover, better results can be achieved by considering the histogram for sub-scenes and processing these sub-scenes separately.


The next step is to replace the defective lines.

For each pixel in a defective line, an average DN is calculated using DNs for the corresponding pixel in the preceding and succeeding scan lines.

The average DN is then substituted for the defective pixel.
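
A minimal sketch of the restoration just described: flag scan lines whose mean DN deviates too far from the scene average, then replace each pixel in a flagged line by the average of the pixels directly above and below it. The threshold and test image are illustrative assumptions:

```python
import numpy as np

def repair_line_dropouts(image, threshold=50):
    """Replace defective scan lines (rows) whose mean DN deviates from the
    scene mean by more than `threshold` with the average of the
    preceding and succeeding lines."""
    repaired = image.astype(float).copy()
    scene_mean = repaired.mean()
    line_means = repaired.mean(axis=1)
    for row in range(1, repaired.shape[0] - 1):
        if abs(line_means[row] - scene_mean) > threshold:
            repaired[row] = (repaired[row - 1] + repaired[row + 1]) / 2.0
    return repaired.astype(image.dtype)

# Tiny example: a 5-line image in which line 2 has dropped out (all zeros)
img = np.full((5, 6), 100, dtype=np.uint8)
img[2] = 0
print(repair_line_dropouts(img)[2])   # restored to 100s
```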


Line striping

Line striping is far more common than line dropouts. Line striping often occurs due to non-identical detector response.

Although the detectors for all satellite sensors are carefully calibrated and matched before the launch of the satellite, with time the response of some detectors may drift to higher or lower levels.


As a result, every scan line recorded by that detector is brighter or darker than the other lines (see Figure).

It is important to understand that valid data are present in the defective lines, but these must be corrected to match the overall scene.


Though several procedures can be adopted to correct this effect, the most popular is histogram matching.

Separate histograms corresponding to each detector unit are constructed and matched.


Taking one response as standard, the gain (rate of increase of DN) and offset (relative shift of mean) for all other detector units are suitably adjusted, and new DN-values are computed and assigned.

This yields a destriped image in which all DN-values conform to the reference level and scale.
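
A compact sketch of the per-detector gain and offset adjustment, here simplified to matching each detector's mean and standard deviation to a reference detector rather than full histogram matching; the 16-detector layout follows the whiskbroom example above and the data are synthetic:

```python
import numpy as np

def destripe(image, detectors=16, reference=0):
    """Adjust the gain and offset of each detector's scan lines so that
    their mean and standard deviation match the reference detector's lines
    (simplified histogram matching by moment matching)."""
    out = image.astype(float).copy()
    ref_lines = out[reference::detectors]
    ref_mean, ref_std = ref_lines.mean(), ref_lines.std()
    for d in range(detectors):
        lines = out[d::detectors]
        gain = ref_std / lines.std() if lines.std() > 0 else 1.0
        out[d::detectors] = (lines - lines.mean()) * gain + ref_mean
    return out

# Synthetic striped image: detector 3 reads systematically 20 DN too bright
rng = np.random.default_rng(0)
img = rng.normal(100, 5, size=(64, 32))
img[3::16] += 20
print(round(destripe(img)[3::16].mean(), 1))   # back to ~100
```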


Random noise or spike noise

The periodic line dropouts and striping are forms of non-random noise that may be recognized and restored by simple means.

Random noise, on the other hand, requires a more sophisticated restoration method such as digital filtering.


Random noise or spike noise may be due to errors during transmission of data or to a temporary disturbance. Here, individual pixels acquire DN-values that are much higher or lower than those of the surrounding pixels (see Figure).


In the image, these pixels produce bright and dark spots that interfere with information extraction procedures.

A spike noise can be detected by mutually comparing neighboring pixel values.


If a pixel's value differs from those of its neighbours by more than a specific threshold margin, it is designated as spike noise and its DN is replaced by an interpolated DN-value (based on the values of the surrounding pixels).
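
A minimal sketch of that spike-noise test: compare each pixel with the median of its 3 × 3 neighbourhood and, where the difference exceeds a threshold, replace the value by that median (one possible choice of interpolated value). The threshold and data are illustrative assumptions:

```python
import numpy as np

def remove_spike_noise(image, threshold=40):
    """Replace pixels that differ from the median of their 3x3 neighbourhood
    by more than `threshold` with that neighbourhood median."""
    img = image.astype(float)
    padded = np.pad(img, 1, mode="edge")
    out = img.copy()
    rows, cols = img.shape
    for i in range(rows):
        for j in range(cols):
            window = padded[i:i + 3, j:j + 3]
            neighbours = np.delete(window.flatten(), 4)   # drop the centre pixel
            med = np.median(neighbours)
            if abs(img[i, j] - med) > threshold:
                out[i, j] = med
    return out

# Tiny example: one bright spike in an otherwise uniform image
img = np.full((5, 5), 80.0)
img[2, 2] = 255.0
print(remove_spike_noise(img)[2, 2])   # 80.0
```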


Introduction

We are able to use remotely sensed data to:

1. Derive 2D and 3D information.

One of the most important applications of remote sensing is to use photographs or images as an intermediate product from which maps are made.

2D geometric descriptions of objects (points, lines, areas) can be derived relatively easily from a single image or photo through the process of georeferencing.


2. Visualize the image.

Sometimes it can be helpful to combine image data with vector data, for example to study cadastral (land) boundaries in the context of land cover.

In such cases, the raster image data form a backdrop for the vector data. To enable such integration, the image data must be georeferenced to the coordinate system of the vector data.


3. Merge different types of image data.

Instead of using them to make maps, image data may be incorporated directly as raster layers in a GIS.

For example, in a monitoring project, it might be necessary to compare land cover maps and classified multispectral Landsat and SPOT data of various dates.


All these different data sets need to be georeferenced to the same coordinate system before they can be processed simultaneously.

That means that the pixels of all raster data sets have to be geocoded. This is a rather complex process in which the various sets of georeferenced pixels are resampled, i.e. rearranged to a common system of columns and rows.


Georeferencing

The simplest way to link an image to a map projection system is to use a geometric transformation. A transformation is a function that relates the coordinates of two systems.

A transformation relating (x, y) to (i, j) is typically defined by linear equations, such as: x = 3 + 5i and y = −2 + 2.5j.


Using the above transformation, for example, image position (i = 3, j = 4) relates to map coordinates (x = 18, y = 8).

Once such a transformation has been determined, the map coordinates for each image pixel can be calculated.
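
A small sketch that applies exactly this example transformation to pixel positions (the function name is an assumption):

```python
def image_to_map(i, j):
    """Example georeferencing transformation from the text:
    x = 3 + 5i, y = -2 + 2.5j."""
    x = 3 + 5 * i
    y = -2 + 2.5 * j
    return x, y

print(image_to_map(3, 4))   # (18, 8.0), as in the worked example
print(image_to_map(0, 0))   # (3, -2.0): map coordinates of pixel (0, 0)
```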


Images for which such a transformation has been carried out are said to be georeferenced.

It allows the superimposition of vector data and the storage of the data in map coordinates when applying on-screen digitizing.


Note that in the case of georeferencing the image remains stored in the original (i, j) raster structure, and that its geometry is not altered.

Transformations can also be used to change the actual shape of imagery, and thus make it geometrically equivalent to a 'true' map.


The process of georeferencing involves two steps:

• selection of the appropriate type of transformation, and

• determination of the transformation parameters.


The type of transformation depends mainly on the sensor-platform system used.

For aerial photographs (of a flat terrain), usually the so-called perspective transformation is used to correct for the effect of pitch and roll.


A more general type is the polynomial transformation, which enables 1st, 2nd up to nth order transformations.

In many situations a 1st order transformation is adequate. A 1st order transformation relates map coordinates (x, y) to image coordinates (i, j) as follows:

x = a + bi + cj
y = d + ei + fj


The transformation parameters can be determined by means of ground control points (GCPs). GCPs are points that can be clearly identified in the image and in a source that is in the required map projection system.

One possibility is to use topographical maps of an adequate scale.
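
A minimal sketch of how the six parameters of a 1st order transformation could be estimated from GCPs by least squares; the GCP coordinates below are fictitious and chosen to be consistent with the earlier example transformation:

```python
import numpy as np

def fit_first_order(gcps_ij, gcps_xy):
    """Least-squares estimate of (a, b, c) and (d, e, f) in
    x = a + b*i + c*j,  y = d + e*i + f*j, from matching GCP lists."""
    ij = np.asarray(gcps_ij, dtype=float)
    xy = np.asarray(gcps_xy, dtype=float)
    design = np.column_stack([np.ones(len(ij)), ij[:, 0], ij[:, 1]])
    abc, *_ = np.linalg.lstsq(design, xy[:, 0], rcond=None)
    def_, *_ = np.linalg.lstsq(design, xy[:, 1], rcond=None)
    return abc, def_

# Fictitious GCPs consistent with x = 3 + 5i, y = -2 + 2.5j
image_points = [(0, 0), (10, 0), (0, 10), (10, 10)]
map_points = [(3, -2), (53, -2), (3, 23), (53, 23)]
print(fit_first_order(image_points, map_points))
# -> (array([3., 5., 0.]), array([-2. ,  0. ,  2.5]))
```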
