PRINCIPLES OF REMOTE SENSING
Introduction
Definition
Remote sensing means acquiring information about a phenomenon, object or surface while
at a distance from it. The name is now commonly attributed to the recent technology in which
satellites and spacecraft are used for collecting information about the earth's surface, an outcome of developments in various technological fields from 1960 onward.
Stages in Remote Sensing
♦ Emission of electromagnetic radiation, or EMR (sun/self-emission)
♦ Transmission of energy from the source to the surface of the earth, with absorption
and scattering along the way
♦ Interaction of EMR with the earth's surface: reflection and emission
♦ Transmission of energy from the surface to the remote sensor
♦ Sensor data output
♦ Data transmission, processing and analysis
What We See
At temperatures above absolute zero, all objects radiate electromagnetic energy by virtue of their atomic and molecular oscillations. The total amount of emitted radiation increases
with the body's absolute temperature, and the emission peaks at progressively shorter wavelengths as temperature rises.
The sun, a major source of energy, radiation and illumination with a sharp power
peak around 0.5 µm, allows us to capture reflected light with conventional (and some not-so-
conventional) cameras and films.
The basic strategy for sensing electromagnetic radiation is clear. Everything in nature has its own unique distribution of reflected, emitted and absorbed radiation. These spectral
characteristics, if ingeniously exploited, can be used to distinguish one thing from another
or to obtain information about shape, size and other physical and chemical properties. In
so far as we know the spectral characteristics, we can pick an appropriate detector to
make the desired measurement, remembering that for a given collector's diameter we get
our greatest spatial resolution where wavelengths are shortest and energies greatest, and
that these energies decrease at longer wavelengths and distances.
Modern Remote Sensing Technology versus Conventional Aerial Photography
The use of different and extended portions of the electromagnetic spectrum,
developments in sensor technology, different platforms for remote sensing (spacecraft in
addition to aircraft), emphasis on the use of spectral rather than purely spatial
information, advances in image processing and enhancement techniques, and automated
image analysis in addition to manual interpretation are the main points of comparison
between conventional aerial photography and modern remote sensing systems.
During the first half of the twentieth century, aerial photos were used in military surveys and topographical mapping. The main advantage of aerial photos has been their high spatial
resolution with fine details, and they are therefore still used for mapping at large scales,
such as in route surveys, town planning, construction project surveying, cadastral mapping
etc. Modern remote sensing systems provide satellite images suitable for medium-scale
mapping used in natural resource surveys and monitoring, such as forestry, geology,
watershed management etc. However, future generations of satellites are going to provide
much higher resolution images for more versatile applications.
Remote Sensing in India
The Indian remote sensing programme started in the late seventies with the Indian-built
Bhaskara-I and Bhaskara-II satellites, launched from the Soviet Union in June 1979 and
November 1981 respectively. Indigenous launch capability has also been developed to
place 1,000 kg remote sensing satellites into polar orbit. The Department of Space
(DOS)/Indian Space Research Organisation (ISRO) has been actively engaged in the development of state-of-the-art remote sensing capabilities along with other related activities.
ISRO Satellite Centre (ISAC), Bangalore is responsible for design, development and
management of remote sensing satellites. The Space Applications Centre (SAC), Ahmedabad is
engaged in the development of sensors (cameras & scanners) and data processing software,
analysis of remote sensing data, and related research. The National Remote Sensing Agency
(NRSA), Hyderabad is responsible for acquisition, processing and dissemination of remote
sensing data, analysis of data for different applications and training of users. The National
Natural Resources Management System (NNRMS) is responsible for the execution of
application projects through the establishment of a number of Regional Remote Sensing
Service Centres (RRSSC) throughout the country. Many user agencies, government
departments, state governments and academic institutes have also established remote
sensing infrastructure for various applications.
HISTORY OF DEVELOPMENT OF REMOTE SENSING
Remote sensing evolved from man's ability to observe regions of the
electromagnetic spectrum beyond the range of human vision. Its development
involved the development of conventional photography, multispectral and infrared
photography, non-photographic sensors and scanners, platforms for remote sensing such as
aircraft and satellites, communication and data transmission, data processing and computer
technology.
The first photograph, a permanent impression of an image created by light, was
reportedly made by Daguerre and Niépce (France) in the year 1839. In the year 1858 the
French photographer Gaspard-Félix Tournachon, along with Laussedat, successfully
took aerial photographs from a balloon over the city of Paris with a camera using the wet
plate process. In 1871, dry photographic film with a gelatine emulsion of silver halide grains became available. In 1903, pigeons with cameras mounted on them were also used for taking aerial
photographs. Many experiments were also conducted on using balloons, kites and rockets
for reconnaissance purposes during World War I. This was the time when the importance of
aerial reconnaissance was recognised for military purposes. The first recorded
photographs from an aeroplane were taken by Wilbur Wright on April 24, 1909 over
Centocelle, Italy.
During the period between World Wars I and II, many developments and improvements took place in the technology of photography, film, cameras, processing
equipment, interpretation of photographs and related techniques. Photogrammetry and
photointerpretation were recognised as specialisations. The American Society of
Photogrammetry published its official journal, named "Photogrammetric Engineering", for the first
time in the year 1934. Many non-military applications also started during this period in
America and the western world, such as in geology, agriculture, forestry and land use. Many new
photogrammetric instruments were also developed, such as the Zeiss Stereoplanigraph (1923), Zeiss Multiplex (1920), Bausch & Lomb Multiplex (1930) and Kelsh Plotter (1940). The
European manufacturers such as Wild (Switzerland), Santoni and Nistri (Italy),
Poivilliers (France) and Hilger-Watts (U.K.) also brought out similar instruments.
In the year 1918, the "Air Shipping and Transport Company" was established in India
by the British Government for carrying out jobs such as aerial photography for
reconnaissance and mapping. The company was renamed the "Air Survey Company of
India" in the year 1954, with headquarters at Dumdum, Calcutta. The same year, the Survey
of India received a number of photogrammetric instruments under a United Nations aid
programme, and personnel were trained to do photogrammetric mapping.
During World War II, aerial photographs were extensively used for topographic
mapping and military intelligence. The German, British, French and later American and Japanese forces established aerial reconnaissance units for extensive surveillance of
enemy areas. Thousands of photographs were processed almost overnight during
operations. Aerial photography to observe the movement of enemy forces was done repeatedly
at intervals of a few days, and in some cases flights were flown in both morning and
evening. Large numbers of photointerpreters were trained by the armies. During this period,
aerial photographs were also used for non-military applications such as geology,
forestry, soil survey, route alignment etc.
After the war, many of the technologies developed during wartime were
transferred to civil areas of application. However, new developments still continued. The most
important of these have been the launching of spacecraft; the development of non-photographic
sensors operating in the ultraviolet, infrared, thermal and microwave regions of the electromagnetic spectrum; satellite communication; and data processing on computers. In the
year 1957 the first artificial satellite, "Sputnik I", was launched, and in 1961 manned as well as
unmanned spacecraft were used to take orbital photographs. The Tiros, Mercury, Gemini and
Apollo series of satellites are examples of such experiments.
The name "remote sensing" was first used in the early 1960s, when it was felt
necessary to find a suitable name for the new type of "images" obtained from a completely
different type of "camera" (the scanner), formed using energy outside the visible spectrum, since the term "photograph" was no longer suitable to describe the new kind of images
(also called imageries). The launch of LANDSAT 1 by NASA in 1972 was another
landmark in the history of remote sensing; it was the result of efforts started in the
1960s and the beginning of modern remote sensing. LANDSAT 1 ("LAND SATELLITE"),
initially called ERTS 1 (Earth Resources Technology Satellite), was the first of several
earth-orbiting satellites designed specifically for observation of the earth's land areas.
LANDSAT provided for the first time systematic, repetitive observation of the earth's land areas.
Some Important Events in the History of Remote Sensing
1800 Infrared portion of electromagnetic spectrum discovered.
1801 Ultraviolet part of spectrum discovered.
1839 Invention of photography by Daguerre.
1847 Characteristics of infrared radiation discovered
1858 Photography from balloons by Tournachon & Laussedat using wet
plate camera.
1871 Dry Film Photography invented.
1873 Maxwell developed the theory of electromagnetism.
1895 Color Photography done by color separation method.
1898 Mathematical analysis of photograph for converting perspective view
to orthographic projection for mapping done by Laussedat.
1909 Photography by aeroplanes was started.
1914-18 Aerial reconnaissance started during World War I.
1920-30 Aerial photography and photogrammetry developed for military
reconnaissance and mapping. Optical and optical-mechanical
photogrammetric instruments were developed.
1931 Black & White Infrared sensitive film available.
1935 First color film "Kodachrome" was available.
1930-40 Development of Radar for air traffic control
1942 Kodacolor Aero Reversal Film available.
1942 First Color Infrared film was available.
1939-45 Images in nonvisible portions of the electromagnetic spectrum (thermal and
microwave) used during World War II. Specialised training in acquisition and
interpretation of airphotos was started.
1940-50 Military research and development activities took place. Mechanical
Projection type Photogrammetric instruments developed
1950-60 Development of SLAR & SAR, Imaging Radar System
1956 Color Infrared film was first used in agriculture.
1960-70 The word "Remote Sensing" was used first time.
1960 Launch of TIROS-1 Weather Satellite
1972 Launch of LANDSAT 1
1973 Launch of SKYLAB, the first American space station
1970-80 Rapid advances in digital image processing due to the availability of
powerful microprocessor-based computers
1975 Manual of Remote Sensing, (First Edition) published by American Society
of Photogrammetry.
1978 Launch of SEASAT-1 as Oceanographic research satellite
1980 Launch of USSR METEOR Satellite
1980-90 Launch of LANDSAT 4 , second generation of LANDSAT series with new
sensor
1986 Launch of French Earth Observation Satellite (SPOT)
Some Important Events in Indian Remote Sensing Programme
1972 Establishment of Department of Space.
1979 June 7, Launch of Earth Observation Satellite BHASKARA by ISRO.
1988 March 17, Launch of Indian Remote Sensing Satellite (IRS-1A) by
Russian rocket "Vostok".
1994 Oct.15, Launch of IRS-P2 by indigenously developed Polar Satellite
Launch Vehicle (PSLV-D2).
1995 Dec.28, Launch of new generation IRS-1C Satellite.
1997 Sept 29 Launch of IRS-1D Satellite
1999 May 26 Launch of IRS-P4 - OCEANSAT
Electromagnetic Radiation and the Electromagnetic Spectrum
EMR is a dynamic form of energy that propagates as wave motion at a velocity of
c = 3 × 10^10 cm/sec. The parameters that characterize a wave motion are wavelength (λ),
frequency (ν) and velocity (c). The relationship between them is
c = νλ.
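The relation c = νλ can be checked numerically. Below is a small sketch (the function name and example wavelength are ours, purely illustrative) that converts a wavelength to its frequency in SI units:

```python
# Wave relation c = nu * lambda: compute frequency from wavelength (SI units).
C = 3.0e8  # speed of light in m/s (equivalent to 3 x 10^10 cm/s)

def frequency_hz(wavelength_m: float) -> float:
    """Return the frequency nu (Hz) for a given wavelength (m)."""
    return C / wavelength_m

# Green light at 0.5 micrometre:
nu = frequency_hz(0.5e-6)
print(f"{nu:.1e} Hz")  # 6.0e+14 Hz
```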
Electromagnetic energy radiates in accordance with the basic wave theory. This theory
describes the EM energy as travelling in a harmonic sinusoidal fashion at the velocity of
light. Although many characteristics of EM energy are easily described by wave theory,
another theory, known as particle theory, offers insight into how electromagnetic energy interacts with matter. It suggests that EMR is composed of many discrete units called
photons or quanta. The energy of a quantum is
Q = hν = hc/λ
where Q is the energy of a quantum (J), h is Planck's constant (6.626 × 10^-34 J s), ν is the frequency and λ the wavelength.
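Since a photon's energy follows Q = hν = hc/λ, the energy of a single quantum can be sketched as below (constants are standard tabulated values; the function name is ours):

```python
# Photon energy Q = h*nu = h*c/lambda (Planck's relation), SI units.
H = 6.626e-34   # Planck's constant, J s
C = 3.0e8       # speed of light, m/s

def photon_energy_joules(wavelength_m: float) -> float:
    """Energy of one quantum at the given wavelength."""
    return H * C / wavelength_m

q = photon_energy_joules(0.5e-6)   # green light, 0.5 micrometre
print(f"{q:.2e} J")                # 3.98e-19 J
```

Note that the energy is inversely proportional to wavelength: longer-wavelength (e.g. microwave) quanta carry less energy, which is one reason shorter wavelengths are easier to detect.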
Table: Principal Divisions of the Electromagnetic Spectrum
Fig.1
Division            Wavelength
Gamma rays          10^-11 cm < λ < 10^-8 cm
X-rays              10^-8 cm < λ < 10^-6 cm
Ultraviolet light   10^-6 cm < λ < 4 × 10^-5 cm
Visible light       4 × 10^-5 cm < λ < 7.6 × 10^-5 cm
Infrared light      8 × 10^-5 cm < λ < 10^-1 cm
Microwaves          10^-1 cm < λ < 10^2 cm
Radiowaves          λ > 10^2 cm
Electromagnetic Radiation Quantities: Nomenclature, Definitions and Units
Radiant Energy, Q: Radiant energy is the energy carried by EMR. Radiant energy causes
the detector element of the sensor to respond to EMR in some appropriate manner. The unit
of radiant energy Q is the joule (J).
Radiant Flux, φ (phi): the time rate of flow of radiant energy. The unit of radiant flux is the joule per second, i.e. the watt (W).
Irradiance, E: Radiant flux intercepted by a plane surface per unit area of the surface.
The direction of the flux is not specified; it arrives at the surface from all directions
within a hemisphere over the surface. The unit of irradiance E is W/m^2
or W m^-2 (watt per square metre).
Exitance (Emittance), M: Radiant flux leaving a surface per unit area of the surface. The
flux may leave the surface in any or all directions within a hemisphere over the surface. Its unit is W m^-2.
Solid Angle, ω (omega): the cone angle subtended by a portion of a spherical surface at the centre of the sphere. It is equal to the area of the spherical surface divided
by the square of the radius of the sphere. Its unit is the steradian (sr).
Fig 2. Concept of Irradiance
Fig 3. Concept of Solid angle in angular measurement
Radiant Intensity, I: the radiant intensity of a point source in a given direction is the radiant flux per unit solid
angle leaving the source in that direction. The unit of radiant intensity is W sr^-1 (watt per steradian).
Radiance, L: defined as the radiant flux per unit solid angle leaving an extended source
in a given direction, per unit projected area of the source in that direction. The concept of
radiance is intended to correspond to the concept of brightness. The projected area in a
direction which makes an angle θ (theta) with the normal to a surface of area A is A cos θ.
The relationship between radiant intensity and radiance is
I = L · A · cos θ
The unit of radiance is W m^-2 sr^-1.
Lambertian Surface: a plane source or surface whose radiance L does not
change as a function of viewing angle is called Lambertian (perfectly diffuse).
For a Lambertian surface:
a) The irradiance in the retinal image does not change with viewing angle for a
Lambertian panel.
Fig 4. Concept of Radiant Intensity
Fig 5. Concept of Radiance
b) For a Lambertian surface, the exitance and radiance are related by
Exitance M = π * radiance L ( where π = 3.1415927)
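The M = π L relation for a Lambertian surface can be sketched as follows (the function name and the radiance value are illustrative, not from the text):

```python
import math

# For a Lambertian (perfectly diffuse) surface, exitance M = pi * radiance L.
def exitance_from_radiance(L_w_m2_sr: float) -> float:
    """Exitance (W/m^2) of a Lambertian surface with radiance L (W m^-2 sr^-1)."""
    return math.pi * L_w_m2_sr

M = exitance_from_radiance(100.0)  # a surface with radiance 100 W m^-2 sr^-1
print(f"{M:.2f} W/m^2")            # 314.16 W/m^2
```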
Spectral Quantities: The electromagnetic spectrum consists of a continuum of
frequencies or wavelengths, each of which carries a share of the total radiant flux. The
manner in which the flux is distributed among the components of different wavelengths is
called the spectral distribution.
The spectral radiant flux, denoted by φλ (phi lambda), indicates the radiant flux per unit
wavelength interval at wavelength λ (lambda). Its unit is W/micron. The other spectral
quantities, namely the spectral irradiance, spectral exitance and spectral radiant intensity,
are defined in a similar way.
Hemispherical reflectance, transmittance and absorptance:
When radiant energy is incident on a surface, it is either returned by reflection,
transmitted through the medium, or absorbed by the surface and transformed into
some other form of energy inside.
The hemispherical reflectance ρ (rho) is defined as the ratio of the reflected exitance of the plane to the irradiance on the plane:
ρ = M reflected / irradiance E
The hemispherical transmittance τ (tau) is defined as the ratio of the transmitted exitance,
leaving the opposite side of the plane, to the irradiance on that plane:
τ = M transmitted / irradiance E
The hemispherical absorptance α (alpha), which denotes the fraction of incident energy
absorbed by the surface, is given by
α = 1 - τ - ρ
The corresponding spectral hemispherical quantities at discrete wavelengths of EMR are defined in a similar way. For a perfect blackbody, reflectance and transmittance are
zero (0) and absorptance is unity (1), whereas for a perfectly white body, absorptance and
transmittance are zero (0) and reflectance is unity (1). Real bodies, usually known as
gray bodies, have these coefficients between 0 and 1.
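The energy balance ρ + τ + α = 1 can be expressed as a one-line sketch covering the blackbody, white-body and gray-body cases (illustrative function name and values):

```python
# Hemispherical energy balance: alpha = 1 - tau - rho.
def absorptance(rho: float, tau: float) -> float:
    """Fraction of incident energy absorbed, given reflectance rho
    and transmittance tau."""
    return 1.0 - tau - rho

print(absorptance(0.0, 0.0))   # perfect blackbody -> 1.0
print(absorptance(1.0, 0.0))   # perfectly white body -> 0.0
print(absorptance(0.3, 0.1))   # an example gray body -> 0.6
```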
Thermal Emission of Radiation
All objects at all temperatures emit electromagnetic radiation at all wavelengths. The
thermal emission of radiation is due to the conversion of heat energy, which is the kinetic
energy of the random motion of the particles of the matter, into electromagnetic energy.
Thermal emission of radiation depends upon two parameters, as follows.
Absolute Temperature (T)
Emissivity (ε)
The total thermal radiation from a body increases with the fourth power of T. The absolute
temperature in kelvin (K) is given by T = 273 + temperature in degrees centigrade.
The emissivity factor (ε) is a characteristic of the material, a measure of its capability
to emit radiation due to thermal energy conversion. The emissivity of a substance is
related to its absorptive ability: good absorbers are good radiators, whereas poor
absorbers are poor radiators. For an ideal thermal emitter, called a blackbody, ε = 1.
Radiation Principles
Planck's Law
The spectral exitance Mλ of a blackbody is given by Planck's law:
Mλ = C1 λ^-5 [exp(C2/λT) - 1]^-1
where C1 and C2 are constants, λ is the wavelength and T is the absolute temperature.
The spectral exitance of a blackbody is not the same at all wavelengths. The spectral
exitance is low for very short and very long wavelengths. It reaches the maximum value
for some wavelengths in between, depending on the temperature of the blackbody. A
blackbody at higher temperature emits more radiation than a blackbody at lower
temperature at all the wavelengths.
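The behaviour described above can be explored numerically. The sketch below uses the standard radiation constants C1 ≈ 3.7418 × 10^-16 W m^2 and C2 ≈ 1.4388 × 10^-2 m K (tabulated values, not given in the text); the function name is ours:

```python
import math

# Planck's law: M_lambda = C1 * lam^-5 / (exp(C2 / (lam * T)) - 1).
C1 = 3.7418e-16  # 2*pi*h*c^2, W m^2
C2 = 1.4388e-2   # h*c/k, m K

def spectral_exitance(lam_m: float, T_k: float) -> float:
    """Blackbody spectral exitance, W per m^2 per m of wavelength."""
    return C1 * lam_m**-5 / (math.exp(C2 / (lam_m * T_k)) - 1.0)

# A hotter blackbody emits more at every wavelength:
print(spectral_exitance(0.5e-6, 6000) > spectral_exitance(0.5e-6, 300))  # True
# At ambient temperature (300 K) the emission at 10 um dwarfs that at 0.5 um:
print(spectral_exitance(10e-6, 300) > spectral_exitance(0.5e-6, 300))    # True
```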
The Stefan-Boltzmann Law
The total radiation emitted by a blackbody over the entire electromagnetic spectrum is
obtained by integrating the area under Planck's distribution curve. It is given by the
Stefan-Boltzmann law:
M (blackbody) = σT^4
where σ = 5.67 × 10^-8 W m^-2 K^-4.
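The fourth-power dependence can be sketched directly (illustrative function name and temperatures):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def total_exitance(T_k: float) -> float:
    """Total blackbody exitance M = sigma * T^4, in W/m^2."""
    return SIGMA * T_k**4

print(f"{total_exitance(300):.0f} W/m^2")        # 459 W/m^2 at ambient temperature
print(total_exitance(600) / total_exitance(300)) # doubling T gives 16x the radiation
```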
Wien's Displacement Law
The wavelength at which the spectral exitance has its maximum value is given by λm (in
microns) = 2,898/T (with T in K).
For a blackbody at T = 300 K, the peak in spectral emission occurs at λm ≈ 10 µm. Hence,
objects at ambient temperature emit in the infrared region of the spectrum. As the
temperature of the object is raised, the peak emission shifts toward the shorter wavelength side. The radiation in the infrared portion of the spectrum is often referred
to as heat rays, as it produces the sensation of heat. At T = 6,000 K, about the
temperature of the sun, the peak emission occurs in the visible region of the spectrum (λm
= 0.5 µm). The production of light by heating and the shift of intensity toward shorter
wavelengths by increasing the temperature is illustrated by passing a current through a
blackened platinum wire in a dark room. As the magnitude of the current is increased, the
wire becomes hot, glows red initially, then the colour of the emitted radiation changes to
orange, then to yellow and ultimately to white.
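Wien's displacement law reduces to a one-line computation; the sketch below (illustrative function name) reproduces the two cases discussed above:

```python
def peak_wavelength_um(T_k: float) -> float:
    """Wien's displacement law: lambda_max (um) = 2898 / T (K)."""
    return 2898.0 / T_k

print(round(peak_wavelength_um(300), 2))   # 9.66 um: ambient objects peak in the thermal IR
print(round(peak_wavelength_um(6000), 2))  # 0.48 um: the sun peaks in the visible
```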
Spectral Emissivity and Kirchhoff's Law
Fig 6. Spectral distribution of energy radiated from blackbodies at various temperatures. (Note that the spectral
radiant exitance Mλ is the energy emitted per unit wavelength interval; the total radiant exitance M is given by
the area under the spectral radiant exitance curves.)
The three radiation laws mentioned above hold good for blackbody radiation only. All other
substances are characterized by their spectral emissivity ε(λ), defined as the ratio of the
spectral exitance of the material to the spectral exitance of a blackbody at the same
temperature:
ε(λ) = Mλ(material, T) / Mλ(blackbody, T)
Knowing the spectral emissivity of a body, its spectral exitance, total exitance and the
wavelength of peak emission can be determined.
Kirchhoff's law states that the spectral emissivity of a material is equal to its spectral
absorptivity, i.e. ε(λ) = α(λ).
This implies that if a body is capable of emitting certain radiation, it will absorb that
radiation when exposed to it.
The emissivity characteristics of material can be summarized as follows.
♦ Blackbody: ε = 1 at all wavelengths.
♦ Gray body: 0 < ε < 1, independent of wavelength.
♦ Perfect reflector (white body): ε = 0.
♦ All other bodies: ε = ε(λ), a function of wavelength.
The relationship between reflectance, absorptivity and transmittance was given earlier by
ρ(λ) + α(λ) + τ(λ) = 1.
Using Kirchhoff's law it can now be written as
ρ(λ) + ε(λ) + τ(λ) = 1.
For opaque substances, τ(λ) = 0; hence, emissivity and reflectance are related by
ε(λ) = 1 - ρ(λ)
Characteristics of Solar Radiant Energy
The sun is the strongest and most important source of radiant energy for remote sensing. The solar spectrum extends approximately from 0.3 µm to 3.0 µm. The maximum
irradiance occurs at 0.47 µm. The visible band from 0.40 µm to 0.76 µm receives about
46 per cent of the total solar energy.
The rate at which the total solar radiant energy flows across a unit area normal to the
direction of propagation, located at the mean distance of the earth from the sun, is called the
solar constant. The value of this constant is 1,353 W/m^2 with an uncertainty of ±21 W/m^2.
The solar constant can be calculated from the blackbody temperature of the sun (T =
5,800 K) and the mean angular radius of the sun as seen from the earth (4.6 × 10^-3 radians).
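This calculation can be sketched directly: the flux σT^4 at the sun's surface is diluted by the factor (R_sun/d)^2, which equals the square of the angular radius. The variable names below are ours; the inputs are the values quoted above:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T_SUN = 5800.0    # blackbody temperature of the sun, K
ALPHA = 4.6e-3    # mean angular radius of the sun from earth, radians

# Surface flux sigma*T^4, diluted by the geometric factor (R_sun/d)^2 = alpha^2:
solar_constant = SIGMA * T_SUN**4 * ALPHA**2
print(f"{solar_constant:.0f} W/m^2")   # ~1358 W/m^2, close to the quoted 1,353 +/- 21
```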
Interaction of EMR with the Earth's Surface
Radiation from the sun, when incident upon the earth's surface, is either reflected by the
surface, transmitted into the surface or absorbed and emitted by the surface. The EMR, on interaction, experiences a number of changes in magnitude, direction, wavelength,
polarization and phase. These changes are detected by the remote sensor and enable the
interpreter to obtain useful information about the object of interest. The remotely
sensed data contain both spatial information (size, shape and orientation) and spectral
information (tone, color and spectral signature).
From the viewpoint of interaction mechanisms, the optical (visible and infrared)
wavelengths from 0.3 µm to 16 µm can be divided into three regions. The spectral band
from 0.3 µm to 3 µm is known as the reflective region. In this band, the radiation sensed
by the sensor is that due to the sun, reflected by the earth's surface. The band
corresponding to the atmospheric window between 8 µm and 14 µm is known as the thermal
infrared band. The energy available in this band for remote sensing is due to thermal emission from the earth's surface. Both reflection and self-emission are important in the
intermediate band from 3 µm to 5.5 µm.
In the microwave region of the spectrum, the sensor is radar, which is an active sensor, as
it provides its own source of EMR. The EMR produced by the radar is transmitted to the
earth's surface and the EMR reflected (back scattered) from the surface is recorded and
analysed. The microwave region can also be monitored with passive sensors, called
microwave radiometers, which record the radiation emitted by the terrain in the microwave region.
Reflection
Of all the interactions in the reflective region, surface reflections are the most useful
and revealing in remote sensing applications. Reflection occurs when a ray of light is
redirected as it strikes a non-transparent surface. The reflection intensity depends on the surface refractive index, the absorption coefficient, and the angles of incidence and
reflection.
Depending upon whether the surface is smooth or rough, the reflection is specular or
diffuse. Surface roughness is a function of the wavelength of incident radiation.
According to the Rayleigh criterion, if surface height variations are less than λ/8, the
surface is considered to be smooth; otherwise, it is rough. Hence, the wavelength of the
incident EMR determines the effective surface roughness. To radiowaves, rocky terrain appears
smooth, whereas in the optical band, fine sand appears rough. If the
surface is flat and smooth, specular reflections occur, which follow the law of reflection
(the angle of incidence equals the angle of reflection). Specular reflections are
undesirable for remote sensing because they produce solar glint or glare. Water and
certain man-made features reflect specularly. A rough or diffuse surface reflects the
incident EMR in all directions independent of the angle of incidence. A perfectly diffuse
reflector has equal brightness when observed from any angle and is referred to as a
Lambertian surface. Among the natural features which display essentially diffuse
reflection are sand, tilled soils and certain types of vegetation. Mixed reflections occur
most frequently in nature. In this case, a reflecting surface returns radiant energy both
diffusely and specularly.
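The Rayleigh smooth/rough criterion above can be written as a one-line test; the function name and example numbers below are illustrative:

```python
def is_smooth(height_variation_m: float, wavelength_m: float) -> bool:
    """Rayleigh criterion: a surface is smooth if its height
    variations are less than lambda / 8."""
    return height_variation_m < wavelength_m / 8.0

# 5 cm of surface texture: rough to visible light, smooth to 1 m radiowaves.
print(is_smooth(0.05, 0.5e-6))  # False - optical band
print(is_smooth(0.05, 1.0))     # True  - radio band
```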
Transmission
Fig 7. Different types of scattering surfaces: (a) perfect specular reflector (b) near-perfect specular reflector
(c) Lambertian (d) quasi-Lambertian (e) complex.
Transmission of radiation occurs when radiation passes through a substance without
significant attenuation. For a given thickness, or depth of a substance, the ability of a
medium to transmit energy is measured as transmittance (τ).
τ = transmitted radiation / incident radiation
Spectral Signature
Spectral reflectance, (ρ(λ)), is the ratio of reflected energy to incident energy as a
function of wavelength. Various materials of the earth's surface have different spectral
reflectance characteristics. Spectral reflectance is responsible for the color or tone in a
photographic image of an object. Trees appear green because they reflect more of the
green wavelength. The values of the spectral reflectances of objects averaged over
different, well-defined wavelength intervals comprise the spectral signature of the
objects or features by which they can be distinguished. To obtain the necessary ground
truth for the interpretation of multispectral imagery, the spectral characteristics of
various natural objects have been extensively measured and recorded. Figure 8 shows
typical reflectance curves for three basic types of earth surface features: healthy
vegetation, dry bare soil (gray-brown and loamy) and clear lake water.
Interactions with the Atmosphere
Fig 8. Typical Spectral reflectance curves for vegetation, soil and water
The sun is the source of radiation, and electromagnetic radiation (EMR) from the
sun that is reflected by the earth and detected by the satellite or aircraft-borne sensor
must pass through the atmosphere twice, once on its journey from the sun to the earth
and once after being reflected by the surface of the earth back to the sensor.
Interactions of the direct solar radiation, and of the radiation reflected from the target, with the
atmospheric constituents interfere with the process of remote sensing and are called
"atmospheric effects".
The interaction of EMR with the atmosphere is important to remote sensing for two main
reasons. First, information carried by EMR reflected/emitted by the earth's surface is modified while traversing through the atmosphere. Second, the interaction of EMR with
the atmosphere can be used to obtain useful information about the atmosphere itself.
The atmospheric constituents scatter and absorb the radiation, modulating the radiation
reflected from the target by attenuating it, changing its spatial distribution and
introducing into the field of view radiation from sunlight scattered in the atmosphere and
some of the energy reflected from nearby ground areas. Both scattering and absorption
vary in their effect from one part of the spectrum to another.
The solar energy is subjected to modification by several physical processes as it passes
through the atmosphere, viz.
1. Scattering
2. Absorption
3. Refraction
Atmospheric Scattering.
Scattering is the redirection of EMR by particles suspended in the atmosphere or by large
molecules of atmospheric gases. Scattering not only reduces the image contrast but also changes the spectral signature of ground objects as seen by the sensor. The amount of
scattering depends upon the size of the particles, their concentration, the wavelength of the
radiation and the depth of the atmosphere through which the energy is travelling. The
concentration of particulate matter varies both in time and over the seasons, so the
effects of scattering are spatially uneven and vary from time to time.
Theoretically scattering can be divided into three categories depending upon the
wavelength of radiation being scattered and the size of the particles causing the
scattering. The three different types of scattering from particles of different sizes are
summarized below
Scattering process   Wavelength dependence   Approximate particle size   Kinds of particles

Selective:
i) Rayleigh          λ^-4                    < 1 µm                      Air molecules
ii) Mie              λ^0 to λ^-4             0.1 to 10 µm                Smoke, haze

Non-selective        λ^0                     > 10 µm                     Dust, fog, clouds
Rayleigh Scattering:
Rayleigh scattering occurs when the radiation wavelength is much larger than the particle
size. Such particles may be small specks of dust or the larger molecules of atmospheric
gases like nitrogen and oxygen. Since Rayleigh scattering occurs even in the absence
of atmospheric impurities, it is sometimes known as "clear atmosphere" scattering.
Rayleigh scattering causes the sky to appear blue; the scattering coefficient is
proportional to the inverse fourth power of the wavelength.
An example is the scattering of visible light (0.4 µm - 0.76 µm) by pure gas molecules in a clear
atmosphere: radiation at the shorter blue wavelengths is scattered towards the ground much more strongly than radiation in the red wavelength region. Due to Rayleigh scattering, multispectral data from the blue portion of the spectrum are of relatively limited usefulness.
It is observed that there is strong scattering in both the forward and backward
directions. The strong backward scattering is responsible for the appearance of hot spots
in aerial photographs taken in a hazy atmosphere by wide-angle cameras at times when the
direction of solar radiation falls within the field of view of the sensor.
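The λ^-4 dependence implies that blue light (0.4 µm) is scattered roughly (0.7/0.4)^4 ≈ 9.4 times more strongly than red light (0.7 µm). A sketch (illustrative function name):

```python
def rayleigh_relative_scattering(lam_um: float, ref_um: float = 0.4) -> float:
    """Rayleigh scattering coefficient ratio, proportional to lambda^-4,
    relative to blue light at 0.4 um."""
    return (ref_um / lam_um) ** 4

# Red light at 0.7 um is scattered only about a tenth as much as blue:
print(round(rayleigh_relative_scattering(0.7), 2))  # 0.11
```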
Mie Scattering:
Mie scattering occurs when the radiation wavelength is comparable to the size of the
scattering particles. It is caused by the larger particles present in the atmosphere,
including dust, pollen, smoke and water droplets. Mie scattering tends to be greatest in
the lower atmosphere (0–5 km), where larger particles are more abundant. Depending upon
the particle size relative to the wavelength, the Mie scattering coefficient may fall
anywhere between λ^-4 and λ^0. The incident light is scattered mainly in the forward
direction.
In remote sensing, Mie scattering usually manifests itself as a general deterioration of
multispectral images across the optical spectrum under conditions of heavy atmospheric
haze.
Non-selective Scattering:
Non-selective scattering usually occurs when the particle size is much larger than the
radiation wavelength. For radiation in the visible and near-infrared spectrum, such
particles might be larger water droplets or larger particles of airborne dust. The
scattering does not depend on the wavelength of the radiation. This type of scattering
usually occurs when the atmosphere is heavily dust-laden and results in severe
attenuation of the received signal, with uniform attenuation at all wavelengths. The
whitish appearance of the sky under hazy conditions is due to non-selective scattering.
The occurrence of this scattering mechanism indicates the presence of large particulate
matter in the atmosphere above the scene of interest, which is itself useful information.
The effects of the Rayleigh component of scattering can be eliminated by using minus-blue
filters. However, the effects of heavy haze, when all wavelengths are scattered
uniformly, cannot be eliminated by haze filters. The effects of haze are less pronounced
in the thermal infrared region, and microwave radiation is completely immune to haze and
can even penetrate clouds.
Atmospheric Absorption
The gas molecules present in the atmosphere strongly absorb EMR passing through the
atmosphere in certain spectral bands. Three gases are responsible for most of the
absorption of solar radiation: ozone, carbon dioxide and water vapour. Ozone absorbs the
high-energy, short-wavelength portions of the ultraviolet spectrum (λ < 0.24 µm), thereby
preventing the transmission of this radiation to the lower atmosphere. Carbon dioxide is
important in remote sensing because it effectively absorbs radiation in the mid- and
far-infrared regions of the spectrum; it absorbs strongly in the region from about 13 to
17.5 µm. The two most important regions of water vapour absorption are the bands at
5.5–7.0 µm and above 27 µm. Absorption reduces the amount of light that reaches our
eyes, making the scene look duller.
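The absorption bands listed above can be collected into a small lookup, sketched here as a minimal example. The band limits are taken directly from the text; treating each band as a sharp interval is a simplification, since real absorption features have soft edges.

```python
# Major absorption bands as given in the text (wavelengths in micrometres):
# ozone below 0.24 um, water vapour at 5.5-7.0 um and above 27 um,
# carbon dioxide at about 13-17.5 um.
ABSORPTION_BANDS = [
    ("ozone",          0.0,  0.24),
    ("water vapour",   5.5,  7.0),
    ("carbon dioxide", 13.0, 17.5),
    ("water vapour",   27.0, float("inf")),
]

def absorbing_gases(wavelength_um):
    """Return the gases that strongly absorb at the given wavelength."""
    return [gas for gas, lo, hi in ABSORPTION_BANDS if lo <= wavelength_um <= hi]

print(absorbing_gases(6.0))   # ['water vapour']
print(absorbing_gases(15.0))  # ['carbon dioxide']
print(absorbing_gases(0.55))  # [] -> visible region, relatively transparent
```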
Atmospheric Windows
21
The general atmospheric transmittance across the whole spectrum of wavelengths is
shown in figure 9. The atmosphere selectively transmits energy of certain wavelengths.The spectral bands for which the atmosphere is relatively transparent are known as
atmospheric windows. Atmospheric windows are present in the visible part (.4 µm - .76
µm) and the infrared regions of the EM spectrum. In the visible
part transmission is mainly effected by ozone absorption and by molecular scattering. The
atmosphere is transparent again beyond about λ= 1mm, the region used for microwave
remote sensing.
Refraction
Refraction, the bending of light at the interface between two media, also occurs in the
atmosphere as light passes through atmospheric layers of varying clarity, humidity and
temperature. These variations influence the density of the atmospheric layers, which in
turn causes light rays to bend as they pass from one layer to another. The most common
manifestations are the mirage-like apparitions sometimes visible in the distance on hot
summer days.
Fig 9. Atmospheric windows

Effects of Atmospheric Haze Scattering in Remote Sensing
The downward component of the EMR that illuminates ground features has two parts:
direct sunlight and diffuse skylight originating from atmospheric scattering. The
relative importance of the two depends on the solar zenith angle, or equivalently on
the optical path length through the atmosphere. Skylight is bluer than direct sunlight.
The upwelling radiation that reaches the remote sensor also has two components: the light
that interacts with the earth's surface and is reflected upwards, and the component
arising from the backward scattering of radiation from atmospheric particles.
Contrast Reduction
The most serious effect of haze is contrast reduction. The downward component of
scattered light (skylight) reduces the brightness difference, and hence the contrast,
between sunlit and shaded areas of the ground. The upward component of scattered
radiation (sky radiance) increases the irradiance in the camera image plane and reduces
the image contrast.
The resulting image contrast C is

          E · ρmax · τ + π L
    C = ----------------------
          E · ρmin · τ + π L

where E is the irradiance due to sun and sky, ρmax and ρmin are the reflectances of the
scene highlight and low-light areas, τ is the atmospheric transmittance and L is the
atmospheric radiance.
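The contrast formula can be evaluated numerically to show how haze degrades an image. The specific values below (irradiance, reflectances, transmittance, path radiance) are illustrative assumptions, not measured data; the point is that lowering τ and adding path radiance L both push C toward 1 (no contrast).

```python
import math

def contrast(E, rho_max, rho_min, tau, L_path):
    """Image contrast C = (E*rho_max*tau + pi*L) / (E*rho_min*tau + pi*L),
    with E the sun-plus-sky irradiance, rho_max/rho_min the highlight and
    low-light reflectances, tau the atmospheric transmittance and L_path
    the atmospheric (path) radiance."""
    return (E * rho_max * tau + math.pi * L_path) / \
           (E * rho_min * tau + math.pi * L_path)

# Illustrative values: E = 1000, highlight/low-light reflectances 0.6/0.1.
clear = contrast(1000, 0.6, 0.1, tau=0.9, L_path=0)   # no path radiance
hazy  = contrast(1000, 0.6, 0.1, tau=0.6, L_path=50)  # haze: lower tau, added radiance

print(f"clear: C = {clear:.2f}, hazy: C = {hazy:.2f}")  # haze reduces contrast
```

With no path radiance the contrast is simply ρmax/ρmin (here 6.0); the added π·L term in both numerator and denominator drags the ratio down.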