
RS and GIS

by
Dr. Prashanth J.

Civil Engineering Department


National Institute of Technology Silchar
Introduction to Remote Sensing and
Geographic Information System
Remote Sensing

Remote sensing is defined as the technique of obtaining information about objects without being in physical contact with them.
Geographic Information System (GIS)

 A geographic information system (GIS) is a computer system for capturing, storing, checking, and displaying data related to positions on the Earth's surface.
 For example, GIS is used for pinpointing new store locations, reporting power outages, analyzing crime patterns, routing in car navigation, and forecasting weather.
Are RS and GIS the same?
Definition – Remote Sensing

 Science and art of obtaining information about an object, area or phenomenon through an analysis of data acquired by a device that is not in direct contact with the area, object or phenomenon under investigation.

Lillesand, Thomas M. and Ralph W. Kiefer, Remote Sensing and Image Interpretation, John Wiley and Sons, Inc., 1979, p. 1.

 American Society of Photogrammetry (1975) has defined


Remote Sensing as, “Remote sensing is detecting and measuring
electromagnetic energy emanating or reflected from distant
objects made of various materials, so that we can identify and
categorize these objects by class or type, substance and spatial
distribution.”
Basic principle of remote sensing

 Most remote sensing systems utilize the sun's energy, which travels through the atmosphere and is selectively scattered and absorbed depending upon the composition of the atmosphere and the wavelength involved.
 The radiation reaching the earth interacts with the objects there. Some of this radiation is absorbed, and some is reflected or emitted back to the sensors, where it is recorded and processed in the form of an image that is then analyzed to extract information about the objects.
 Finally, the extracted information is applied in decision making and in solving particular problems.
Essential components of RS

 The Signal (from an object or phenomenon)
 The Sensor (from a platform)
 The Sensing (acquiring knowledge about the object)
Stages of remote sensing

 Energy Source or Illumination (A)
 Radiation and the Atmosphere (B)
 Interaction with the Target (C)
 Recording of Energy by the Sensor (D)
 Transmission, Reception, and Processing (E)
 Interpretation and Analysis (F)
 Application (G)
Types of Remote Sensing

 Passive remote sensing systems measure naturally available energy. This can take place only when the sun is illuminating the earth. Solar energy and radiant heat are examples.
 Active remote sensing systems provide their own source of energy for illumination. These sensors have the advantage of obtaining data at any time of day or season. Synthetic Aperture Radar (SAR) is an example.

Passive and Active Remote Sensing


 Electromagnetic radiation is the energy source used to illuminate the target.
 An electromagnetic wave consists of two fluctuating fields: an electrical field (E) and a magnetic field (M). These fluctuate at right angles to one another, and both are perpendicular to the direction of propagation.
 The position of an electromagnetic wave within the electromagnetic spectrum can be characterized by either its frequency of oscillation or its wavelength.
Relation between Wavelength and Frequency:
 Wavelength and frequency are related by the following formula:
c = λf
where: λ = wavelength (m)
f = frequency (cycles per second, Hz)
c = speed of light (3 × 10⁸ m/s)

Since c is constant, a higher frequency corresponds to a shorter wavelength, and a lower frequency corresponds to a longer wavelength.
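As a quick numerical check of c = λf, the sketch below (Python, with illustrative frequencies not tied to any particular sensor) converts frequency to wavelength.

```python
# Minimal sketch: convert frequency to wavelength using c = lambda * f.
C = 3.0e8  # speed of light, m/s (value used in the slides)

def wavelength_m(frequency_hz: float) -> float:
    """Return the wavelength in metres for a given frequency in Hz."""
    return C / frequency_hz

# Illustrative frequencies: 3 MHz radio, 3 GHz microwave, ~red visible light.
for f_hz in (3.0e6, 3.0e9, 4.3e14):
    print(f"f = {f_hz:.2e} Hz -> wavelength = {wavelength_m(f_hz):.3e} m")
```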
Electromagnetic Spectrum
The particle theory suggests that electromagnetic radiation is composed of
many discrete units called photons or quanta. The energy of a quantum is given
as,
Q = hf
where
Q = energy of a quantum, joules (J)
h = Planck's constant, 6.626 × 10⁻³⁴ J s
f = frequency (Hz)

Substituting f = c/λ gives Q = hc/λ. Thus, we see that the energy of a quantum is inversely proportional to its wavelength.

The longer the wavelength involved, the lower its energy content.

The low energy content of long wavelength radiation means that, in general,
systems operating at long wavelengths must "view" large areas of the earth at
any given time in order to obtain a detectable energy signal.
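A short sketch of Q = hf (equivalently Q = hc/λ), evaluated at a few illustrative wavelengths, showing that longer wavelengths carry less energy per photon.

```python
# Minimal sketch: photon energy Q = h*f = h*c/lambda.
H = 6.626e-34   # Planck's constant, J s (as given in the slides)
C = 3.0e8       # speed of light, m/s

def photon_energy_j(wavelength_m: float) -> float:
    """Energy of one quantum (photon) at the given wavelength, in joules."""
    return H * C / wavelength_m

# Illustrative wavelengths: blue visible, thermal infrared, microwave.
for name, lam in (("blue (0.45 um)", 0.45e-6),
                  ("thermal IR (10 um)", 10e-6),
                  ("microwave (5 cm)", 0.05)):
    print(f"{name}: Q = {photon_energy_j(lam):.3e} J")
```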
Interaction with the
Atmosphere
 Travel through some distance of the Earth's atmosphere.
 Particles and gases in the atmosphere can affect the incoming
light and radiation.
 These effects are caused by the mechanisms of scattering and
absorption.
 Scattering occurs when particles or large gas molecules
present in the atmosphere interact with and cause the
electromagnetic radiation to be redirected from its original
path.
 The amount of scattering depends on the wavelength of the
radiation, the abundance of particles or gases, and the
distance the radiation travels through the atmosphere.
There are three types of scattering which take place.
 Rayleigh scattering: occurs when particles are very small compared to the
wavelength of the radiation. These could be particles such as small specks
of dust or nitrogen and oxygen molecules. Rayleigh scattering causes
shorter wavelengths of energy to be scattered much more than longer
wavelengths. Rayleigh scattering is the dominant scattering mechanism in
the upper atmosphere.
 Mie scattering: occurs when the particles are just about the same size as the
wavelength of the radiation. Dust, pollen, smoke and water vapor are
common causes of Mie scattering which tends to affect longer wavelengths
than those affected by Rayleigh scattering.
Non-selective scattering: This occurs when the particles are
much larger than the wavelength of the radiation. Water
droplets and large dust particles can cause this type of
scattering.
Non-selective scattering gets its name from the fact that all
wavelengths are scattered about equally. This type of
scattering causes fog and clouds to appear white to our eyes
because blue, green, and red light are all scattered in
approximately equal quantities (blue + green + red light =
white light).
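Rayleigh scattering is commonly modelled as varying with 1/λ⁴; that dependence is a standard result and not stated explicitly in the slides, so the sketch below should be read as an illustration under that assumption of why shorter wavelengths are scattered far more strongly.

```python
# Minimal sketch, assuming the standard Rayleigh 1/lambda^4 dependence.
def rayleigh_relative(wavelength_um: float, reference_um: float = 0.45) -> float:
    """Scattering strength relative to a blue reference wavelength."""
    return (reference_um / wavelength_um) ** 4

for name, lam in (("blue (0.45 um)", 0.45),
                  ("green (0.55 um)", 0.55),
                  ("red (0.65 um)", 0.65)):
    print(f"{name}: relative Rayleigh scattering = {rayleigh_relative(lam):.2f}")
# Blue ~1.0 versus red ~0.23: shorter wavelengths are scattered several times more strongly.
```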
Absorption

 Absorption is the process by which radiant energy is absorbed and converted into other forms of energy. An absorption band is a range of wavelengths (or frequencies) in the electromagnetic spectrum within which radiant energy is absorbed by substances such as water, carbon dioxide (CO2), oxygen (O2), ozone (O3), and nitrous oxide (N2O).
 Ozone, carbon dioxide, and water vapor are the three main atmospheric constituents which absorb radiation.
Ozone: Ozone serves to absorb the harmful (to most living
things) ultraviolet radiation from the sun. Without this
protective layer in the atmosphere our skin would burn when
exposed to sunlight.

Carbon dioxide: Carbon dioxide is referred to as a greenhouse gas. This is because it tends to absorb radiation strongly in the far infrared portion of the spectrum - the region associated with thermal heating - which serves to trap this heat inside the atmosphere.

Water vapor: Water vapor in the atmosphere absorbs much of the incoming longwave infrared and shortwave microwave radiation (between 22 μm and 1 m). The presence of water vapor in the lower atmosphere varies greatly from location to location and at different times of the year.
Atmospheric Windows

 Those areas of the frequency spectrum which are not severely influenced by atmospheric absorption, and thus are useful to remote sensors, are called atmospheric windows.
 One important practical consequence of the interaction of electromagnetic radiation with matter, and of the detailed composition of our atmosphere, is that only light in certain wavelength regions can penetrate the atmosphere well.
 Because gases absorb electromagnetic energy in very specific regions of the spectrum, they influence where (in the spectrum) we can "look" for remote sensing purposes.
The Atmospheric Window (Image from Penn State University)
Interaction with the target
 When electromagnetic energy is incident on any feature of earth's surface,
such as a water body, various fractions of energy get reflected, absorbed, and
transmitted.

• In remote sensing, the amount of reflected energy E_R(λ) is more important than the absorbed and transmitted energies.
• The measure of how much electromagnetic radiation is reflected off a surface is called its reflectance. The reflectance range lies between 0 and 1.
• A measure of 1.0 means that 100% of the incident radiation is reflected off the surface, and a measure of 0 means that 0% is reflected.
• The reflectance characteristics are quantified by the "spectral reflectance" ρ(λ), which is expressed as the ratio of the energy reflected from the surface to the energy incident upon it at wavelength λ:

ρ(λ) = E_R(λ) / E_I(λ)

• Because many remote sensing systems operate in the wavelength regions in


which reflected energy predominates, the reflectance properties of terrestrial
features are very important.
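A minimal sketch of the reflectance ratio ρ(λ) = E_R(λ)/E_I(λ) described above, using made-up energy values purely for illustration.

```python
# Minimal sketch: spectral reflectance as the ratio of reflected to incident energy.
def spectral_reflectance(reflected_energy: float, incident_energy: float) -> float:
    """Return reflectance in the range 0..1 for one wavelength band."""
    if incident_energy <= 0:
        raise ValueError("incident energy must be positive")
    return reflected_energy / incident_energy

# Illustrative (made-up) values for a single band:
incident = 100.0   # arbitrary energy units
reflected = 42.0
rho = spectral_reflectance(reflected, incident)
print(f"reflectance = {rho:.2f}  ({rho * 100:.0f}% of the incident energy is reflected)")
```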
Spectral Signature

• Any remotely sensed parameter which directly or indirectly characterizes the nature and condition of the object under observation is defined as the spectral signature.
Spectral Variation – Spectral
Reflectance curves

• A basic assumption made in remote sensing is that a specific


target has an individual and characteristic manner of interacting
with incident radiation.
• The manner of interaction is described by the spectral response
of the target.
• The spectral reflectance curves describe the spectral response
of a target in a particular wavelength region of electromagnetic
spectrum.
• Every object on the surface of the earth has its unique spectral
reflectance.
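The sketch below compares illustrative (not measured) reflectance values of vegetation and water in three broad regions, to show the idea that each target has its own characteristic spectral response; the numbers are assumptions chosen only to reflect the well-known contrast that vegetation reflects strongly in the near-infrared while water absorbs most of it.

```python
# Minimal sketch with illustrative reflectance values (not measured data).
curves = {
    "vegetation": {"green": 0.15, "red": 0.08, "near_ir": 0.50},
    "water":      {"green": 0.06, "red": 0.03, "near_ir": 0.01},
}

for target, bands in curves.items():
    best_band = max(bands, key=bands.get)
    print(f"{target}: peak illustrative reflectance in the {best_band} band "
          f"({bands[best_band]:.2f})")
```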
Spectral Reflectance Curves
Sensors

 Remote sensors are the instruments which detect various objects on the
earth’s surface by measuring electromagnetic energy reflected or emitted
from them.
 The sensors are mounted on the platforms.
 Different sensors record different wavelength bands of electromagnetic energy coming from the earth’s surface.
 For example, an ordinary camera is the most familiar type of remote sensor; it captures the visible portion of electromagnetic radiation.
Some Remote Sensors
Sensors - Types

 Passive sensors are those which detect naturally occurring energy.
 Most often, the source of energy is the sun.
 The Thematic Mapper, the primary sensor on the Landsat satellites, is a good example of a passive sensor.

■ Active sensors provide their own energy source for illumination of the target and use sensors to measure how the target interacts with the energy.
■ Active sensors provide the capability to obtain measurements anytime, regardless of the time of day or season.
■ Doppler radar is an example of an active remote sensing technology.
Platforms

 For a sensor to collect and record electromagnetic energy


reflected or emitted from a target or feature space of interest,
it must be installed on a stable platform that may either be
ground based, air based or space based.
 According to Lillesand and Kiefer (2000), a platform is a vehicle from which a sensor can be operated.
 Platforms can vary from stepladders to satellites.
Classification

 There are different types of


platforms and based on its altitude
above earth surface, these may be
classified as:
 Ground based Platforms
 Air based Platforms
Balloons
Aircraft
 Space based Platforms
Ground based platform

 To study the properties of a single plant or a small patch of grass, a ground based platform is used.
 Ground based platforms are also used for sensor calibration, quality control and the development of new sensors.
 Hand held devices, tripods, towers and cranes are a few examples.
 For field investigations, some of the most popular platforms are the cherry picker platform, portable masts and towers. Cherry picker platforms can be extended to approx. 15 m.

Fig.: Crane, a ground based platform
Air based platform - Balloons

 Balloons have low acceleration, require no power and exhibit low vibration.
 There are three main types of balloon systems: free balloons, tethered balloons and powered balloons.
 They are used for probing the atmosphere and are also useful for testing instruments under development.
 Balloon platforms are not as expensive as aircraft.

Fig.: Balloon as platform

 In India, at present, the Tata Institute of Fundamental Research, Mumbai, has set up a National Balloon Facility at Hyderabad.
Air based platform - Aircraft

 Aerial platforms are primarily stable-wing aircraft.
 Generally, aircraft are used to collect very detailed images.
 The operator can control platform variables such as altitude, and the time of coverage can also be controlled.
 However, aircraft are expensive, less stable than spacecraft and subject to motion blurring.

 Low Altitude Aircraft - operates below 30,000 ft.
 Suitable for obtaining image data for small areas at large scale.
 These have single engines or light twin engines.

 High Altitude Aircraft - operates above 30,000 ft.
 Acquires imagery for large areas (smaller scale).
 High altitude aircraft include jet aircraft with a good rate of climb, high maximum speed, and a high operating ceiling.
Space based platforms

 Remote sensing is also conducted from the space shuttle or from artificial satellites. Artificial satellites are man-made objects which revolve around another object.
 Satellites are not affected by the earth’s atmosphere. They move freely in their orbits around the earth.
 A satellite can cover much more land area than planes and can monitor areas on a regular basis.
 One obvious advantage satellites have over aircraft is global accessibility.
 It is through these platforms that we get an enormous amount of data, and as a result remote sensing has gained international popularity.

Fig.: Space based sensor mounted on a space shuttle
Orbit

 A satellite can cover the entire earth or any part of the earth at specified intervals.
 The coverage mainly depends on the orbit of the satellite.
 Different types of orbits are required to achieve continuous
monitoring (meteorology), global mapping (land cover mapping), or
selective imaging (urban areas).
 The following orbit types are more common for remote sensing
missions.
 Polar Orbit
 Geostationary orbit
Polar Orbit

 Polar orbits are a type of low Earth orbit, at low altitudes between 200 and 1000 km.
 They are often used for applications such as
monitoring crops, forests and even global security.
 A polar orbit travels north-south over the poles and
takes approximately an hour and a half for a full
revolution.
 As the satellite is in orbit, the Earth is rotating beneath
it. As a result, a satellite can observe the entire Earth’s
surface (off-nadir) in the time span of 24 hours.
Sun-synchronous Orbit

 Sun-synchronous orbit (SSO) is a particular


kind of polar orbit.
 Satellites in SSO, travelling over the polar
regions, are synchronous with the Sun.
 This means they are synchronized to always
be in the same ‘fixed’ position relative to the
Sun.
 This means that the satellite always visits the
same spot at the same local time – for
example, passing the city of Paris every day
at noon exactly.
Geostationary Orbit

 Geostationary satellites are launched


into orbit in the same direction the
Earth is spinning i.e., West to East.
 The sweet spot is approximately 36,000
km above the Earth’s surface in high
Earth orbit.
 Weather, communication and global
positioning satellites are often in a
geostationary orbit.
Geosynchronous orbit

• At any inclination, a geosynchronous orbit synchronizes with the rotation of the


Earth.

• The time it takes for the Earth to rotate on its axis is 23 hours, 56 minutes and
4.09 seconds, which is the same as a satellite in a geosynchronous orbit.

• This makes geosynchronous satellites particularly useful for telecommunications


and other remote sensing applications.
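As a cross-check on the "approximately 36,000 km" figure quoted for the geostationary case, the sketch below applies the standard two-body relation T² = 4π²a³/GM (a textbook formula, not from the slides) to the sidereal day quoted above; the values of GM and the Earth radius are assumed constants.

```python
import math

# Minimal sketch, assuming the standard two-body relation T^2 = 4*pi^2*a^3 / GM.
GM_EARTH = 3.986e14       # gravitational parameter of the Earth, m^3/s^2 (assumed)
EARTH_RADIUS_KM = 6378.0  # mean equatorial radius, km (assumed)

sidereal_day_s = 23 * 3600 + 56 * 60 + 4.09   # period quoted in the slides

# Semi-major axis of a circular geosynchronous orbit:
a_m = (GM_EARTH * sidereal_day_s ** 2 / (4 * math.pi ** 2)) ** (1 / 3)
altitude_km = a_m / 1000 - EARTH_RADIUS_KM

print(f"orbital radius   ~ {a_m / 1000:,.0f} km from the Earth's centre")
print(f"orbital altitude ~ {altitude_km:,.0f} km above the surface")   # ~35,800 km
```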
Orbits of Remote Sensing
Characteristics of a Satellite Orbit

1. Orbital period

2. Altitude

3. Apogee and perigee

4. Inclination

5. Nadir, zenith and ground track

6. Swath

7. Sidelap and overlap


Characteristics of a Satellite
Orbit…

1. Orbital period
❖ Time taken by a satellite to complete one revolution
around the earth.

➢Spatial and temporal coverage of the imagery depends on


the orbital period.

➢It varies from around 100 minutes to 24 hours.


Characteristics of a Satellite
Orbit…

2. Altitude
❖ Altitude of a satellite is its height with respect to the surface immediately below it.

➢ Low altitude (altitude < 2000 km)
➢ Moderate altitude
➢ High altitude (altitude ~36,000 km)
Characteristics of a Satellite
Orbit…

3. Apogee and perigee


❖ Apogee: Point in the orbit where the satellite is at maximum distance
from the Earth.

❖ Perigee: Point in the orbit where the satellite is nearest to the Earth.
Characteristics of a Satellite
Orbit…

4. Inclination

❖ Inclination of the orbit is measured clockwise from the equator.

❖ Inclination of a remote sensing satellite is typically 99°.
Characteristics of a Satellite
Orbit…
5. Nadir, Zenith and Ground track
❖ Nadir : Point where radial line connecting the centre of the Earth and the
satellite intercepts the surface of the Earth.

➢ Point of shortest distance from the satellite to the Earth’s surface

❖ Zenith : Any point just opposite to the nadir, above the satellite

❖ Ground track: The circle on the Earth’s surface described by the nadir point as the satellite revolves

➢ Projection of the satellite’s orbit on the ground surface
Characteristics of a Satellite
Orbit…

6. Swath
❖ Swath of a satellite is the width of the area on the
surface of the Earth, which is imaged by the sensor
during a single pass.

❖ For example, the swath width of the IRS-1C LISS-3 sensor is 141 km in the visible bands and 148 km in the shortwave infrared band.
Characteristics of a Satellite
Orbit…

7. Sidelap and Overlap


❖ Overlap: Common area on consecutive images
along the flight direction.
❖ Sidelap: Overlapping areas of the images taken
in two adjacent flight lines.
➢ Increase in sidelap helps to achieve more frequent
coverage of the areas in the higher latitudes
➢ For example, the IRS-1C LISS-3 sensor creates a 7 km overlap between two successive images.
➢ Sidelap of the IRS-1C LISS-3 sensor at the equator is 23.5 km in the visible bands and 30 km in the shortwave infrared band.
Resolution Concept

 In general, the resolution is the


minimum distance between two
objects that can be distinguished in the
image.
 Objects closer than the resolution
appear as a single object in the image.
 In remote sensing, the term resolution
is used to represent the resolving
power, which includes not only the
capability to identify the presence of
two objects, but also their properties.
Types of resolutions

 Four types of resolutions are defined for the remote sensing


systems.

❖ Spatial resolution

❖ Spectral resolution

❖ Temporal resolution

❖ Radiometric resolution
Spatial Resolution

 Size of the smallest dimension on the Earth’s surface over which an


independent measurement can be made by the sensor.
➢ Expressed by the size of the pixel on the ground in meters
➢ Controlled by the Instantaneous Field of View (IFOV)

Fig.: Coarse spatial resolution vs. fine spatial resolution


Instantaneous Field of View

 IFOV: Instantaneous Field of View


❖ Angular cone of visibility of the sensor

❖ Area on the Earth’s surface that is seen


at one particular moment of time
Instantaneous Field of View…

IFOV depends on
❖ Altitude of the sensor above the ground
level
❖ Viewing angle of the sensor

 A narrow viewing angle produces a


smaller IFOV.

 IFOV increases with altitude of the


sensor.
Ground Resolution Cell

 Size of the area viewed by the sensor on the ground at one particular
moment of time
 Depends on
❖ Altitude of the sensor
❖ IFOV of the sensor

 Obtained by multiplying the IFOV (in radians) by the distance from the
ground to the sensor.
 It is also referred as the spatial resolution of the remote sensing system.
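A minimal sketch of the relation stated above: ground resolution cell ≈ IFOV (in radians) × sensor altitude. The IFOV and altitude values below are illustrative assumptions, not the figures of any specific sensor.

```python
# Minimal sketch: ground resolution cell size = IFOV (radians) x altitude.
def ground_resolution_m(ifov_rad: float, altitude_m: float) -> float:
    """Approximate side length of the ground resolution cell, in metres."""
    return ifov_rad * altitude_m

# Illustrative numbers (assumed, not a specific sensor):
ifov = 42.8e-6            # 42.8 microradians
altitude = 700_000.0      # 700 km platform altitude, in metres
print(f"ground resolution cell ~ {ground_resolution_m(ifov, altitude):.1f} m")  # ~30 m
```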
Spatial Resolution and Feature
Identification

 For a feature to be detected, its size generally has to be equal to or larger


than the resolution cell.
 If more than one feature is present within ground resolution cell, the
signal response is a mixture of the signals from all the features
➢ From the average brightness recorded, any one particular feature among
them may not be detectable.
➢ Smaller features may sometimes be detectable if their reflectance dominates
within a particular resolution cell.
Spatial Resolution and Feature
Identification…
Example

Signature from the “house” dominates


for the cell and hence the entire cell is
classified as “house”

Shape and spatial extent of the feature


is better captured. However, some
discrepancy is present along the
boundary

Feature shape and the spatial


extent is more precisely identified
Classes of Spatial Resolution

 Low resolution systems


• Spatial resolution > 1000m
• MODIS, AVHRR
 Medium resolution systems
• Spatial resolution is 100m – 1000m
• IRS WiFS (188m), Landsat TM–Band 6 (120m), MODIS–Bands 1-7
(250-500m)
Classes of Spatial Resolution

o High resolution systems


•Spatial resolution approximately in the range 5m-100m
•Landsat ETM+ (30m), IRS LISS-III (23m MSS, 6m Panchromatic),
IRS AWiFS (56-70m), SPOT 5(2.5-5m Panchromatic)
o Very high resolution systems
•Spatial resolution less than 5m
•GeoEye (0.45m for Panchromatic, 1.65m for MSS), IKONOS (0.8-
1m Panchromatic), Quickbird (2.4-2.8 m)
Spatial Resolutions and Scale of
Applicability

(Courtesy: Morisette, 2002)


Scale of an Image

 Ratio of distance on an image or map, to actual ground distance.


 Maps or images with small "map-to-ground ratios" are referred to as small scale
(e.g. 1:100,000), and those with larger ratios (e.g. 1:5,000) are called large
scale.
 Example
What is the actual length of an object which is 1cm long in a map of scale
1:100,000?
Scale = 1:100,000
Object length in map = 1cm
Actual length on the ground = 1 cm x 100,000 = 100,000 cm = 1 km
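The same conversion, written as a tiny sketch so the 1 cm → 1 km arithmetic in the example above can be repeated for other scales.

```python
# Minimal sketch: convert a map distance to a ground distance for a given scale.
def ground_distance_km(map_distance_cm: float, scale_denominator: int) -> float:
    """Ground distance in km for a map distance in cm at a scale of 1:scale_denominator."""
    ground_cm = map_distance_cm * scale_denominator
    return ground_cm / 100_000  # 100,000 cm in a kilometre

print(ground_distance_km(1.0, 100_000))  # 1.0 km, as in the example above
print(ground_distance_km(1.0, 5_000))    # 0.05 km = 50 m on a large-scale map
```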
Spectral Resolution

❖ Ability of a sensor to define fine wavelength intervals.


❖ Ability of a sensor to resolve the energy received in a spectral bandwidth to
characterize different constituents of earth surface.
❖ Depends on
➢ Spectral band width of the filter
➢ Sensitiveness of the detector
❖ The finer the spectral resolution, the narrower the wavelength range for a
particular channel or band.
Spectral Resolution…

 The finer the spectral resolution, the narrower the wavelength range for a particular channel or band.
Spectral Resolution…

❖ Most of the remote sensing systems are multi-spectral, using more than
one spectral band
❖ Spectral resolution of some of the remote sensing systems
• IRS LISS-III uses 4 bands: 0.52-0.59 (green), 0.62-0.68 (red), 0.77-0.86 (near IR)
and 1.55-1.70 (mid-IR).
• The Aqua/Terra MODIS instruments use 36 spectral bands, including three in the
visible spectrum.
• Recent development is the hyper-spectral sensors, which detect hundreds of very
narrow spectral bands.
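A small sketch using the IRS LISS-III band limits quoted above, showing how a given wavelength can be matched to the band that records it; the lookup function itself is only an illustration, not part of any sensor software.

```python
# Minimal sketch using the IRS LISS-III band limits quoted above (in micrometres).
LISS_III_BANDS = {
    "green":   (0.52, 0.59),
    "red":     (0.62, 0.68),
    "near-IR": (0.77, 0.86),
    "mid-IR":  (1.55, 1.70),
}

def band_for(wavelength_um):
    """Return the LISS-III band whose range contains the wavelength, or None."""
    for name, (low, high) in LISS_III_BANDS.items():
        if low <= wavelength_um <= high:
            return name
    return None  # falls between bands or outside the sensor's spectral range

print(band_for(0.65))  # 'red'
print(band_for(0.70))  # None: falls between the red and near-IR bands
```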
Spectral Resolution and Feature
Identification

 Generally, surface features can be better distinguished from multiple narrow bands than from a single wide band.

Using the broad wavelength band 1, the features A and B cannot be differentiated. The spectral reflectances of A and B are different in the narrow bands 2 and 3, and hence they can be differentiated.
Spectral Resolution in Remote
Sensing

 Different features are identified from the image by comparing their


responses over different distinct spectral bands

 Broad classes, such as water and vegetation, can be easily separated


using very broad wavelength ranges like visible and near-infrared

 For more specific classes, viz. vegetation type, rock classification, etc., much finer wavelength ranges and hence finer spectral resolution are required.
Difference in the spectral responses of an area
in different bands of Landsat TM image
Images produced from 8 bands of Landsat 7 ETM data of
Denver, CO.
Spectral Resolution…

Pan Image (Coarse) Landsat TM RGB = 543 (Fine)


Spectral Resolution…

Forest Fire (Yellowstone NP)


Radiometric Resolution

Radiometric resolution: Sensitivity of the sensor to the magnitude of


the electromagnetic energy
❖ How many grey levels are measured between pure black (no reflectance) and pure white (maximum reflectance)

❖ The finer the radiometric resolution of a sensor, the more sensitive it is in detecting small differences in energy

❖ The finer the radiometric resolution of a sensor, the more grey levels the system can measure
Radiometric Resolution…

 Radiometric resolution is measured in Bits


➢ Each bit records an exponent of power 2
 Maximum number of brightness levels available depends on the number of bits
used in representing the recorded energy

Radiometric resolution and the corresponding brightness levels available

Radiometric resolution   Bit depth   Number of levels     Example
Poor resolution          1 bit       2^1 = 2 levels
                         7 bit       2^7 = 128 levels     IRS 1A & 1B
High resolution          8 bit       2^8 = 256 levels     Landsat TM
                         11 bit      2^11 = 2048 levels   NOAA-AVHRR
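The number of levels in the table follows directly from 2 raised to the number of bits; a minimal sketch:

```python
# Minimal sketch: grey levels available for a given radiometric resolution in bits.
def grey_levels(bits: int) -> int:
    """Number of distinct brightness levels a sensor with this bit depth can record."""
    return 2 ** bits

for bits in (1, 7, 8, 11):
    print(f"{bits:>2}-bit data -> {grey_levels(bits)} levels "
          f"(digital numbers 0 to {grey_levels(bits) - 1})")
```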
Radiometric Resolution and
Number of Grey Levels

 Tones in an image vary from black to


white

 Black → Digital Number = 0 → No


reflectance

 White → Digital Number is the maximum
   = 1 for 1-bit data
   = 255 for 8-bit data


Radiometric Resolution and Level
of Information

 Finer radiometric resolution


➢ More the number of grey levels
➢ More details can be captured in the
image

 Finer radiometric resolution


➢ Increases the data storage
requirements
Temporal Resolution

Number of times an object is sampled


or
How often data are obtained for the same area
❖ The absolute temporal resolution of a remote sensing system to image the same
area at the same viewing angle a second time is equal to the repeat cycle of a
satellite.
❖ The repeat cycle of a near polar orbiting satellite is usually several days
Example: 24 days for IRS-1C and Resourcesat-2, 18 days for Landsat, 14 days for
IKONOS
❖ Actual temporal resolution ( or revisit period) of a sensor depends on
➢ The satellite/sensor capabilities
➢ Swath overlap and Latitude
Satellite Capabilities and
Temporal Resolution

 More frequent imaging is


possible by off-nadir viewing
capabilities

Example : IKONOS

Sensor characteristics: Pointable


optics

Repeat cycle : 14 days

Revisit period : 1-3 days


Importance of Temporal
Resolution

 Images at different time periods show the variation in the spectral characteristics of
different features over time

 Applications

➢ Land use/ land cover classification

➢ Temporal variation in land use / land cover

➢ Monitoring of a dynamic events like

 Cyclone

 Flood

 Volcano

 Earthquake
Flood Studies

 Satellite images before and after the flood event help to identify the aerial
extent of the flood during the progress and recession of a flood event
Landsat TM images of the Mississippi River taken during a normal period and during the great flood of 1993
Land Use/ Land Cover Classification:
MODIS data product for the Krishna River Basin

Fig.: Bimonthly MODIS FCC images (RGB = bands 2, 1, 6: NIR, red, MIR1) of the Krishna river basin, India, for January, March, May, July, September and November 2001
Trade-off Between Resolutions

Fine spatial resolution → small IFOV → less energy


• Difficult to detect fine energy differences → Poor radiometric resolution
• Poor spectral resolution

Narrow spectral bands → High spectral resolution → Less energy


• Difficult to detect fine energy differences → Poor radiometric resolution
• Poor spatial resolution

Wide spectral band → Poor spectral resolution→ more reflected energy


• Good spatial resolution
• Good radiometric resolution

These three types of resolutions must be balanced against the desired capabilities
and objectives of the sensor
Classification of Sensors

 On the Basis of Source of Energy Used


 Active Sensors
 Passive Sensors
 On the Basis of Function of Sensors
 Framing system
 Scanning system
 On the Basis of Technical Components of the System
 Optical-infrared sensor systems
 Microwave radar sensing systems
Framing and Scanning systems

 Sensors that instantaneously measure radiation coming from the entire scene
at once are called framing systems.
 The eye, a photo camera, and a TV vidicon (sensor in satellite) belong to
this group.
 The size of the scene that is framed is determined by optics and apertures in
the system that define the field of view, or FOV.

 If the scene is sensed point by point (equivalent to small areas within the
scene) along successive lines over a finite time, this mode of measurement
makes up a scanning system.
 Most non-camera sensors operating from moving platforms image the scene
by scanning.
On the Basis of Technical Components of
the System

 Optical-Infrared Sensors – 0.3 to 14 μm

❖ Panchromatic Imaging System – 0.4 to 0.7 μm
❖ Multispectral Imaging System – 0.3 to 14 μm
➢ Thermal Infrared Remote Sensing – 3 to 14 μm
➢ Hyperspectral Imaging System – 0.4 to 2.5 μm

 Microwave Sensors – 1 mm to 1 m
Multispectral Imaging System

 A Multispectral scanner (MSS) simultaneously acquires images


in multiple bands of the EMR spectrum.
 Most commonly used scanning system in remote sensing.
Example:
 Landsat MSS- Used 4 bands: 0.5-0.6, 0.6-0.7, 0.7-0.8, 0.8-
1.1μm
 IRS LISS-III sensors use 4 bands (0.52-0.59, 0.62-0.68, 0.77-0.86, 1.55-1.70 μm): 3 in the visible and NIR regions, and 1 in the MIR region of the EMR spectrum
Bands 4, 5, 6 and 7 from Landsat-1 MSS, and the standard FCC. Source: http://www.fas.org/
Band 4 (0.5-0.6 μm) Band 5 (0.6-0.7 μm)

Band 6 (0.7-0.8 μm) Band 7 (0.8-1.1 μm)


Types of Multi Spectral Scanners

 MSS systems generate two-dimensional images of the terrain


using

❖Across-track (whiskbroom) scanning

❖Along-track (pushbroom) scanning


Across-Track Scanning

 Across-track scanner is also known as whisk-broom scanner

 Rotating or oscillating mirrors are used


 Scan lines at right angles to the flight
line are scanned successively as the
platform moves forward
 Continuously measures the energy
from one side to the other side of the
platform
Across-track scanning

 Incoming radiation is separated into


several thermal and non-thermal
wavelength components using a
dichroic grating and a prism.
 An array of electro-optical detectors,
each having peak spectral sensitivity
in a specific wavelength band, is
used to measure each wavelength
band separately
Along-track scanning

 Along-track scanner is also known as push-broom scanner.


 No scanning mirrors are used

 A linear array of detectors is used to


simultaneously record the energy received
from multiple ground resolution cells along
the scan line
 The array of detectors are pushed along the
flight direction to scan the successive scan
lines → push-broom scanner
 Size of the ground resolution cell is
determined by the IFOV of a single detector. (Source: http://stlab.iis.u-tokyo.ac.jp/)
Along-track scanning

 The linear array of detectors consists of numerous (more than 10,000) charge-coupled devices (CCDs).
 Each linear array is dedicated to record energy in a single band.
 Each detector element is dedicated to record the energy in a single column.
 The arrays of detectors are arranged in the focal plane of the scanner such that each
scan line is viewed simultaneously by all the arrays.
 Advantages of along-track scanning
➢ Linear array of detectors provides longer dwell time over each ground resolution
cell
➢ Higher signal strength
➢ Finer radiometric resolution
➢ Finer spatial and spectral resolution without impacting radiometric resolution
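A rough sketch of why a linear array gives longer dwell time: a pushbroom detector stares at its ground cell for the whole time the platform takes to fly over it, whereas a whiskbroom detector must share that time across every cell in the scan line. All numbers below are illustrative assumptions, not the parameters of any particular sensor.

```python
# Minimal sketch comparing dwell times; all values are illustrative assumptions.
ground_cell_m = 30.0          # along-track size of one ground resolution cell
ground_speed_m_s = 7000.0     # ground-track speed of the platform
cells_per_scan_line = 6000    # number of cells across the swath

# Pushbroom: each detector watches its cell for the full along-track crossing time.
pushbroom_dwell_s = ground_cell_m / ground_speed_m_s

# Whiskbroom: one detector must cover every cell in the line within the same time.
whiskbroom_dwell_s = pushbroom_dwell_s / cells_per_scan_line

print(f"pushbroom dwell  ~ {pushbroom_dwell_s * 1e3:.2f} ms per cell")
print(f"whiskbroom dwell ~ {whiskbroom_dwell_s * 1e6:.2f} us per cell")
```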
Thematic Mapper

 Thematic Mapper (TM) : Advanced MSS used by NASA in the Landsat 4 and 5 missions

➢ Higher spatial, spectral and radiometric accuracy

➢ 7 bands : More refined compared to the MSS

➢ Each band was designated for some potential application

Landsat TM bands and their principal applications (http://www.fas.org)


Band   Spectral range (μm)   Principal application
1      0.45-0.52             Coastal water mapping, soil-vegetation differentiation, deciduous-coniferous differentiation
2      0.52-0.60             Green reflectance by healthy vegetation
3      0.63-0.69             Chlorophyll absorption for plant species differentiation
4      0.76-0.90             Biomass surveys, water body delineation
5      1.55-1.72             Vegetation moisture measurement, snow-cloud differentiation
6      10.4-12.5             Plant heat stress measurement, other thermal mapping
7      2.08-2.35             Hydrothermal mapping
Any Questions ???
