RS Exit
Course Content
Introduction to Remote Sensing
Principles of Remote Sensing
Principles of electromagnetic energy
Sensors and platforms
Digital image processing
Areas of Competency for Exit Exam
As was noted in the previous section, the first requirement for remote
sensing is to have an energy source to illuminate the target (unless
the sensed energy is being emitted by the target). This energy is in the
form of electromagnetic radiation.
All electromagnetic radiation has fundamental properties and behaves
in predictable ways according to the basics of wave theory.
Electromagnetic radiation consists of an electrical field (E), which
varies in magnitude in a direction perpendicular to the direction in
which the radiation travels, and a magnetic field (M) oriented at
right angles to the electrical field. Both fields travel at the speed
of light (c).
Two characteristics of electromagnetic radiation are particularly
important for understanding remote sensing.
These are the
wavelength and
frequency
Wavelength:
The wavelength is the length of one wave cycle.
It is measured as the distance between successive wave crests.
Wavelength is usually represented by the Greek letter lambda (λ).
Wavelength is measured in meters (m), nanometers (nm), micrometers
(µm) and centimeters (cm).
Frequency:
Frequency refers to the number of cycles of a wave passing a
fixed point per unit of time.
Frequency is normally measured in hertz (Hz), equivalent to
one cycle per second, and various multiples of hertz.
Wavelength and frequency are related by the following formula:
c = λv
where,
λ = wavelength in meters
v = frequency (cycles per second, in Hz)
c = speed of light
Therefore, the two (wavelength & frequency) are inversely related to
each other.
The shorter the wavelength, the higher the frequency.
The longer the wavelength, the lower the frequency.
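The inverse relation above can be checked numerically; a minimal sketch (the constant and function names are my own, not from the notes):

```python
# Hypothetical illustration of c = λv
C = 3.0e8  # speed of light, m/s (approximate)

def frequency_hz(wavelength_m):
    """Return the frequency (Hz) for a given wavelength (m)."""
    return C / wavelength_m

# Green light at 0.5 µm has a very high frequency...
f_green = frequency_hz(0.5e-6)
print(f"{f_green:.2e} Hz")   # about 6e14 Hz

# ...while a 10 cm radar wavelength has a much lower one.
f_radar = frequency_hz(0.10)
print(f"{f_radar:.2e} Hz")   # about 3e9 Hz
```

Doubling the wavelength halves the frequency, matching the two statements above.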
Understanding the characteristics of electromagnetic radiation in
terms of their wavelength and frequency is crucial to understanding
the information to be extracted from remote sensing data.
The Electromagnetic Spectrum
Before radiation used for remote sensing reaches the Earth's surface it has to
travel through some distance of the Earth's atmosphere. Particles and gases in
the atmosphere can affect the incoming light and radiation. These effects are
caused by the mechanisms of scattering and absorption.
Scattering
Atmospheric scattering is the unpredictable diffusion of radiation by particles in
the atmosphere.
1) Rayleigh scatter- Rayleigh scatter happens when the radiation interacts
with atmospheric molecules and other tiny particles that are much smaller in
diameter than the wavelength of the interacting radiation.
2) Mie scatter- Mie scatter happens when atmospheric particle diameters
essentially equal the wavelength of the energy being sensed. Water vapor and
dust are major causes of Mie scatter.
3) Non-selective scatter- Non-selective scatter happens when the diameters of
the particles causing the scatter are much larger than the wavelengths of the
energy being sensed.
Absorption
Absorption is the other main mechanism at work when
electromagnetic radiation interacts with the atmosphere.
In contrast to scattering, this phenomenon causes molecules in the
atmosphere to absorb energy at various wavelengths.
Ozone, carbon dioxide, and water vapor are the three
main atmospheric constituents which absorb radiation.
Ozone serves to absorb the harmful (to most living
things) ultraviolet radiation from the sun. Without this
protective layer in the atmosphere our skin would burn
when exposed to sunlight.
Those areas of the spectrum which are not severely influenced by
atmospheric absorption and thus, are useful to remote sensors, are called
atmospheric windows.
Radiation - Target Interactions
1. Spatial Resolution
Spatial resolution refers to the smallest possible detail that the sensor
can capture.
A high spatial resolution is required when, for example, detecting
separate houses.
It is commonly expressed in meters – a spatial resolution of 20 m
results in pixel sizes of 20 x 20 m.
Spatial resolution describes the fineness of detail visible in an image.
Low resolution (coarse)
High resolution (fine)
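The pixel-size arithmetic above can be sketched in a few lines (the image dimensions here are hypothetical):

```python
# Sketch: relating spatial resolution to ground coverage
resolution_m = 20        # 20 m spatial resolution -> 20 x 20 m pixels
cols, rows = 1000, 1000  # image size in pixels (hypothetical)

# Ground area covered by the whole scene
width_km = cols * resolution_m / 1000
height_km = rows * resolution_m / 1000
print(f"Scene covers {width_km} x {height_km} km")  # 20.0 x 20.0 km
```

A finer (smaller) resolution value means smaller pixels and therefore more detail over the same ground area.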
2. Spectral Resolution
Geostationary Satellites
Geostationary satellites are satellites which revolve around
the earth above the equator at a height of about 36,000 km,
in the direction of the earth's rotation.
They make one revolution in 24 hours, synchronous with the
earth's rotation.
As a result, they appear stationary with respect to the earth.
These platforms always cover a specific area and give
continuous coverage over the same area day and night.
Their coverage is limited to 70° N and 70° S latitudes, and one
satellite can view about one-third of the globe.
These are mainly used for communication and weather
monitoring.
Some of these satellites are the INSAT, METSAT and ERS series.
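The ~36,000 km altitude can be sanity-checked with Kepler's third law; a minimal sketch using standard constants (variable names are my own):

```python
# Sketch: deriving the geostationary altitude from the orbital period
import math

GM = 3.986e14        # Earth's gravitational parameter, m^3/s^2
T = 86164            # one sidereal day (one Earth rotation), seconds
R_EARTH = 6378e3     # Earth's equatorial radius, m

# Kepler's third law for a circular orbit: r^3 = GM * T^2 / (4 * pi^2)
r = (GM * T**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (r - R_EARTH) / 1000
print(f"Geostationary altitude ≈ {altitude_km:.0f} km")  # about 35,800 km
```

This matches the "about 36,000 km" figure quoted above: an orbit at that height takes exactly one Earth rotation, so the satellite appears fixed over one spot.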
Sun-synchronous Satellites
Satellite sensors can work in two different ways: active and passive.
Active systems illuminate the area of interest and measure the reflected or
backscattered wavelength from the surface.
Active systems are independent of weather conditions and can also
operate at night.
In contrast, passive sensors use the sun as an illumination source and
measure the energy naturally emitted from the Earth’s surface.
Most passive systems require a clear sky and daylight to operate.
Examples of active remote sensing technology:
RADAR
LiDAR
Laser fluorosensor
Examples of passive remote sensing technology:
AVHRR
Landsat
MODIS
Multispectral Scanning
What is an image?
An image is defined as a two-dimensional function, F(x, y),
where x and y are spatial coordinates.
In other words, an image can be defined by a two-dimensional
array arranged in rows and columns.
A digital image is composed of a finite number of elements, each
of which has a particular value at a particular location.
These elements are referred to as picture elements, image
elements, or pixels.
Pixel is the term most widely used to denote the elements of a
digital image; each pixel stores a digital number (DN).
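The row-and-column picture above can be made concrete; a tiny sketch using plain Python lists (a real workflow would use numpy arrays, and the values here are hypothetical DNs):

```python
# Sketch: a digital image as a 2-D array of pixel values (DNs)
image = [[ 10,  50,  90, 130],
         [ 20,  60, 100, 140],
         [ 30,  70, 110, 150]]

rows, cols = len(image), len(image[0])
dn = image[1][2]   # F(x, y): the DN stored at row 1, column 2
print(rows, cols, dn)  # 3 4 100
```

Each element of the array is one pixel, and indexing by (row, column) is the discrete version of evaluating F(x, y).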
Digital image processing is the use of computer algorithms
(software) to process and analyze digital images.
Types of an image
Binary Image– The binary image, as its name suggests, contains only two
pixel values, i.e. 0 and 1, where 0 refers to black and 1 refers to white.
This image is also known as monochrome.
Black and White Image– The image which consists of only black and
white colors is called a black and white image.
8 Bit Color Format– It is the most common image format.
It has 256 different shades in it and is commonly known as a
grayscale image.
In this format, 0 stands for black, 255 stands for white, and 127
stands for mid-gray.
16 Bit Color Format– It is a color image format.
It has 65,536 different colors in it.
It is also known as High Color Format.
In this format the distribution of color is not the same as in a
grayscale image.
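To make the formats above concrete, here is a small sketch thresholding a row of 8-bit grayscale DNs into a binary (0/1) image (the threshold value is my own choice):

```python
# Sketch: from 8-bit grayscale to a binary image
gray_row = [0, 64, 127, 200, 255]   # 8-bit DNs: 0 = black, 255 = white

THRESHOLD = 128                      # hypothetical cut-off
binary_row = [1 if dn >= THRESHOLD else 0 for dn in gray_row]
print(binary_row)  # [0, 0, 0, 1, 1]
```

The 8-bit row can take 256 values, while the binary result carries only the two values (0 and 1) described above.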
Types of Image Processing
1. Rectification:
Rectification is the process of transforming the data from
one grid system into another grid system using a
geometric transformation.
A particular map projection is used for this process.
Reasons for rectifying image data:
the pixel grid of the image must be changed to fit
a map projection system;
comparing pixels scene to scene in applications
such as change detection;
identifying training samples according to map
coordinates prior to classification.
2. Registration
Registration is the process of making an image conform
to another image.
When images of one area collected from different sources must
be used together, the pixel grid of each image must conform to
the other images in the database.
Image Georeferencing: It refers to the process of assigning
map coordinates to image data.
Georeferencing, by itself, involves changing only the map
coordinate information in the image file.
Rectification involves georeferencing, since all map projection
systems are associated with map coordinates.
Image-to-image registration involves georeferencing
only if the reference image is already georeferenced.
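Georeferencing as described above amounts to an affine mapping from pixel indices to map coordinates; a minimal north-up sketch in the style of a GDAL-like geotransform (all numbers are hypothetical):

```python
# Sketch: assigning map coordinates to pixels of a north-up image
origin_x, origin_y = 500000.0, 4600000.0   # map coords of the top-left corner
pixel_size = 20.0                          # 20 m pixels

def pixel_to_map(col, row):
    """Map coordinates of the upper-left corner of pixel (col, row)."""
    x = origin_x + col * pixel_size
    y = origin_y - row * pixel_size        # rows increase downward in an image
    return x, y

print(pixel_to_map(10, 5))  # (500200.0, 4599900.0)
```

Changing only these origin and pixel-size values changes the map-coordinate information without touching the pixel grid itself, which is exactly the "georeferencing by itself" case described above.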
3. Ortho-rectification
Ortho-rectification is a form of rectification that corrects for
terrain displacement and can be used if there is a DEM of
the study area.
The ortho-rectification process uses the DEM to take the
topography into account.
Ortho-rectification is recommended:
in mountainous areas (or on aerial photographs of buildings), where a
high degree of accuracy is required.
In relatively flat areas, ortho-rectification is usually not necessary.
During Geometric Correction
Since the pixels of the new grid may not align with the pixels of the
original grid, the pixels must be resampled to conform to the new grid.
Resampling is the process of interpolating data values for the pixels on
the new grid from the values of the source pixels.
Resampling Methods
The next step in the rectification and registration process is to create the
output file.
Since the grid of pixels in the source image rarely matches the grid for
the reference image, the pixels are resampled so that new data file values
for the output file can be calculated.
Types of resampling
1. Nearest Neighbor: uses the value of the closest pixel to assign to the
output pixel value.
2. Bilinear Interpolation: It uses a distance-weighted average of the four
pixels nearest the focal cell to calculate a new pixel value.
3. Cubic Convolution: it uses the data file values of sixteen pixels in a 4
× 4 window to calculate an output value with a cubic function.
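The nearest-neighbor method above can be sketched in pure Python (a simplified pixel-center version for a plain rescale; real software drives the resampling through the full geometric transformation):

```python
# Sketch: nearest-neighbor resampling of a small grid to a new size
def nearest_neighbor_resample(src, out_rows, out_cols):
    in_rows, in_cols = len(src), len(src[0])
    out = []
    for r in range(out_rows):
        # closest source row for this output row (pixel-center convention)
        sr = min(int((r + 0.5) * in_rows / out_rows), in_rows - 1)
        row = []
        for c in range(out_cols):
            sc = min(int((c + 0.5) * in_cols / out_cols), in_cols - 1)
            row.append(src[sr][sc])   # copy the closest pixel's value unchanged
        out.append(row)
    return out

src = [[1, 2],
       [3, 4]]
print(nearest_neighbor_resample(src, 4, 4))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Because each output pixel simply copies its closest source value, nearest neighbor never introduces new data file values, which is why it is often preferred before classification.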
Basic elements of digital image interpretation