RS Exit

The document discusses remote sensing including its core concepts, principles, and processes. It covers electromagnetic radiation, interactions with the atmosphere and targets, and applications to vegetation analysis. Key areas covered include the electromagnetic spectrum, scattering, absorption, and how different wavelengths are used to extract information.

Uploaded by Melkamu Amushe

Remote Sensing

Course Content
Introduction to Remote Sensing
Principles of Remote sensing
Principles of electromagnetic energy
Sensor and platforms
Digital image processing
Areas of Competency for Exit Exam

General core competency
 Understand the concepts of remote sensing and digital image processing
 Analyze satellite imagery to extract relevant information for land administration
Specific core competency
 Identify the principles of remote sensing
 Analyze and perform digital image processing
 Identify basic elements of digital image interpretation
 Explain how measurements and computations are made using data from satellite images
 Synthesize satellite imagery to extract relevant information for land administration
Introduction to Remote Sensing

What is Remote Sensing?


Remote sensing means sensing from a distance, where the distance itself is not defined.
Generally, "remote sensing is the science (and to some extent, art) of acquiring information about the Earth's surface without actually being in contact with it."
During data acquisition there is no physical contact between the sensor and the target features.
Its main information sources are the measurements and images obtained with the help of aerial and space platforms.
It uses electromagnetic radiation sensors to record images of target features, which can be interpreted to yield useful information.
Process of Remote Sensing

 In much of remote sensing, the process involves an interaction between incident radiation and the targets of interest. This is exemplified by the use of imaging systems, where the following seven elements are involved.
1. Energy Source or Illumination (A) – the first requirement for
remote sensing is to have an energy source which illuminates or
provides electromagnetic energy to the target of interest.
2. Radiation and the Atmosphere (B) – as the energy travels from its
source to the target, it will come in contact with and interact with the
atmosphere it passes through. This interaction may take place a second
time as the energy travels from the target to the sensor.
3. Interaction with the Target (C) - once the energy makes its way to
the target through the atmosphere, it interacts with the target
depending on the properties of both the target and the radiation.
4. Recording of Energy by the Sensor (D) - after the energy has been scattered by, or emitted from, the target, we require a sensor (remote - not in contact with the target) to collect and record the electromagnetic radiation.
5. Transmission, Reception, and Processing (E) - the energy recorded
by the sensor has to be transmitted, often in electronic form, to a
receiving and processing station where the data are processed into an
image (hardcopy and/or digital).
6. Interpretation and Analysis (F) - the processed image is
interpreted, visually and/or digitally or electronically, to extract
information about the target which was illuminated.
7. Application (G) - the final element of the remote sensing process is
achieved when we apply the information we have been able to extract
from the imagery about the target in order to better understand it, reveal
some new information, or assist in solving a particular problem.
Electromagnetic Radiation

As was noted in the previous section, the first requirement for remote
sensing is to have an energy source to illuminate the target (unless
the sensed energy is being emitted by the target). This energy is in the
form of electromagnetic radiation.
All electromagnetic radiation has fundamental properties and behaves
in predictable ways according to the basics of wave theory.
Electromagnetic radiation consists of an electric field (E), which varies in magnitude in a direction perpendicular to the direction in which the radiation is traveling, and a magnetic field (M) oriented at right angles to the electric field. Both fields travel at the speed of light (c).
Two characteristics of electromagnetic radiation are particularly
important for understanding remote sensing.
These are wavelength and frequency.
Wavelength:
 The wavelength is the length of one wave cycle.
 Wavelength measured as the distance between successive wave crests.
 Wavelength is usually represented by the Greek letter lambda (λ).
 Wavelength is measured in meters (m), nanometers, micrometers, and centimeters.
Frequency:
Frequency refers to the number of cycles of a wave passing a
fixed point per unit of time.
Frequency is normally measured in hertz (Hz), equivalent to
one cycle per second, and various multiples of hertz.
Wavelength and frequency are related by the following formula:
c = λv
where,
λ = wavelength in meters
v = frequency (cycles per second, in Hz)
c = speed of light
 Therefore, the two (wavelength & frequency) are inversely related to
each other.
 The shorter the wavelength, the higher the frequency.
 The longer the wavelength, the lower the frequency.
 Understanding the characteristics of electromagnetic radiation in
terms of their wavelength and frequency is crucial to understanding
the information to be extracted from remote sensing data.
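The relation c = λv can be checked numerically; a small sketch (using the approximate value c ≈ 3×10⁸ m/s, with illustrative wavelengths):

```python
# Speed of light (approximate)
C = 3.0e8  # m/s

def frequency_hz(wavelength_m):
    """c = lambda * v  =>  v = c / lambda."""
    return C / wavelength_m

red_light = frequency_hz(0.65e-6)   # red light, ~0.65 micrometres
microwave = frequency_hz(0.10)      # a 10 cm microwave

# Shorter wavelength -> higher frequency:
assert red_light > microwave
```

This makes the inverse relationship concrete: dividing a fixed c by a smaller wavelength always yields a larger frequency.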
The Electromagnetic Spectrum

 The electromagnetic spectrum ranges from the shorter wavelengths (including gamma rays and X-rays) to the longer wavelengths (including microwaves and broadcast radio waves).
 There are several regions of the electromagnetic spectrum which are
useful for remote sensing.
UV (Ultraviolet) Spectrum
 For practical purposes, remote sensing begins in the UV region.
 It is a zone of short-wavelength radiation.
 This zone lies between the X-rays and the limit of human vision, that is, the visible region.
Visible Spectrum
 This region has great significance in RS: most analysis to date has been done in this region, although it constitutes a very small portion of the spectrum.
 Its segments are Blue (0.4–0.5 µm), Green (0.5–0.6 µm) and Red (0.6–0.7 µm).
 All visible colors are mixtures of these colors. Example: a rainbow.
Infrared Spectrum
Wavelengths in this region are longer than the red portion of the visible spectrum.
This segment is about 40 times wider than the visible spectrum.
It extends from 0.72 to 15 micrometers.
It is divided into Near, Mid and Far Infrared regions.
RS in this region can use films, filters and cameras.
Microwave Energy
The longest wavelengths, from 1 mm to 1 m.
The longer wavelengths of the Microwave region merge into the
radio wavelengths used for commercial broadcasts.
Interactions with the Atmosphere

Before radiation used for remote sensing reaches the Earth's surface, it has to travel through some distance of the Earth's atmosphere. Particles and gases in the atmosphere can affect the incoming light and radiation. These effects are caused by the mechanisms of scattering and absorption.
Scattering
Atmospheric scattering is the unpredictable diffusion of radiation by particles in
the atmosphere.
1) Rayleigh scatter- Rayleigh scatter happens when the radiation interacts with atmospheric molecules and other tiny particles that are much smaller in diameter than the wavelength of the interacting radiation.
2) Mie scatter- Mie scatter happens when atmospheric particle diameters
essentially equal the wavelength of the energy being sensed. Water vapor and
dust are major causes of Mie scatter.
3) Non-selective scatter- Non-selective scatter happens when the diameters of the particles causing scatter are much larger than the wavelengths of the energy being sensed.
Absorption
Absorption is the other main mechanism at work when
electromagnetic radiation interacts with the atmosphere.
 In contrast to scattering, this phenomenon causes molecules in the
atmosphere to absorb energy at various wavelengths.
 Ozone, carbon dioxide, and water vapor are the three
main atmospheric constituents which absorb radiation.
Ozone serves to absorb the harmful (to most living
things) ultraviolet radiation from the sun. Without this
protective layer in the atmosphere our skin would burn
when exposed to sunlight.
Those areas of the spectrum which are not severely influenced by
atmospheric absorption and thus, are useful to remote sensors, are called
atmospheric windows.
Radiation - Target Interactions

Radiation that is not absorbed or scattered in the atmosphere can reach and interact with the Earth's surface.
There are three (3) forms of interaction that can take place when
energy strikes, or is incident (I) upon the surface.
These are: absorption (A); transmission (T); and reflection (R).
 The total incident energy will interact with the surface in one or more
of these three ways. The proportions of each will depend on the
wavelength of the energy and the material and condition of the
feature.
Absorption (A) occurs when radiation (energy) is absorbed into the
target.
Transmission (T) occurs when radiation passes through a target.
Reflection (R) occurs when radiation "bounces" off the target and
is redirected. In remote sensing, we are most interested in measuring
the radiation reflected from targets.
Vegetation

 Chlorophyll strongly absorbs energy in the wavelength bands centered at about 0.45 and 0.67 μm.
 Hence, our eyes perceive healthy vegetation as green in colour
because of the very high absorption of blue and red energy by
plant leaves and the very high reflection of green energy.
 Leaves appear "greenest" to us in the summer, when
chlorophyll content is at its maximum.
 In fact, measuring and monitoring the near-IR reflectance is
one way that scientists can determine how healthy (or
unhealthy) vegetation may be.
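One common way to exploit this, not named in the text above but standard practice, is the Normalized Difference Vegetation Index (NDVI), which contrasts near-IR and red reflectance; a minimal sketch with illustrative reflectance values:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel:
    (NIR - Red) / (NIR + Red), ranging roughly from -1 to +1."""
    return (nir - red) / (nir + red)

# Illustrative reflectances (not from any real sensor):
healthy = ndvi(nir=0.50, red=0.08)   # strong NIR reflection, strong red absorption
stressed = ndvi(nir=0.30, red=0.20)  # weaker NIR reflection, weaker red absorption

assert healthy > stressed            # higher NDVI suggests healthier vegetation
```

Healthy leaves absorb red strongly and reflect near-IR strongly, so the numerator, and hence the index, is larger for healthy vegetation.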

Passive and active Remote Sensing

Passive Remote Sensing


 The sun provides a very convenient source of energy for
remote sensing.
 The sun's energy is either reflected, as it is for visible
wavelengths, or absorbed and then reemitted, as it is for
thermal infrared wavelengths.
 For all reflected energy, this can only take place during
the time when the sun is illuminating the Earth.
 There is no reflected energy available from the sun at
night.
 Remote sensing systems which measure energy that is
naturally available are called passive sensors.
 Passive sensors can only be used to detect energy when
the naturally occurring energy is available.
Active Remote Sensing
 Active sensors, on the other hand, provide their own energy source
for illumination.
 The sensor emits radiation which is directed toward the target to be
investigated.
 Advantages for active sensors include the ability to obtain
measurements anytime, regardless of the time of day or season.
 Active sensors can be used for examining wavelengths that are not sufficiently provided by the sun, such as microwaves, or to better control the way a target is illuminated.
Sensors

Sensors can be carried on board airplanes or satellites, measuring electromagnetic radiation in specific wavelength ranges (usually called bands).
The measurements are quantized and converted into a digital image, where each picture element (i.e. pixel) has a discrete value in units of Digital Number (DN).
The resulting images have different characteristics (resolutions)
depending on the sensor.
All satellite sensors have their strengths and weaknesses and are
helpful for different problems.
There are four types of resolutions:
1. Spatial resolution
2. Spectral resolution
3. Radiometric resolution
4. Temporal resolution
1. Spatial Resolution

Spatial resolution refers to the smallest possible detail that the sensor
can capture.
A high spatial resolution is required when, for example, detecting separate houses.
It is commonly expressed in meters – a spatial resolution of 20 m results in pixel sizes of 20 × 20 m.
It describes the fineness of detail visible in an image: low resolution is coarse, high resolution is fine.
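As a small worked example of what a 20 m pixel implies on the ground (illustrative arithmetic only):

```python
def ground_area_m2(pixel_size_m):
    """Ground area covered by one square pixel, in square meters."""
    return pixel_size_m ** 2

def pixels_per_km2(pixel_size_m):
    """How many pixels it takes to cover one square kilometre."""
    return 1_000_000 / ground_area_m2(pixel_size_m)

# A 20 m spatial resolution means each pixel covers 20 x 20 = 400 m^2,
# so 2,500 pixels cover one square kilometre.
assert ground_area_m2(20) == 400
assert pixels_per_km2(20) == 2500
```

Halving the pixel size quadruples the pixel count per unit area, which is why finer spatial resolution implies much larger data volumes.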
2. Spectral Resolution

 The dimension and number of specific wavelength intervals in the electromagnetic spectrum to which a sensor is sensitive.
 Narrow bandwidths in certain regions of the electromagnetic
spectrum allow the discrimination of various features more easily
 e.g. Recorded in: Blue, Green, Red, Near-infrared, Thermal
infrared, and Microwave (radar).
 High spectral resolution describes a narrow wavelength range
 For example, multi-spectral satellite systems such as Landsat
detect several discrete bands (3 to 10) at different wavelength
intervals.
 Hyperspectral instruments can consist of hundreds or thousands
of narrow bands.
 This high spectral resolution is useful when a fine discrimination,
for example, between minerals or vegetation species, is needed
3.Radiometric Resolution

Radiometric resolution, or radiometric sensitivity, refers to the number of digital levels used to express the data collected by the sensor.
The greater the number of levels, the greater the detail of
the information.
The radiometric resolution of an imaging system describes
its ability to discriminate very slight differences in energy.
The finer the radiometric resolution of a sensor, the more
sensitive it is to detecting small differences in reflected or
emitted energy
The higher the bit depth of an image, the more variation in reflection is captured, and the larger the image file.
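The number of digital levels grows as a power of two with the sensor's bit depth; a one-line sketch:

```python
def digital_levels(bits):
    """Number of distinct DN values an n-bit sensor can record: 2 ** bits."""
    return 2 ** bits

assert digital_levels(1) == 2        # binary image: 0 and 1
assert digital_levels(8) == 256      # 8-bit data: DNs 0-255
assert digital_levels(16) == 65536   # 16-bit data: 65,536 levels
```

This is why an 8-bit image distinguishes 256 brightness levels while a 16-bit image distinguishes 65,536, matching the image formats described later in this document.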
4. Temporal Resolution

 Temporal resolution describes how often the sensor revisits the same object and is often reported in days.
 For example, the Landsat satellite will pass by the location
every 16 days in its orbit,
 while the SPOT satellite can revisit a location every 1 to 4
days.
 For example, a high temporal resolution is needed when
mapping the impacts of an extreme weather event,
 for example, how flooding impacts a region over several days
 Monitoring changes in vegetation over the summer requires a lower temporal resolution.
TYPES OF PLATFORMS

 The base on which remote sensors are placed to acquire information about the Earth's surface is called a platform.
 Platforms can be stationary, like a tripod (for field observation) or a stationary balloon, or mobile, like aircraft and spacecraft.
 The types of platforms depend upon the needs as well as constraints
of the observation mission.
 There are three main types of platforms, namely 1) Ground borne,
2) Air borne and 3) Space borne
1. GROUND BORNE PLATFORMS
These platforms are used on the surface of the Earth.
The cherry-arm configuration of a remote sensing van and the tripod are the two commonly used ground borne platforms.
They can view the object from different angles and are mainly used for collecting ground truth or for laboratory simulation studies.
2. AIRBORNE PLATFORMS
 These platforms operate within the Earth's atmosphere and can be further classified into balloons and aircraft.
 Balloons: Balloons are less expensive platforms than aircraft.
 Aircraft: Aircraft are commonly used as remote sensing platforms for obtaining aerial photographs.
3. SPACE BORNE PLATFORMS
 Platforms in space, i.e. satellites are not affected by the earth's
atmosphere.
 The platforms move freely in their orbits around the earth.
 The entire earth or any part of the earth can be covered at specified
intervals.
 The coverage mainly depends on the orbit of the satellite.
 According to the orbital mode, there are two types of satellites-
Geostationary or Earth synchronous and sun-synchronous.
Satellite Characteristics: Orbits and Swaths

Geostationary Satellites
Geostationary satellites are satellites which revolve around the earth above the equator at a height of about 36,000 km, in the direction of the earth's rotation.
They make one revolution in 24 hours, synchronous with the earth's rotation.
As a result, they appear stationary with respect to the earth.
These platforms always cover a specific area and give
continuous coverage over the same area day and night.
Their coverage is limited to about 70°N to 70°S latitude, and one satellite can view one third of the globe.
 These are mainly used for communication and weather
monitoring.
Some of these satellites are INSAT, METSAT and ERS series.
Sun-synchronous Satellites

 Sun-synchronous satellites revolve around the earth in a north-south direction (pole to pole) at a height of about 700 to 1500 km.
 They pass over places on earth having the same latitude twice in each orbit, at the same local sun-time.
 Through these satellites, the entire globe is covered on a regular basis with repetitive coverage at periodic intervals.
 All remote sensing resource satellites are grouped in this category.
 A few of these satellites are the LANDSAT, IRS, SPOT and NOAA series, SKYLAB, SPACE SHUTTLE, etc.
Active and Passive Satellites

 Satellite sensors can work in two different ways: active and passive.
 Active systems illuminate the area of interest and measure the reflected or
backscattered wavelength from the surface.
 Active systems are independent of weather conditions and can also operate during nighttime.
 In contrast, passive sensors use the sun as an illumination source and
measure the energy naturally emitted from the Earth’s surface.
 Most passive systems require a clear sky and daylight to operate.
Examples of active remote sensing technology:
 RADAR
 LiDAR
 Laser fluorosensor
Examples of passive remote sensing technology:
 AVHRR
 Landsat
 MODIS
Multispectral Scanning

Many electronic remote sensors acquire data using scanning systems, which employ a sensor with a narrow instantaneous field of view (IFOV) that sweeps over the terrain to build up a two-dimensional image of the surface.
Scanning systems can be used on both aircraft and satellite
platforms and have essentially the same operating principles
A scanning system used to collect data over a variety of
different wavelength ranges is called a multispectral scanner
(MSS), and is the most commonly used scanning system
There are two main modes or methods of scanning employed to
acquire multispectral image data - across-track scanning, and
along-track scanning.
Across-track scanners scan the Earth in a series of lines. The lines are
oriented perpendicular to the direction of motion of the sensor platform
(i.e. across the swath)
Along-track scanners also use the forward motion of the platform to
record successive scan lines and build up a two-dimensional image,
perpendicular to the flight direction.
Figure legend: A = rotating mirror, B = detectors, C = field of view, D = ground resolution, E = angular field of view, F = swath.

(Figures: across-track scanner vs. along-track scanner.)


Digital Image Processing

What is an image?
 An image is defined as a two-dimensional function, F(x,y),
where x and y are spatial coordinates
 In other words, an image can be defined by a two-dimensional array arranged in rows and columns.
 A digital image is composed of a finite number of elements, each of which has a particular value at a particular location.
 These elements are referred to as picture elements, image elements, or pixels.
 A pixel is the term most widely used to denote an element of a digital image, each holding a digital number (DN).
 Digital image processing is the manipulation and analysis of such images by computer algorithms.
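The row-and-column definition above can be made concrete with a tiny pixel array (illustrative DN values):

```python
# A tiny 3x4 "digital image": a two-dimensional array of digital numbers (DNs),
# arranged in rows and columns (values are illustrative).
image = [
    [ 12,  40,  80, 255],
    [  0,  15,  60, 200],
    [  5,  22,  90, 180],
]

rows, cols = len(image), len(image[0])   # the image as F(x, y), indexed by row and column
pixel = image[1][2]                      # the DN at row 1, column 2
```

Every operation described later in this document, from enhancement to classification, ultimately reads and rewrites arrays of DNs like this one.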
Types of an image

Binary Image – A binary image, as its name suggests, contains only two pixel values, 0 and 1, where 0 refers to black and 1 refers to white. It is also known as a monochrome image.
Black and White Image – An image which consists of only black and white color is called a black and white image.
8-Bit Color Format – This is the most common image format.
It has 256 different shades and is commonly known as a grayscale image.
In this format, 0 stands for black, 255 stands for white, and 127 stands for gray.
16-Bit Color Format – This is a color image format.
It has 65,536 different colors and is also known as the High Color format.
In this format the distribution of color is not the same as in a grayscale image.
Types of Image Processing

Image processing includes the two types of methods:


Analogue Image Processing: Generally, analogue image
processing is used for hard copies like photographs and
printouts. Image analysts use various facets of interpretation
while using these visual techniques.
Digital image processing: Digital image processing methods help in manipulating and analyzing digital images. In addition to improving and encoding images, digital image processing allows users to extract useful information and save it in various formats.
Digital Image Processing

 Digital image processing is performed in software and appears in many contexts: computer graphics, signal processing, photography, camera systems, etc.
 Digital image processing is the use of algorithms and
mathematical models to process and analyze digital images.
 The goal of digital image processing is to enhance the
quality of images, extract meaningful information from
images, and automate image-based tasks
Steps of Digital Image Processing

The basic steps involved in digital image processing are:


1. Image acquisition: This involves capturing an image using a digital camera or importing an existing image into a computer.
2. Image enhancement: Once the image is acquired, it must be processed.
This involves improving the visual quality of an image, such as increasing contrast, reducing noise, and removing artifacts.
Enhancement brings out hidden details in an image and is subjective.
3. Image restoration: This involves removing degradation from an image, such as blurring, noise, and distortion.
4. Image segmentation: This involves dividing an image into regions or segments, each of which corresponds to a specific object or feature in the image.
5. Image representation and description: This involves representing an image in a way that can be analyzed and manipulated by a computer, and describing the features of an image in a compact and meaningful way.
6. Image analysis: This involves using algorithms and mathematical models to extract information from an image, such as recognizing objects, detecting patterns, and quantifying features.
7. Image synthesis and compression: This involves generating new images or compressing existing images to reduce storage and transmission requirements.
Digital image processing is widely used in a variety of applications, including medical imaging, remote sensing, computer vision, and multimedia.
Image Preprocessing: Correcting Data Anomalies
Some image anomalies are inherent to:
 certain sensors;
 atmospheric turbulence; and
 the curvature and rotation of the Earth, which produce geometric distortions in the image data.
 These distortions can be corrected by applying mathematical formulas/computer algorithms during the preprocessing stage.
 In general, images taken by satellite-borne sensors contain:
1. Radiometric errors: resulting from sensor malfunctions and atmospheric disturbance.
2. Geometric errors: resulting from relative motion between the satellite and the earth, sensor exploration, the Earth's curvature, and platform variations.
Pre-processing
Every “raw” remotely sensed image contains errors.
Systematic errors and
Random errors
1. Systematic Errors (predictable in nature)
These errors are typically introduced by imperfections in the instruments. They mostly result from:
Mirror-scan velocity variance
Panoramic distortion
Platform velocity
Earth rotation
Perspective of the sensor
2. Random Errors (Unsystematic)
Random Errors can be introduced by:
the measuring instrument,
the observing procedures or
the environment in which the measuring sensors operate.
Errors can be compensated for by careful calibration and rectification procedures during the image preprocessing phase.
Correcting such errors is termed pre-processing.
The boundary line between pre-processing and processing is often fuzzy.

Satellite Image Processing Hierarchy: Preprocessing Tasks
1. Radiometric Calibration: Convert digital levels to radiance values or
brightness temperature values.
2. Atmospheric correction: Take into account the contribution of
atmospheric radiation reaching the sensor.
3. Geometric correction: Correct distortions in the images received related
to curvature and rotation of the Earth, sensor exploration and variations of
the platform.
4. Detection of clouds: Mask correctly cloudy pixels to ensure that the
geophysical parameters obtained are representative of the Earth surface.
There are two types of data correction:
Radiometric corrections
Geometric corrections

1. Radiometric Correction (calibration)


Radiometric modeling: converting DNs to radiance values is necessary to obtain geophysical parameters or to compare images from different sensors.
It is the removal of sensor or atmospheric 'noise', to more accurately represent ground conditions.
It can improve image 'dependability' by correcting pixel values.

Data Anomalies Relating to Radiometric Correction


i. Striping Image Noise:
Striping/banding occurs if a detector goes out of
adjustment.
Various algorithms have been advanced to fix this problem:
simple along-line convolution,
high-pass filtering, and
forward and reverse principal component transformations
ii. Atmospheric Effects
It is often important to remove atmospheric effects,
especially for:
 scene matching and
 change detection analysis
The radiometric calibration includes the following steps to
eliminate atmospheric effects :
 DN to at-sensor radiance conversion;
 At-sensor radiance to at-surface radiance conversion;
 Solar and topographic correction;
 Reflectance estimation
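A minimal sketch of the first step, DN to at-sensor radiance, assuming a simple linear calibration; the gain and offset values below are hypothetical (real coefficients come from the sensor's metadata):

```python
def dn_to_radiance(dn, gain, offset):
    """Linear radiometric calibration: radiance = gain * DN + offset."""
    return gain * dn + offset

# Hypothetical calibration coefficients for one band:
GAIN = 0.05    # radiance units per DN (illustrative)
OFFSET = 1.2   # radiance at DN = 0 (illustrative)

radiance = dn_to_radiance(dn=120, gain=GAIN, offset=OFFSET)
# 0.05 * 120 + 1.2 = 7.2
```

The same linear model, with per-band coefficients, is applied independently to every pixel of every band.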
2. Geometric correction
Geometric corrections are made to correct the inaccuracy between: the
location coordinates of the image data, and the actual location
coordinates on the ground.
It is essential to have the exact location of any pixel, in order to
compare images-multi-temporal or multi-sensor analysis- or to
validate satellite data with in-situ measurements.
Geometric correction is applied to raw sensor data to correct
these errors.
There are several reasons for geometric imperfections in the sensor image, including:
 the perspective of the sensor optics;
 the motion of the scanning system;
 the motion of the platform;
 the platform altitude, attitude, and velocity;
 the terrain relief; and,
 the curvature and rotation of the Earth.
Why do we need geometric correction?
 To provide georeferenced imageries
 To compare/overlay multiple images;
 To merge with map layers;
 To mosaic (to merge) adjacent images
 To locate relevant points;
 To overlap temporal sequences of images on the same area
taken by different sensors, from different vantage points.
 To overlay an image with a GIS vector data for joint use;
Geo-correction Approaches

1. Rectification:
Rectification is the process of transforming the data from
one grid system into another grid system using a
geometric transformation.
A particular map projection is used for this process.
Reasons for rectifying image data:
the pixel grid of the image must be changed to fit a map projection system;
comparing pixels scene to scene in applications such as change detection;
identifying training samples according to map coordinates prior to classification;
2. Registration
Registration is the process of making an image conform
to another image.
When images of one area collected from different sources must be used together, the pixel grids of each image must conform to the other images in the database.
Image Georeferencing: It refers to the process of assigning
map coordinates to image data.
Georeferencing, by itself, involves changing only the map
coordinate information in the image file.
Rectification involves georeferencing, since all map projection
systems are associated with map coordinates.
Image-to-image registration involves georeferencing
only if the reference image is already georeferenced.
3. Ortho-rectification
Ortho-rectification is a form of rectification that corrects for
terrain displacement and can be used if there is a DEM of
the study area.
Ortho-rectification process enables the use of DEM to also
take into account the topography.

Ortho-rectification is recommended:
in mountainous areas (or on aerial photographs of buildings), where a
high degree of accuracy is required.
In relatively flat areas, ortho-rectification is usually not necessary.
During Geometric Correction
Since the pixels of the new grid may not align with the pixels of the
original grid, the pixels must be resampled to conform to the new grid.
Resampling is the process of extrapolating data values for the pixels on
the new grid from the values of the source pixels.
Resampling Methods
The next step in the rectification and registration process is to create the
output file.
Since the grid of pixels in the source image rarely matches the grid for
the reference image, the pixels are resampled so that new data file values
for the output file can be calculated.
Types of resampling
1. Nearest Neighbor: uses the value of the closest pixel to assign to the
output pixel value.
2. Bilinear Interpolation: uses a distance-weighted average of the four pixels nearest the focal cell to calculate a new pixel value.
3. Cubic Convolution: it uses the data file values of sixteen pixels in a 4
× 4 window to calculate an output value with a cubic function.
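The first two resampling methods can be sketched on a tiny 2 × 2 source grid (illustrative values; real software also handles image edges and the map-to-pixel transform):

```python
# A 2x2 source grid of pixel values (illustrative DNs)
grid = [
    [10, 20],
    [30, 40],
]

def nearest_neighbor(x, y):
    """Assign the value of the source pixel whose centre is closest to (x, y)."""
    return grid[round(y)][round(x)]

def bilinear(x, y):
    """Distance-weighted average of the four pixels surrounding (x, y)."""
    x0, y0 = int(x), int(y)
    fx, fy = x - x0, y - y0
    top = grid[y0][x0] * (1 - fx) + grid[y0][x0 + 1] * fx
    bottom = grid[y0 + 1][x0] * (1 - fx) + grid[y0 + 1][x0 + 1] * fx
    return top * (1 - fy) + bottom * fy

assert nearest_neighbor(0.4, 0.4) == 10   # snaps to the closest source pixel
assert bilinear(0.5, 0.5) == 25.0         # exactly between all four: their mean
```

Nearest neighbor preserves the original DNs (important before classification), while bilinear produces smoother output at the cost of inventing new, averaged values.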
Basic elements of digital image interpretation

 Interpretation is the process of detection, identification, description and assessment of the significance of objects and patterns imaged.
 The interpretation of satellite imagery and aerial photographs involves the study of various basic characteristics of an object with reference to spectral bands, which is useful in visual analysis.
 The method of interpretation may be visual, digital, or a combination of both.
 Both interpretation techniques have merits and demerits.
 The basic elements are shape, size, pattern, tone, texture,
shadows, location, association and resolution.
Shape: The external form, outline or configuration of an object; it helps in identifying a feature.
A round or oval shape feature could be a stadium.
 A straight line with very few turns could be a railway track.
 A combination of various elements of recognition will help to identify the feature.
 A natural water body is more likely to have irregular shape.
Size: Size of a feature in relation to the nearby feature plays an
important role in successful identification of a feature.
Shadow: Shadow of a feature helps in delineating boundary of
the feature. A big object would cast a large shadow as
compared to smaller ones.
Color/Tone: Color or tone of an object is the relative
brightness/darkness of an object.
 A dark blue to black colored huge feature could be water.
 Variation in tone can be attributed to the reflectance, emittance, absorption or transmission of the feature.
Texture: Texture is the frequency of the tonal changes of the surface.
 This element is quite important in the case of agriculture and forestry.
 A group of trees may have a specific texture that will help to distinguish between species of trees.
 A rocky mountain will have a different texture than the mountain
with lots of plantation.
Pattern: Spatial arrangement of features in a particular format
is pattern.
 A river will have number of tributaries and on the basis of
arrangement of these tributaries you can identify them.
 A city area with well defined rectangular plots could help
you to identify the sectors in the city.
Association: The relationship between different features at the
area of interest is the association.
 A long canal/pipeline along the wide spread area of
agricultural fields.
 If there is a water body with well defined edges, then it could be a man-made water body, like a dam.

Site: A site is the presence of a feature at a particular


geographical location.
Image Layer Selection and Stacking: Stack multiple
(usually single band) images as bands/layers into a single
output multi-band image file.
Subset an Image: many images used in IMAGINE cover a large area, while the actual area being studied may cover only a small portion of the image.
To save on disk space and processing time, you can
make new images out of a subset of the entire data set.
Mosaicking Two or More Scenes: In reality, the images to be mosaicked have to be of the same acquisition date and they have to be adjacent.
Image Classification

 Image classification is a procedure to automatically categorize all pixels in an image of a terrain into land cover classes.
 Normally, multispectral data are used to perform the classification: the spectral pattern present within the data for each pixel is used as the numerical basis for categorization.
 Two major types of classification techniques: Supervised
and Unsupervised
1. Unsupervised Classification
 In unsupervised classification, the algorithm first groups pixels into "clusters" based on their properties.
 Then, you classify each cluster with a land cover class.
 Overall, unsupervised classification is the most basic
technique.
 Because you don’t need samples for unsupervised
classification, it’s an easy way to segment and understand
an image.
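As a sketch of the clustering idea, here is a minimal 1-D k-means on a handful of illustrative DN values (real tools cluster in all spectral bands at once, and the analyst then labels each cluster with a land cover class):

```python
def kmeans_1d(values, k=2, iters=10):
    """Minimal 1-D k-means: group pixel DNs into k spectral clusters
    by repeatedly assigning each value to its nearest center and
    re-averaging the centers."""
    ordered = sorted(values)
    centers = [ordered[i * (len(ordered) // k)] for i in range(k)]  # spread initial centers
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda c: abs(v - centers[c]))
            clusters[nearest].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

dns = [10, 12, 11, 200, 198, 205]      # e.g. dark water vs. bright soil pixels (illustrative)
centers, clusters = kmeans_1d(dns)
```

No training samples are needed: the algorithm discovers the two spectral groups on its own, which is exactly why unsupervised classification is an easy first way to segment an image.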
2. Supervised Classification
 In supervised classification, you select representative samples for each land cover class.
 The software then uses these "training sites" and applies them to the entire image.
The three basic steps for supervised classification are:
Select training areas
Generate signature file
Classify
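The three steps above can be sketched with a simple minimum-distance classifier on hypothetical two-band training samples (values are illustrative only; real software offers richer classifiers, such as maximum likelihood):

```python
# Step 1: hypothetical training areas - per-class pixel samples in two bands (red, NIR)
training = {
    "water":      [(10, 5), (12, 6), (11, 4)],
    "vegetation": [(30, 90), (28, 95), (32, 88)],
}

# Step 2: generate a "signature" per class (here: the mean spectrum of its samples)
signatures = {
    cls: tuple(sum(band) / len(band) for band in zip(*samples))
    for cls, samples in training.items()
}

# Step 3: classify an unknown pixel by its nearest signature (minimum distance)
def classify(pixel):
    def sq_dist(sig):
        return sum((p - s) ** 2 for p, s in zip(pixel, sig))
    return min(signatures, key=lambda cls: sq_dist(signatures[cls]))

assert classify((29, 92)) == "vegetation"
assert classify((11, 5)) == "water"
```

Each pixel of the full image would be pushed through `classify`, producing a land cover map from the training sites.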
