Elements Involved in Remote Sensing
1. Energy Source or
Illumination (A)
2. Radiation and the
Atmosphere (B)
3. Interaction with the
Object (C)
4. Recording of Energy
by the Sensor (D)
5. Transmission,
Reception and
Processing (E)
6. Interpretation and
Analysis (F)
7. Application (G)
Contents
Remote Sensing Sensors
A sensor is a device that measures and records
electromagnetic energy.
Depending on the source of energy, sensors can be divided
into two groups: Passive sensors and Active sensors.
Methods of Recording of Electromagnetic Energy
Method of Recording of Electromagnetic Energy
Can you imagine what the world would look like if we could
only see very narrow ranges of wavelengths or colors?
Digital Image
A digital image is a regular grid array of squares (or rectangles).
The square is referred to as a ‘pixel’, which is a word formed
from the term ‘picture element’.
Each square is assigned a digital number (DN) which is related
to some parameter (such as reflectance or emittance measured
by a remote sensing system sensor).
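As a minimal sketch, a digital image can be modelled as a nested list (grid) of DNs; the values below are invented for illustration:

```python
# A digital image is a regular grid of digital numbers (DNs).
# Here a tiny 3x3 image; each DN might represent reflectance
# scaled to the 0-255 range (values are made up).
image = [
    [0,   100, 255],
    [50,  128, 200],
    [25,  75,  175],
]

n_rows = len(image)        # grid height in pixels
n_cols = len(image[0])     # grid width in pixels
dn = image[1][2]           # DN of the pixel in row 1, column 2

print(n_rows, n_cols, dn)
```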
Importance of Digital Images
If the visible portion of the light spectrum is
divided into thirds, the predominant colors are
red, green and blue. These three colors are
considered the primary colors of the visible light
spectrum.
Primary colors can be arranged in a circle,
commonly referred to as a color wheel. Red,
green and blue (RGB) form a triangle on the color
wheel. In between the primary colors are the
secondary colors: cyan, magenta and yellow
(CMY), which form another triangle.
Color Combination
Color Additive Process
Red+Green=Yellow
Blue+Green=Cyan
Red+Blue=Magenta
Red+Green+Blue=White
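The additive combinations above can be checked with a few lines of Python; channel sums are clipped to the 0-255 display range:

```python
# Additive mixing of the RGB primaries: adding two primaries at
# full intensity yields a secondary colour, all three yield white.
def add(c1, c2):
    # clip each channel sum to the 0-255 DN range
    return tuple(min(a + b, 255) for a, b in zip(c1, c2))

RED, GREEN, BLUE = (255, 0, 0), (0, 255, 0), (0, 0, 255)

yellow  = add(RED, GREEN)             # (255, 255, 0)
cyan    = add(BLUE, GREEN)            # (0, 255, 255)
magenta = add(RED, BLUE)              # (255, 0, 255)
white   = add(add(RED, GREEN), BLUE)  # (255, 255, 255)
```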
Color Subtractive Process
Color Composite
Digital number: 0-255. Each of the red, green and blue display channels is assigned a DN between 0 and 255; the three DNs together determine the colour of the pixel. For example:

(R, G, B) = (255, 0, 0) gives red
(R, G, B) = (0, 255, 0) gives green
(R, G, B) = (0, 0, 255) gives blue
(R, G, B) = (255, 255, 0) gives yellow
(R, G, B) = (0, 255, 0), (0, 200, 0), (0, 100, 0), (0, 50, 0) give progressively darker greens
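A colour composite can be sketched as stacking three co-registered bands into per-pixel (R, G, B) triplets; the band values below are invented for illustration:

```python
# Three co-registered single-band images (DNs 0-255). A colour
# composite displays one band through each of the red, green and
# blue channels of the display.
band_red   = [[255, 0], [0, 255]]
band_green = [[0, 255], [0, 200]]
band_blue  = [[0, 0], [255, 100]]

# stack the bands pixel by pixel into (R, G, B) triplets
composite = [
    [(band_red[r][c], band_green[r][c], band_blue[r][c])
     for c in range(len(band_red[0]))]
    for r in range(len(band_red))
]

print(composite[0][0])  # the top-left pixel displays as pure red
```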
Characteristics of Digital Image
Spatial resolution
Spectral resolution
Radiometric resolution
Temporal resolution
Spatial Resolution
A qualitative measure of the amount of detail that can be observed in an image. The size of a pixel sets the limit on the spatial resolution.
A measure of the size of the pixel is given by the Instantaneous Field Of View (IFOV), which depends on the altitude and the viewing angle of the sensor.
IFOV is defined as the angle which corresponds to the sampling unit.
Information within an IFOV is represented by a pixel in the image
plane.
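At nadir, the ground-projected pixel size is approximately the altitude multiplied by the IFOV (small-angle approximation); the numbers below are roughly the Landsat TM geometry, used here only for illustration:

```python
# Ground-projected pixel size at nadir from the IFOV and the
# platform altitude, using the small-angle approximation
# D = H * IFOV (IFOV in radians).
def ground_resolution(altitude_m, ifov_rad):
    return altitude_m * ifov_rad

# e.g. a 42.5 microradian IFOV from 705 km gives about a 30 m pixel
d = ground_resolution(705_000, 42.5e-6)
print(round(d, 1))  # approximately 30 m
```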
Temporal Resolution
Meteosat-8: 15 minutes
NOAA-17: 2-14 times per day depending on latitude
Landsat-7: 16 days
SPOT-1, 2, 3: 26 days
16-day Repeat Cycle
Spectral resolution
Radiometric resolution for a 1-, 2-, 8- and 10-bit system
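The number of grey levels an n-bit system can record is 2^n, which is what the figure illustrates:

```python
# Radiometric resolution: an n-bit system can distinguish 2**n
# grey levels (DN values 0 to 2**n - 1).
def grey_levels(bits):
    return 2 ** bits

for bits in (1, 2, 8, 10):
    print(bits, grey_levels(bits))
# 1 bit -> 2 levels, 2 -> 4, 8 -> 256, 10 -> 1024
```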
Characteristics of Satellite
Spaceborne remote sensing is carried out using sensors
that are mounted on satellites, the space shuttle, or space
stations.
The monitoring capabilities of the sensor are to a large
extent determined by the parameters of the satellite’s
orbit.
The path followed by a satellite is referred to as its orbit.
Characteristics of Orbit
Characteristics of Landsat’s Orbit
Types of Orbit
Polar orbit: an orbit with an inclination
angle between 80° and 100°.
Sun-synchronous orbit: a near-polar
orbit chosen in such a way that the
satellite passes over each location on
the Earth's surface at the same local
solar time.
Geostationary orbit: an orbit in which
the satellite is placed above the equator
(inclination 0°) at an altitude of
approximately 36,000 km.
At this altitude the orbital period of the
satellite is equal to the rotational period
of the Earth.
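That last statement can be checked with Kepler's third law: solving T = 2π√(a³/μ) for the orbit radius, with T equal to one sidereal day, gives an altitude close to the 36,000 km quoted above.

```python
import math

# Solve Kepler's third law  T = 2*pi*sqrt(a**3 / mu)  for the
# orbit radius a, taking the period equal to one sidereal day.
MU = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000      # mean Earth radius, m
T_SIDEREAL = 86_164      # sidereal day, s

a = (MU * T_SIDEREAL**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (a - R_EARTH) / 1000
print(round(altitude_km))  # about 35,800 km, i.e. roughly 36,000 km
```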
Swath
As a satellite revolves around the Earth, the sensor "sees"
a certain portion of the Earth's surface. The area imaged
on the surface is referred to as the swath.
NOAA-17: 2800 km
Landsat-7: 185 km
SPOT-5: 60 km
IKONOS: 11 km
Earth Observation Satellites
Low-resolution system
NOAA-17
Orbit: 812 km, 98.7° inclination, sun-synchronous
Swath width: 2800 km (FOV = 110°)
Off-nadir viewing: ±50°
Revisit time: 2-14 times per day, depending on latitude
Spatial resolution: 1 km × 1 km (at nadir), 6 km × 2 km (at edge)
Medium-resolution system
Landsat-7
Orbit: 705 km, 98.2° inclination, sun-synchronous
Swath width: 185 km (FOV = 15°)
Revisit time: 16 days
Spatial resolution: 15 m (PAN), 30 m (bands 1-5, 7), 60 m (band 6)
SPOT-1, 2, 3, 4
Orbit: 832 km, 98.7° inclination, sun-synchronous
Swath width: 60 km
Revisit time: 26 days
Spatial resolution: 10 m (PAN), 20 m (Multispectral)
High-resolution systems
SPOT-5
Orbit: 822 km, 98.7° inclination, sun-synchronous
Swath width: 60 km
Revisit time: 2-3 days
Spatial resolution: 5 m (PAN) 10 m (Multispectral)
IKONOS
Orbit: 681 km, 98.2° inclination, sun-synchronous
Swath width: 11 km
Revisit time: 1-3 days
Spatial resolution: 1 m (PAN), 4 m (Multispectral)
Quickbird satellite image of Banda Aceh before the tsunami
Quickbird satellite image of Banda Aceh after the tsunami
Data Reception, Transmission, and Processing
Data acquired from satellite platforms need to be electronically
transmitted to Earth.
There are three main options for transmitting data acquired by
satellites to the surface.
Contents
Methods of Information Extraction
Visual Image Interpretation
A pair of stereoscopic
aerial photographs can
be used to provide
stereoscopic vision using,
for example, a mirror
stereoscope.
Elements of Visual Interpretation
Digital Image Processing
Digital image processing involves manipulation and
interpretation of digital images with the aid of a
computer. The main steps in digital image processing
are:
Pre-processing
Image enhancement
Image transformation
Image classification and analysis
Pre-processing
Radiometric Correction
Cosmetic Correction
Typical defects removed by cosmetic correction include line dropout, spike noise, and line striping.
Geometric Corrections
Systematic or predictable distortions
Systematic distortions are well understood and
easily corrected by applying formulas derived by
modeling the sources of distortion mathematically.
Distortion due to Earth Rotation
Unsystematic or random distortion
Finding Ground Control Points
Geo-coding
In order to complete the entire rectification process, each
pixel in the corrected image has to be assigned a new DN.
A procedure called resampling is used to determine the DN
to place in the new pixel locations of the corrected output
image. The resultant image is called a geo-coded image.
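A minimal nearest-neighbour resampling sketch; the inverse_map argument is a hypothetical function mapping each output pixel back to input-image coordinates, and the pure row shift below stands in for a real geometric transformation:

```python
# Nearest-neighbour resampling: each pixel of the geo-coded output
# grid takes the DN of the closest pixel in the input image.
def resample_nearest(src, inverse_map, out_rows, out_cols, fill=0):
    out = [[fill] * out_cols for _ in range(out_rows)]
    for r in range(out_rows):
        for c in range(out_cols):
            # map the output pixel back into input-image coordinates
            sr, sc = inverse_map(r, c)
            sr, sc = round(sr), round(sc)
            # keep the fill value where the mapping falls off the image
            if 0 <= sr < len(src) and 0 <= sc < len(src[0]):
                out[r][c] = src[sr][sc]
    return out

src = [[10, 20], [30, 40]]
# toy inverse mapping: a vertical shift of 0.6 pixels
shifted = resample_nearest(src, lambda r, c: (r - 0.6, c), 2, 2)
print(shifted)  # first output row falls outside the input image
```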
Contrast enhancement
Contrast enhancement
involves changing the
original values to
increase the contrast
between targets and
their backgrounds.
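A minimal linear (min-max) stretch over a flat list of DNs, as a sketch of the idea:

```python
# Linear (min-max) contrast stretch: map the occupied DN range
# [lo, hi] of the input onto the full 0-255 display range.
def linear_stretch(pixels):
    lo, hi = min(pixels), max(pixels)
    if hi == lo:
        return [0 for _ in pixels]  # flat image: nothing to stretch
    return [round((p - lo) * 255 / (hi - lo)) for p in pixels]

# a low-contrast image occupying only DNs 84-153
print(linear_stretch([84, 100, 120, 153]))  # [0, 59, 133, 255]
```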
Histogram equalization
A uniform distribution of the input range of values across the full
range may not always be an appropriate enhancement,
particularly if the input range is not uniformly distributed.
This stretch assigns more display values (range) to the frequently
occurring portions of the histogram.
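A sketch of histogram equalization for a list of 8-bit DNs (pure Python, no libraries; the classic CDF-based mapping):

```python
# Histogram equalisation: display values are assigned through the
# cumulative distribution of the DNs, so frequently occurring DN
# ranges receive a larger share of the 0-255 display range.
def equalize(pixels, levels=256):
    n = len(pixels)
    # histogram of the input DNs
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    # cumulative distribution function
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    # map each DN through the normalised CDF
    return [round((cdf[p] - cdf_min) * (levels - 1) / (n - cdf_min))
            for p in pixels]

print(equalize([52, 52, 52, 55, 55, 60]))
```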
Figure: satellite image without contrast enhancement, and its histogram.
Figure: histogram-equalized image and linearly stretched image.
Image Transformations
VI = near infrared / visible red

NDVI = (near infrared − visible red) / (near infrared + visible red)
NDVI is sometimes simply called NVI (normalized
vegetation index).
NDVI (or NVI) is an indicator of the density of biomass:
the larger the value, the denser the vegetation.
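The NDVI formula is a one-liner per pixel; the reflectance values below are invented for illustration:

```python
# NDVI from the near-infrared and visible-red reflectances of a
# pixel: (NIR - red) / (NIR + red), giving values in [-1, 1].
def ndvi(nir, red):
    return (nir - red) / (nir + red)

print(round(ndvi(0.50, 0.08), 2))  # dense vegetation: high NDVI
print(round(ndvi(0.30, 0.25), 2))  # sparse vegetation: low NDVI
```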
Image classification and analysis
Density slicing
1. Supervised classification
2. Unsupervised classification
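Density slicing can be sketched as assigning each DN to a class by interval; the thresholds below are arbitrary illustrative choices:

```python
# Density slicing: the continuous DN range is cut into intervals
# and every pixel is assigned the class of its interval.
def density_slice(dn, bounds=(50, 120, 200)):
    for cls, upper in enumerate(bounds):
        if dn < upper:
            return cls
    return len(bounds)  # DNs at or above the last threshold

row = [10, 60, 130, 220]
print([density_slice(dn) for dn in row])  # one class per interval
```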