
MSc. Remote Sensing Dissertation September 2002

MEASUREMENT OF SUB RESOLUTION EARTHQUAKE DISPLACEMENT USING ASTER DATASET BASED ON THE 2001 EARTHQUAKE OF KUNLUN, CHINA

A dissertation submitted to the University College London for the award


of the degree of MSc in Remote Sensing.

Prepared by Sheikh Nasir B. Kamarudin

Supervised by Dr. J.G. Liu

UNIVERSITY COLLEGE LONDON

DEPARTMENT OF GEOGRAPHY

ACKNOWLEDGEMENTS

Firstly, I would like to express my gratitude to my supervisors, Dr. J.G. Liu and
Dr. Philippa Mason of the Earth and Resource Department at Imperial College
London, for their extensive advice throughout the study.
Thanks to Universiti Teknologi Malaysia and the Head of the Department of Remote
Sensing, Universiti Teknologi Malaysia, Dr. Mazlan Hashim, for his support
and encouragement in pursuing my studies in the United Kingdom.

I would like to thank my friend, Issa Al-Qusaimi who has helped me in my


dissertation.

In the process of building the software, I sought a great deal of advice from
Jimmy, a PhD student at Imperial College, who guided me in developing the
prototypes.

Last but not least, thanks to my family and especially to my wife, Aida, for their
moral support, and to everyone who has helped me improve the writing of this
dissertation.

CONTENTS PAGE NUMBER

LIST OF FIGURES 4
LIST OF TABLES 5
ABSTRACT 6

CHAPTER 1 INTRODUCTION 7
1.1. PROJECT AIMS 7
1.2. SCOPE 8

CHAPTER 2 LITERATURE REVIEW 9


2.1. EARTHQUAKES AND PLATE TECTONICS. 9
2.2. MEASUREMENT OF SUB RESOLUTION TERRAIN
DISPLACEMENTS USING SPOT PANCHROMATIC IMAGERY 10
2.3. IMAGE RECTIFICATION. 11

CHAPTER 3 DATA DESCRIPTION 12


3.1. BACKGROUND OF KUNLUN 12

3.2. DATA PREPARATION 12

CHAPTER 4 THEORY ON THE CROSS-CORRELATION


COEFFICIENT 14
4.1. DEFINITION OF THE CORRELATION 14
4.2. CORRELATION COEFFICIENT 14
4.3. NORMALIZED CROSS CORRELATION (TEMPLATE
MATCHING BY CROSS-CORRELATION ) 16
4.3.1. DISADVANTAGES OF USING CORRELATION AND
TEMPLATE MATCHING 16
4.4. AUTOMATED GCP LOCATION 17

CHAPTER 5 METHODOLOGY 18
5.1. EARTHQUAKE DETECTION ARCHITECTURE 18
5.2. ERMAPPER FORMAT DATASET STRUCTURED 19
5.2.1. ERMAPPER SOFTWARE DEVELOPMENT KIT (SDK)
LIBRARY FUNCTION 19
5.2.2. USER DEFINED FUNCTION 20
5.3. MFC PROGRAMMING 21
5.4. AUTOMATED IMAGEODESY METHOD (BASIC ALGORITHM) 24
5.5. DISPLACEMENT MAP 27
5.6. DELTA X AND DELTA Y IMAGERY 28


5.7. COEFFICIENT IMAGE 28

CHAPTER 6 ANALYSIS AND RESULTS 29


6.1. STAGE 1 TESTING THE PROTOTYPE WITH THE SAME IMAGE 31
6.2. STAGE 2 TESTING THE PROTOTYPE WITH DIFFERENT IMAGES 32
6.3. COMPUTATIONAL COST 35
6.4. RESULT 36
6.4.1. AREA A GREEN 36
6.4.2. AREA B ORANGE 38
6.4.3. AREA C YELLOW 40
6.4.4. AREA D PURPLE 41
6.4.5. AREA E BLUE 42
6.4.6. SUMMARY 43
6.4.7. FAULT DETECTIONS 44
6.5. DISCUSSION 45

CHAPTER 7 CONCLUSION 46

REFERENCES 47
WEB REFERENCES 48
APPENDIX A : FLOWCHART OF THE SOURCE CODE 49
APPENDIX B : ERMAPPER DATA STRUCTURE 50
APPENDIX C : AREA A GREEN 51
APPENDIX D : SOURCE CODE 52


LIST OF FIGURES PAGE NUMBER


Figure 2.1 Earthquake structure formations 10
Figure 2.2 Normal Distribution Graph 11
Figure 3.1 A field photo of the damage-zone of the earthquake near
the National 109 road from Golmud to Lhasa, 200km east
of the epicentre 13
Figure 3.2 East Kunlun Mountains, Western China 14
Figure 4.1 Correlation illustrated 16
Figure 5.1 Earthquake Detection Architecture 19
Figure 5.2 Interface design for this prototype 24
Figure 5.3 Area correlation for image cross correlation 25
Figure 5.4 Inferring displacement maps between image pairs 27
Figure 5.5 Histogram Delta X and Delta Y 29
Figure 5.6 Coefficient Histogram 29
Figure 6.1 RGB Kunlun pre-Earthquake overlay vector of fault 31
Figure 6.2 Vector Map for Same Image Using L = 1 32
Figure 6.3 Area D: - sample of an area from the North-West
Mountain in Kunlun 33
Figure 6.4 A generated vector of the displacement (shift) overlay to
the RGB Kunlun before Earthquake 33
Figure 6.5 Histogram of Delta X in meter using Non Cumulative Style 34
Figure 6.6 Histogram of Delta Y in meters using Non Cumulative
Style 34
Figure 6.7 Histogram of Coefficient 35
Figure 6.8 Delta Y in Green and Delta X in Red representing False
Colour Composite and for the colour in histograms for
unrectified dataset 37
Figure 6.9 Displacement of the area A and the overlay of unrectified
dataset 38
Figure 6.10 Delta Y in Green and Delta X in Red representing False
Colour Composite and for the colour in histograms for
rectified dataset 38
Figure 6.11 Delta Y in Green and Delta X in Red representing False
Colour Composite and for the colour in histograms for
rectified dataset 39
Figure 6.12 Displacement of the area B and the overlay dataset using
pseudocolor 40
Figure 6.13 Delta Y in Green and Delta X in the histograms and RGB
FCC 321 41
Figure 6.14 Delta Y in Green and Delta X in Red representing False
Colour Composite (FCC) and for the colour in histograms
for rectified dataset 42
Figure 6.15 Displacement of the area B and the overlay dataset on
pseudocolor 42
Figure 6.16 Delta Y in Green and Delta X in Red representing False
Colour Composite (FCC) and for the colour in histograms
for rectified dataset 43
Figure 6.17 Displacement of the area E and the overlay dataset on
pseudocolor 43
Figure 6.18 Displacement Pattern 44
Figure 6.19 Fault Detection 45


LIST OF TABLES PAGE NUMBER

Table 1 Analysis area in box 29
Table 2 Sample data after processing using the same dataset 31
Table 3 Sample data after processing using a different dataset 34
Table 4 Comparing computational cost – NCC (µs/pixel) 35


ABSTRACT

The rapidly growing field of remote sensing is beginning to supply massive quantities of high-
resolution imagery of the Earth and other planets. In the earth sciences, parallel
supercomputers have long played a prominent role in the visualization of this imagery, and
in other image processing applications designed to enhance and display the obtained
information. A new approach is proposed: applying the normalized cross-correlation
technique known as imageodesy to the measurement of earthquake displacement from
optical data on a PC platform. Eventually this technique could be a substitute for InSAR.
Furthermore, the method is able to resolve displacements down to the sub-pixel level. This
project includes implementing the method as an add-on to the ER Mapper application
software.


CHAPTER 1 INTRODUCTION

Rectification of a remotely sensed image is commonly implemented by applying polynomial
regression models to the image coordinates and map coordinates of ground control points.
The other approach to image rectification is the statistical approach, which, by means of a
Ground Control Point (GCP) dataset, establishes a mathematical relationship between image
coordinates using standard statistical procedures. These relationships can be used to correct
the image geometry irrespective of the analyst's knowledge of the source and type of
distortion (Schowengerdt, 1995). On the other hand, not all geometric differences should be
treated as distortion; they could be changes that occur in the earth's crust or other
phenomena. Changes due to earth surface movement have always been difficult for earth
scientists to analyse. Temporal optical datasets of the surface of the earth are detailed,
uniform and spatially comprehensive over wide areas. The changes are, however, minimal to
the naked eye. Most distortions are attributable to sensor-attitude variations that can be
identified and removed through recognition and measurement of their distinct spatial
characteristics. This allows the possibility of isolating and revealing patterns of
environmental change. The terrain displacements one may seek to measure may be a small
fraction of the pixel size. In some cases, the displacement may exceed the pixel size, but one
may still seek sub-pixel precision. A method capable of this is imageodesy
(Crippen, 1992).

The basic technique of Imageodesy is to compare the “before” and “after” image at each grid
cell by iteratively interpolating one and testing its correlation with the other. The actual
algorithm will be discussed in the Methodology section. Currently, these methods are
successfully used in observing other planets to detect the surface change and for measuring
very small ground displacements.

By implementing this method for Advanced Spaceborne Thermal Emission and Reflection
Radiometer (ASTER), more of these applied algorithms could be used to improve the
detection of the displacements especially for the Earthquake since the changes are very
small. This project reports the testing of such algorithms for ASTER.

1.1. PROJECT AIMS

The principal objective of this project was to assess the feasibility of measuring earthquake
changes by sub-pixel measurement (Imageodesy).

The scientific objective is to test and verify the effectiveness of the Imageodesy technique.
This is an alternative approach to the SAR Interferometry (InSAR) technique for yielding the


regional co-seismic deformation field (Wright et al. 1999; Peltzer et al. 1999) and is especially
sensitive to horizontal movement while InSAR is more sensitive to the vertical deformation.
Since the use of InSAR data is prohibited in this case, imageodesy is used as an alternative.
In fact, even when InSAR is available, GPS observations are also needed to extend the
analysis and interpretation of earthquakes (Ramon, 2001).

A prototype has been developed using the Microsoft Foundation Class (MFC) library in
Microsoft Visual C++ 6.0, as an add-on application for ER Mapper 6.2, to show the
applicability of the technique used.

The secondary objective of this research is to evaluate the results with the exemplary case of
strike-slip fault movement in a typical intra-plate tectonic zone, using the method above for
the derivation, verification and quantitative interpretation of the 2-D deformation field. Apart
from that, establishing the image quality required for displacement measurement will also be
discussed later.

In order to demonstrate the feasibility of this method, an earthquake event in the Kunlun
Mountains, Western China has been selected (which will be described in Chapter 3). A
review of existing literature (Chapter 2) provides a general overview of various research
studies related to some of the aspects of this thesis. A detailed analysis of the data used
together with the current software development has been included. The principles of using
correlation in automated Ground Control Points (GCP) will be described in Chapter 4.
Chapter 5 focuses on the method and architecture of the technique. Finally, the analysis,
results and discussions chapters are given.

1.2. SCOPE

This project is currently shared with another MSc student, who is working on the geology
and tectonic features and will later study the movements through observation. This project
will focus on the measurement of displacement using panchromatic imagery from ASTER
data. As ASTER generates images containing millions of pixels, it will be possible to make a
high-resolution ground-displacement "map" showing the movement of every single pixel
affected by the earthquake. Since this method was designed for high-end computers, image
sizes are restricted to 2000 x 2000 pixels. Currently, this prototype is built for the PC
platform (WIN98, WIN ME) and Windows NT (WINDOWS 2000 and WINDOWS XP) with
some limitations, so it might not be usable on UNIX machines.


CHAPTER 2 LITERATURE REVIEW

Earthquakes are just one of the earth's natural disasters. Spurred by rising economic costs,
there has been a dramatic increase in efforts aimed at estimating the direct and indirect
losses caused by this phenomenon. When the field of earthquake prediction took off in the
mid-1970s, seismologists had high hopes of finding geologic warning signs that would allow
them to issue timely evacuation notices. After years of inconclusive research, only a
determined few of these investigators are still engaged in the effort to predict earthquakes.
Federal support has shifted to general studies designed to minimize quake after-effects.
Better estimation of potential earthquake losses is extremely challenging due to shortages in
the kind of empirical data needed for more accurate estimates (Kathleen 1998, Tom
1992). Remote sensing methods have also found their way into this disaster management.
One example of a remote sensing application to earthquake hazards was provided by
Dr. Robert Crippen of the Jet Propulsion Laboratory (JPL). Imageodesy, which is a concatenation
of "image geodesy" and a partial acronym for "Image Multitemporal Analysis Geodesy"
(Crippen, 1992), has been widely discussed. This method has several potential advantages
which could be used to detect changes.

2.1. EARTHQUAKES AND PLATE TECTONICS.

The world's earthquakes are not randomly distributed over the Earth's surface. They tend to
be concentrated in narrow zones. Plate tectonics tells us that the Earth's rigid outer shell
(lithosphere) is broken into a mosaic of oceanic and continental plates which can slide over
the plastic aesthenosphere, which is the uppermost layer of the mantle. The plates are in
constant motion. Where they interact, along their margins, important geological processes
take place, such as the formation of mountain belts, earthquakes, and volcanoes (Figure 2.1).

Figure 2.1
Earthquake structure formations
(http://neic.usgs.gov/neis/plate_tectonics/rift.html)

Plate tectonics confirms that there are four types of seismic zones. The first follows the line of
midocean ridges. Activity is low, and it occurs at very shallow depths. The point is that the


lithosphere is very thin and weak at these boundaries, so the strain cannot build up enough to
cause large earthquakes. Associated with this type of seismicity is the volcanic activity along
the axis of the ridges.

The second type of earthquake associated with plate tectonics is the shallow-focus event
unaccompanied by volcanic activity. The friction between the plates can be so great that very
large strains can build up before they are periodically relieved by large earthquakes.
Nevertheless, activity does not always occur along the entire length of the fault during any
one earthquake.

The third type of earthquake is related to the collision of oceanic and continental plates. One
plate is thrust or subducted under the other plate so that a deep ocean trench is produced.
This type of earthquake can be shallow, intermediate, or deep, according to its location on the
downgoing lithospheric slab. Such inclined planes of earthquakes are known as Benioff
zones.

The fourth type of seismic zone occurs along the boundaries of continental plates. Typical of
this is the broad swath of seismicity from Burma to the Mediterranean, crossing the
Himalayas, Iran, and Turkey, to Gilbraltar. Within this zone, shallow earthquakes are
associated with high mountain ranges where intense compression is taking place.
Intermediate- and deep-focus earthquakes also occur and are known in the Himalayas and in
the Caucasus. The interiors of continental plates are very complex, much more so than island
arcs. Kunlun, China falls in this category. In fact, these areas are still not yet known the full
relationship of the Alps or the East African rift system to the broad picture of plate tectonics
(USGS). Using imageodesy, measurements of earthquake displacement either vertical or
horizontal displacement were able to be detected because of high resolution data imagery
were used.

2.2. MEASUREMENT OF SUB RESOLUTION TERRAIN DISPLACEMENTS USING SPOT PANCHROMATIC IMAGERY.

The fundamental concept of Robert Crippen's method is based on the statistical concept of a
normal distribution: accurate and precise measurements can be obtained from a set of
measurements that may individually be unreliable and even relatively crude. Imageodesy
typically compares thousands of pixels in the before image to thousands of pixels in the after
image in determining each displacement vector and, as a result, is capable of high levels of
spatial precision. This technique will be used to derive the local offset between the
pre-earthquake and post-earthquake imagery. The method can measure ground
displacements that are only a fraction of the pixel size of the satellite imagery used
(Crippen & Blom, 1996). If the measurement errors are random, their probability distribution
is symmetrical and bell-shaped, with a peak at the true value (Figure 2.2).

Figure 2.2 Normal Distribution Graph (μ = mean, σ = standard deviation; the axis is marked at μ−σ, μ and μ+σ)

The shape of the bell can be defined well enough to determine the peak accurately only if
enough measurements are made. Each pixel value is a measure of radiance weighted at
sub-resolution scales by the point-spread function imposed by the atmosphere and sensor
optics. Each pixel value therefore varies with its geographic position at sub-resolution
scales, and is thereby an indirect measure of geographic position.

Previous tests by Robert Crippen have shown that the imageodetic method was successful on
similar image pairs in the sub-resolution measurement of barchan and longitudinal sand
dune migration, with reasonable measured movements (Crippen, 1992). The major drawback
of the procedure was its enormous computational demand: the machine used was a
CRAY T3D at JPL (Paul S., 1999), processing SPOT data.

Clearly, the problem faced is not a simple task. Measurement errors that must be overcome
(radiometric and geometric noise in the image) are not fully random, and other data
inadequacies are likely.

2.3. IMAGE RECTIFICATION.

Non-systematic distortions arise from the sensor system's attitude, velocity, and altitude, and
can be corrected only through the use of ground control points (Jensen, 1986). A second
approach is the statistical one, which, by means of a GCP data set, establishes a
mathematical relation between image coordinates and their corresponding map coordinates
using standard statistical procedures. These relationships can be used to correct the image
geometry irrespective of the analyst's knowledge of the source and the type of distortion
(Richards, 1995).


CHAPTER 3 DATA DESCRIPTION

In order to illustrate the algorithms, an ASTER dataset of the Kunlun Mountains, China, an
area of active earthquakes, has been chosen.

3.1 BACKGROUND OF KUNLUN

A massive left-lateral strike-slip earthquake occurred on 14 Nov 2001 at 09:26:18 UTC in
the East Kunlun Mountains, western China (Figure 3.2). Following this major earthquake,
seven aftershocks of magnitude greater than 5.0 occurred along the East Kunlun Fault zone
(EKF). The earthquake occurred in an entirely uninhabited environment above 4500 m. The
ground is frozen to the surface at present, preserving kinematic indicators along the entirety
of the fault in unusual detail. When the ground thaws in early summer, much of the area will
return to swamp, and this uniquely well-preserved record will be lost.

This large earthquake can be regarded as a natural


tectonic experiment that provides a rare opportunity to
study motion on a large intra-plate strike-slip fault in
great detail. The EKF is one of several major fault
zones near the northern edge of the Qinghai-Tibet
Plateau which form a complex fault system
accommodating the internal clockwise rotation and
eastward motion of the block (Matte et al. 1996)
produced by the collision of India and Asia. The
displacement caused by the earthquake is expected to be as great as 10 m, and the damage
zone is longer than 200 km (Figure 3.1). The E-W trending Eastern Kunlun Fault (EKF) is a
very active intra-plate zone along which a number of major earthquakes have occurred in
recent years (e.g. Ms 7.9 Manyi earthquake, 8 Nov. 1997).

Figure 3.1: A field photo of the damage-zone of the earthquake near the National 109 road from Golmud to Lhasa, 200 km east of the epicentre.

3.2 DATA PREPARATION

The ASTER sensor on board the Terra-1 earth observation satellite has 14 multi-spectral
bands: three VNIR (15 m), six SWIR (30 m) and five TIR (90 m), plus a backward-pointing
band 3 (NIR) for along-track stereo capacity. With a resolution of 15 m using the three VNIR
bands, the pixel comparison will be much more detailed than with a single ETM+ Pan band.
ASTER imagery has


been used to assess the earthquake's damage-zones and regional tectonics. Two datasets,
dated Oct 7, 2001 and Dec 13, 2001, were used for the image comparison. Even though
these datasets were only two months apart, their qualities were quite different because of
differences in sun illumination and season. Initially, the datasets were rectified to initial GCPs
at the four corners. All images contain distortions of some kind. There are many contributory
effects: different kinds of platforms in the radiation-gathering set-up, Earth rotation beneath
the platform, and both parallax and scale. Therefore, the pre-earthquake imagery was
rectified to the post-earthquake imagery. Both rectified and unrectified imagery were used in
the analysis for accuracy assessment.

Figure 3.2 East Kunlun Mountains, Western China

In this project, the prototype only used panchromatic images for comparison. Since ASTER
does not have a panchromatic band, only the VNIR bands were utilized, and a simulated
panchromatic image was created by summing the three bands and averaging, as follows:

B_Pan = (B1 + B2 + B3N) / 3
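As a concrete sketch of this averaging, assuming the three VNIR bands are already held as flat arrays of DN values (the function name and layout are illustrative, not the prototype's actual code):

```cpp
#include <vector>
#include <cstddef>

// Simulate a panchromatic band by averaging the three ASTER VNIR bands
// (B1, B2 and the nadir-looking B3N), pixel by pixel.
std::vector<double> simulatePanBand(const std::vector<double>& b1,
                                    const std::vector<double>& b2,
                                    const std::vector<double>& b3n)
{
    std::vector<double> pan(b1.size());
    for (std::size_t i = 0; i < b1.size(); ++i)
        pan[i] = (b1[i] + b2[i] + b3n[i]) / 3.0;  // B_Pan = (B1 + B2 + B3N) / 3
    return pan;
}
```

The same per-pixel average can of course be computed in any image-processing package; it is shown here only to make the formula unambiguous.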


CHAPTER 4 THEORY ON THE CROSS-CORRELATION COEFFICIENT

4.1. DEFINITION OF THE CORRELATION

The word correlation is used in everyday life to denote some form of association. An
association between variables means that the value of one variable can be predicted, to
some extent, by the value of the other. A correlation is a special kind of association: there is a
linear relation between the values of the variables. A non-linear relation can be transformed
into a linear one before the correlation is calculated (Garson). In statistical terms the
correlation denotes association between two quantitative variables. By assuming that the
association is linear, one variable increases or decreases a fixed amount for a unit increase
or decrease in the other. Correlation is a bivariate measure of association (strength) of the
relationship between two variables.

For a set of variable pairs, the correlation coefficient gives the strength of the association.
The square of the correlation coefficient is the fraction of the variance of one variable that
can be explained by the variance of the other. The relation between the variables is called
the regression line, defined as the best-fitting straight line through all value pairs, i.e., the
one explaining the largest part of the variance.

4.2. CORRELATION COEFFICIENT

The degree of association is measured by a correlation coefficient, denoted by r. It is


sometimes called Pearson's correlation coefficient after its originator and is a measure of
linear association. If a curved line is needed to express the relationship, other and more
complicated measures of the correlation must be used.

The correlation coefficient is measured on a scale that varies from +1 through 0 to −1
(eBMJ). Complete correlation between two variables is expressed by either +1 (perfect
positive linear relationship) or −1 (perfect negative linear relationship). It is usually reported
in terms of its square (r²), interpreted as the percentage of variance explained. When one
variable increases as the other increases, the correlation is positive; when one decreases as
the other increases, it is negative. Complete absence of correlation is represented by 0.
Figure 4.1 gives some graphical representations of correlation.


Figure 4.1 Correlation illustrated. (EBMJ, 2002)

The calculation of the correlation coefficient is as follows, with x representing the values of
the independent variable and y the values of the dependent variable. The formula used is:

r = Σ(x − x̄)(y − ȳ) / √[ Σ(x − x̄)² · Σ(y − ȳ)² ]

A part of the variation in one of the variables (as measured by its variance) can be thought of
as being due to its relationship with the other variable, and another part as due to
undetermined (often "random") causes. The part due to the dependence of one
variable on the other is measured by r².
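The formula above can be written directly as a small routine; this is an illustrative sketch in plain C++ (the function name is an assumption, and the degenerate case of zero variance is not handled):

```cpp
#include <vector>
#include <cmath>
#include <cstddef>

// Pearson's correlation coefficient for paired samples x and y:
// r = sum((x - mean_x)(y - mean_y)) / sqrt(sum((x - mean_x)^2) * sum((y - mean_y)^2))
double pearsonR(const std::vector<double>& x, const std::vector<double>& y)
{
    const std::size_t n = x.size();
    double mx = 0.0, my = 0.0;
    for (std::size_t i = 0; i < n; ++i) { mx += x[i]; my += y[i]; }
    mx /= n; my /= n;

    double sxy = 0.0, sxx = 0.0, syy = 0.0;
    for (std::size_t i = 0; i < n; ++i) {
        sxy += (x[i] - mx) * (y[i] - my);   // covariance term
        sxx += (x[i] - mx) * (x[i] - mx);   // variance of x
        syy += (y[i] - my) * (y[i] - my);   // variance of y
    }
    return sxy / std::sqrt(sxx * syy);      // r in [-1, +1]
}
```

For perfectly linearly related pairs the routine returns +1 (or −1 for a perfect negative relation), matching the scale described above.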


4.3. NORMALIZED CROSS CORRELATION (TEMPLATE MATCHING BY CROSS-CORRELATION)

The use of cross-correlation for template matching is motivated by the squared Euclidean
distance measure d², given by (J.P. Lewis)

d²_{f,t}(u, v) = Σ_{x,y} [f(x, y) − t(x − u, y − v)]²

(where f is the image and the sum is over x, y under the window containing the feature t
positioned at u, v). In the expansion of d²

d²_{f,t}(u, v) = Σ_{x,y} [f(x, y)² − 2 f(x, y) t(x − u, y − v) + t(x − u, y − v)²]

the term Σ t(x − u, y − v)² is constant. If the term Σ f(x, y)² is approximately
constant, then the remaining cross-correlation term

c(u, v) = Σ_{x,y} f(x, y) t(x − u, y − v)     (1)

is a measure of the similarity between the image and the feature.

4.3.1. DISADVANTAGES OF USING CORRELATION AND TEMPLATE MATCHING.

Correlation is symmetrical and does not provide direction information. If other variables also
cause the dependent variable, then any covariance they share with the given independent
variable will be falsely attributed to that independent variable. Also, to the extent that there
is a nonlinear relationship between the two variables being correlated, correlation will
understate the relationship. Correlation will also be attenuated to the extent that there is
measurement error, including the use of sub-interval data or artificial truncation of the range
of the data. Correlation can also be a misleading average if the relationship varies
depending on the value of the independent variable.

For template matching, if the image energy Σ f(x, y)² varies with position, matching using
equation (1) can fail. For example, the correlation between the feature and an exactly
matching region in the image may be less than the correlation between the feature and a
bright spot. Apart from that, the range of c(u, v) is dependent on the size of the feature.
Equation (1) is also not invariant to changes in image amplitude, such as those caused by
changing lighting conditions across the image sequence.


The correlation coefficient overcomes these difficulties by normalizing the image and feature
vectors to unit length, yielding a cosine-like correlation coefficient

g(u, v) = Σ_{x,y} [f(x, y) − f̄_{u,v}][t(x − u, y − v) − t̄] / { Σ_{x,y} [f(x, y) − f̄_{u,v}]² · Σ_{x,y} [t(x − u, y − v) − t̄]² }^0.5     (2)

where t̄ is the mean of the feature and f̄_{u,v} is the mean of f(x, y) in the region under the
feature. Equation (2) is referred to as normalized cross-correlation (NCC) (J.P. Lewis, 1995).
NCC has been used in building the prototypes; this will be discussed in the next
chapter.
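Equation (2) can be sketched directly in plain C++. The row-major image layout and the function signature here are illustrative assumptions, not the dissertation prototype's actual code:

```cpp
#include <vector>
#include <cmath>
#include <cstddef>

// Normalized cross-correlation (equation 2) between a w x h feature t and
// the image region under it when the feature is positioned at (u, v).
// Both arrays are stored row-major; imgW is the image width in pixels.
double ncc(const std::vector<double>& image, int imgW,
           const std::vector<double>& tmpl, int w, int h,
           int u, int v)
{
    // Means of the feature and of the image region under it.
    double fMean = 0.0, tMean = 0.0;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            fMean += image[(v + y) * imgW + (u + x)];
            tMean += tmpl[y * w + x];
        }
    fMean /= (w * h);
    tMean /= (w * h);

    // Numerator and the two normalizing sums of equation (2).
    double num = 0.0, df = 0.0, dt = 0.0;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            const double fd = image[(v + y) * imgW + (u + x)] - fMean;
            const double td = tmpl[y * w + x] - tMean;
            num += fd * td;
            df += fd * fd;
            dt += td * td;
        }
    return num / std::sqrt(df * dt);  // g(u, v) in [-1, +1]
}
```

Because of the mean subtraction and normalization, the result is unchanged if the feature is scaled or offset in brightness, which is exactly the invariance that plain correlation (equation 1) lacks.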

4.4. AUTOMATED GCP LOCATION

The registration process requires a pair of images to be precisely rectified to the same points
on the ground. Automated GCP location uses a correlation method to extract spatial features
from two images and match them.

By using this method, small areas in each image ("chips" or "patches") can serve as the
spatial features for automated registration. It is not necessary to precisely specify the location
of corresponding chips in both images, because spatial cross-correlation will be used to
determine the required shift for registration. The chip should be small enough that a simple
shift is sufficient for registration of each pair of chips; this keeps the measurement free of
internal distortions within a chip. The differences between the shifts determined from different
chips define global rotation, skew or other misalignments between the full images
(Schowengerdt, 1995). However, not all of the points will fit the absolute ground-truth
coordinates, because of temporal changes in the data.
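The chip-matching idea can be sketched as an exhaustive search over integer offsets, keeping the offset with the highest normalized cross-correlation. This is an illustrative sketch only: the names are assumptions, and the real imageodesy procedure additionally interpolates around the peak to reach sub-pixel precision.

```cpp
#include <vector>
#include <cmath>
#include <cstddef>
#include <utility>

// NCC between a w x h chip taken from `before` at (bx, by) and the region of
// `after` displaced by (dx, dy). Both images are row-major with width imgW.
static double chipNcc(const std::vector<double>& before,
                      const std::vector<double>& after, int imgW,
                      int bx, int by, int w, int h, int dx, int dy)
{
    double mb = 0.0, ma = 0.0;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            mb += before[(by + y) * imgW + (bx + x)];
            ma += after[(by + dy + y) * imgW + (bx + dx + x)];
        }
    mb /= (w * h); ma /= (w * h);
    double num = 0.0, db = 0.0, da = 0.0;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            const double b = before[(by + y) * imgW + (bx + x)] - mb;
            const double a = after[(by + dy + y) * imgW + (bx + dx + x)] - ma;
            num += b * a; db += b * b; da += a * a;
        }
    return num / std::sqrt(db * da);
}

// Scan offsets in [-search, +search] on both axes and return the (dx, dy)
// with the highest correlation: the estimated shift of this chip.
std::pair<int, int> findShift(const std::vector<double>& before,
                              const std::vector<double>& after,
                              int imgW, int imgH,
                              int bx, int by, int w, int h, int search)
{
    double best = -2.0;                       // below the NCC minimum of -1
    std::pair<int, int> shift(0, 0);
    for (int dy = -search; dy <= search; ++dy)
        for (int dx = -search; dx <= search; ++dx) {
            // Skip offsets that would read outside the "after" image.
            if (bx + dx < 0 || by + dy < 0 ||
                bx + dx + w > imgW || by + dy + h > imgH) continue;
            const double c = chipNcc(before, after, imgW, bx, by, w, h, dx, dy);
            if (c > best) { best = c; shift = { dx, dy }; }
        }
    return shift;
}
```

Repeating this search on a grid of chips yields one shift vector per grid cell, which is the raw material for the displacement maps discussed in Chapter 5.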


CHAPTER 5 METHODOLOGY

This chapter describes the process and techniques implemented to develop the software. A
flowchart and the programming code are included in Appendix A and Appendix D respectively.
Below is the list of programs used to develop the software and to analyse the information.
1. Microsoft Visual C++ 6.0 (MFC)
2. ER Mapper Library (SDK) Version 6.0
3. ER Mapper Version 6.2
In order to understand the techniques used in greater detail, it is important to understand the
structure of the prototype. The next section describes the architecture of the prototype, and
Section 5.4 onwards describes the framework and details of the method used.

5.1. EARTHQUAKE DETECTION ARCHITECTURE


Figure 5.1 shows the design of the architecture for earthquake detection. The design has
four modules, which will be discussed in the following sections. Two images covering the
same area are loaded into the program. Once the DN array is available, features are
extracted and passed to the automated imageodesy module, which processes and allocates
the block DN values for calculation. Each calculation made is kept in an array. A shift is
accepted as stable and accurate if its peak correlation exceeds the threshold value set at the
start; displacement maps and other relevant outputs are then generated for accuracy
assessment. If the shift is unstable, inaccurate or out of the specified range, it is rejected
based on that threshold value, and a NULL value is registered at the matching point in the
displacement maps and in the delta X, delta Y and coefficient outputs.

[Flowchart: the before and after images are read in, features are extracted, and automated imageodesy is applied; shifts judged stable and accurate yield the precision displacement maps and the delta X, delta Y and coefficient outputs.]

Figure 5.1: Earthquake Detection Architecture (Modified after Paul S. et al, 1999)
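The accept/reject step of the architecture can be sketched as follows. The NULL marker value and the flat array layout are assumptions for illustration, not the prototype's actual implementation:

```cpp
#include <vector>
#include <cstddef>

const double NULL_VALUE = -9999.0;  // assumed marker for rejected grid cells

// For each grid cell, keep the measured shift only when its peak correlation
// coefficient reaches the threshold; otherwise register NULL in the
// delta X, delta Y and coefficient outputs.
void applyThreshold(const std::vector<double>& coeff,
                    std::vector<double>& deltaX,
                    std::vector<double>& deltaY,
                    std::vector<double>& coeffOut,
                    double threshold)
{
    for (std::size_t i = 0; i < coeff.size(); ++i) {
        if (coeff[i] >= threshold) {
            coeffOut[i] = coeff[i];      // accepted: keep shift and coefficient
        } else {
            deltaX[i]   = NULL_VALUE;    // rejected: register NULL everywhere
            deltaY[i]   = NULL_VALUE;
            coeffOut[i] = NULL_VALUE;
        }
    }
}
```

Using a marker value rather than deleting cells keeps the outputs aligned with the dataset grid, so the displacement maps can be overlaid directly on the imagery.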


5.2. ER MAPPER DATASET FORMAT STRUCTURE

The first step in the project was to understand the data model of the ER Mapper raster
dataset (ERS). A diagram of the data model structure is included in Appendix B for reference.
In brief, an ER Mapper raster dataset is made up of two files:-

The dataset header file


• The dataset header is an ASCII file describing the raster data in the data file. The
header file normally has the same name as the data file it describes, with the
extension ".ERS" added.

The data file


• The data file contains the data itself. The raster data is stored in a binary Band-
Interleaved by Line (BIL) format with a pixel data type defined in the accompanying
header file.
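For orientation, a minimal dataset header might look like the fragment below. This is an
illustrative sketch based on the general layout of ER Mapper headers; the exact field names
and values shown are assumptions, not copied from the project's datasets.

```
DatasetHeader Begin
        Version          = "6.0"
        DataSetType      = ERStorage
        DataType         = Raster
        ByteOrder        = LSBFirst
        RasterInfo Begin
                CellType         = Unsigned8BitInteger
                NrOfLines        = 2449
                NrOfCellsPerLine = 2449
                NrOfBands        = 1
        RasterInfo End
DatasetHeader End
```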

5.2.1. ER MAPPER SOFTWARE DEVELOPMENT KIT (SDK) LIBRARY FUNCTIONS

The ER Mapper company developed the SDK purposely to enable developers to manipulate
the image file formats supported by ER Mapper. These formats include the alg, ers and ecw
formats (as well as any other formats supported via raster translators).

The library provides many functions to perform specific tasks. For example, to manipulate a
dataset header the following functions are available:-
• ea_dshdr() Allocates memory for a dataset header
• ef_dshdr() Frees memory allocated for a dataset header
• eg_dshdr() Gets (reads) a dataset header from disk
• cp_dshdr() Copies a dataset header
• ep_dshdr() Puts (writes) a dataset header to disk
• ERM_create_default_algorithm
• malloc_or_die Memory allocation/deallocation
• PE_initialise Virtual dataset and algorithm
• vds_open_alg_rgbh Opens an algorithm for processing to RGB and Height
values. The output data is a 32-bit BIP format, being 8 bits each of HRGB. Data is
read using the vds_read() call.

CHAPTER 5 METHODOLOGY 19
MSc. Remote Sensing Dissertation September 2002

Each of these functions returns 0 when it has succeeded in performing its task (ER Mapper,
2001).

The most important step in using these functions is to set up the registry variables. For each
executable, there must be a key in:

HKEY_LOCAL_MACHINE/SOFTWARE/Earth Resource Mapping

The name of the key must be the executable name followed by "(libversion 6.0)". Inside this
key must be a string value called BASE_PATH, whose value must be the path to the
application runtime area (e.g. INSTALLDIR\runtime).

For example, the registry entry for the sample readdata.exe application would need to be set
up as follows:

HKEY_LOCAL_MACHINE
SOFTWARE
Earth Resource Mapping
readdata(libversion6.0)
BASE_PATH="INSTALLDIR\redistributable\Application"

where INSTALLDIR is the directory into which the SDK was installed. For further details
please refer to the ER Mapper website using the keyword SDK.
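The entry above can equally be created with a Windows .reg file. The fragment below is a
sketch of such a file for the readdata.exe example; INSTALLDIR is a placeholder, as in the
text, and must be replaced with the actual installation path.

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Earth Resource Mapping\readdata(libversion 6.0)]
"BASE_PATH"="INSTALLDIR\\redistributable\\Application"
```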

5.2.2. USER-DEFINED FUNCTIONS

Apart from the available functions, some functions were created to make the process more
efficient and to reduce the size of the program. A few of these functions are listed below:-
• Statistic
o Calculates the mean and variance
• Read
o Reads a specified area of the image and keeps it in memory. This function can be
used to restrict a specific search window configured by the user.
• SetupAlgorithmDefaults
o Sets up a basic algorithm as a default. If the dataset is not in an algorithm format,
this function sets the dimensions (in pixels) of the output to be displayed by the
algorithm.

CHAPTER 5 METHODOLOGY 20
MSc. Remote Sensing Dissertation September 2002

• int conv_ll_xy(double a, double a1, int cellinfo)
o Converts longitude/latitude information to a pixel position
• double conv_xy_ll(double a, int x, int cellinfo)
o Converts a pixel position to longitude/latitude information
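As a rough sketch of what such conversion helpers do, the snippet below assumes a simple
north-up geocoding with a known origin and square cells (15 m for ASTER VNIR). The struct
and function names are hypothetical and this is not the prototype's actual conv_ll_xy /
conv_xy_ll implementation.

```cpp
#include <cassert>
#include <cmath>

// Hypothetical georeferencing record: a north-up image with a known
// top-left origin and square ground cells.
struct CellInfo {
    double origin_easting;   // easting of the top-left corner (metres)
    double origin_northing;  // northing of the top-left corner (metres)
    double cell_size;        // ground size of one pixel (metres)
};

// Map coordinates -> pixel indices (cf. conv_ll_xy)
int easting_to_col(double easting, const CellInfo& c) {
    return (int)std::floor((easting - c.origin_easting) / c.cell_size);
}
int northing_to_row(double northing, const CellInfo& c) {
    // the row index grows southwards in a north-up image
    return (int)std::floor((c.origin_northing - northing) / c.cell_size);
}

// Map pixel indices -> coordinates of the cell's top-left corner (cf. conv_xy_ll)
double col_to_easting(int col, const CellInfo& c) {
    return c.origin_easting + col * c.cell_size;
}
double row_to_northing(int row, const CellInfo& c) {
    return c.origin_northing - row * c.cell_size;
}
```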

5.3. MFC PROGRAMMING

Before starting the actual program, a set of data structures needs to be declared in a
Microsoft Visual C++ header (*.h). Below are the data structures that keep the displacement
positions (Code 1 and 2) and some of the declarations of global variables (Code 3), which will
be used to explain some of the following sections.
// Code 1
typedef struct CorrelationHeader {
    double easting_b4;
    double northing_b4;
    double easting_af;
    double northing_af;
    double cross_cof;
    int shift_x;
    int shift_y;
} CorrelationHeader;

// Code 2
typedef struct Correlation {
    double cross_cof;
    int easting_b4;
    int northing_b4;
    int easting_af;
    int northing_af;
} Correlation;

// Code 3
char *p_b4Name, *p_afName;            // ERS dataset names
Algorithm *p_alg_b4, *p_alg_af;       // ERS dataset structures
UINT8 target[6000000], temp_target[900];
UINT8 search[6000000], temp_search[900];
UINT8 deltaX;
UINT8 deltaY;
Correlation cross_cof[200];
CorrelationHeader EQ;

CHAPTER 5 METHODOLOGY 21
MSc. Remote Sensing Dissertation September 2002

After declaring the variables, a few functions were also created to read data and to support
the prototype's iterative calculation, as described in Section 5.2.2. Below are the function
declarations used to allocate memory in the buffer, to set up an ERS dataset to be read in
algorithm format, the statistic function, and the conversion functions between coordinates and
pixel positions.

// Code 4
// Internal (private) support functions
static void *allocate_memory(size_t size);

// Internal (private) support functions — ER Mapper
void SetupAlgorithmDefaults1(LPTSTR lpsz, DatasetHeader *p_dsh, int alg_height,
                             int alg_width, int mode);
void read1(Algorithm *p_alg, int nr_rows, int nr_columns, int a, int b,
           int shiftm, int shiftn, int mode);

// Iteration function and calculation of coefficient
void statistic(int m, int dec, int mode);
int conv_ll_xy(double a, double a1, int cellinfo);
double conv_xy_ll(double a, int x, int cellinfo);

The most important process in the prototype is to keep the coordinate information efficiently
and accurately. All of the DN values are kept in the search and target arrays declared in
Code 3. Currently the target and search arrays are each declared to keep 6,000,000 pixels,
enough for an image of 2449 x 2449 pixels. The array size could be increased if the system
has the capability to process more than the allocated size. Because these values are kept in
one flat array, a further operation is needed to retrieve a block of each image. The block in
use is kept in temp_search and temp_target, each with a maximum block size of 30 x 30. If
the retrieved position is incorrect, the final analysis will be incorrect as well. Many tests were
made to verify this; one was to compare the input DN values with the dataset block by block
and to check the coordinate positions. This test consumed a lot of time in order to achieve a
very high accuracy. The source code is included in Appendix D.
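The block-retrieval step described above amounts to copying an m x m window out of a flat,
row-major array; a minimal sketch follows (the function name is hypothetical, not the
prototype's own):

```cpp
#include <cassert>

// Copy an m x m block whose top-left pixel is at (row, col) from a flat,
// row-major image buffer of the given width into a block buffer (cf. the
// search/temp_search and target/temp_target arrays of Code 3).
void extract_block(const unsigned char* image, int image_width,
                   int row, int col, int m, unsigned char* block)
{
    for (int i = 0; i < m; ++i)
        for (int j = 0; j < m; ++j)
            block[i * m + j] = image[(row + i) * image_width + (col + j)];
}
```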

Apart from that, the prototype was built to accept different settings for different conditions
dynamically, as discussed in the next section. The design of the interface is shown in Figure
5.2. The interface was designed to be user friendly: all of the processes, such as creating the
data header and limiting the search window, are encapsulated from the user.

CHAPTER 5 METHODOLOGY 22
MSc. Remote Sensing Dissertation September 2002

[Interface sketch. Elements: the Displacement Map view; Block Search Size (N) with the
"After" image (S); Block Target Size (M) with the "Before" image (T); a Progress Bar; the
Possible Shift Location (L); the Limit Window Search control; and the Coefficient, Delta X,
Delta Y and Threshold Value fields.]

Figure 5.2: Interface design for the prototype.

The next section describes in detail the engine used in the prototype.

CHAPTER 5 METHODOLOGY 23
MSc. Remote Sensing Dissertation September 2002

5.4. AUTOMATED IMAGEODESY METHOD (BASIC ALGORITHM)

Basically, the imageodesy method relies upon the fundamental statistical concept of a normal
distribution, as discussed in Chapter 4. Imageodesy typically compares thousands of pixels
in the "before" image to thousands of pixels in the "after" image to determine each
displacement vector, and may be capable of high accuracy at a high level of spatial precision
through repeated local registration between various portions of the two images.

In order to calculate the cross-correlation, an M by M "target" chip T ("before" image),
sometimes called a "template", is selected in the reference image, and an N by N "search"
chip S ("after" image), with N greater than M, is selected in the distorted or changed image.
The cross-correlation between the two areas is calculated by sliding the target chip over the
central L by L region of the search area as in Figure 5.3, multiplying the two arrays pixel by
pixel, and summing the result.

[Figure sketch: the target window T inside the "before" image (T) and the search window S
inside the "after" image (S).]

Figure 5.3: Area correlation for image cross-correlation (adapted from Schowengerdt, 1997).

For example, a 5 x 5 pixel target area (T) of the "before" image and a 9 x 9 search area (S)
of the "after" image are shown at the top. At each shift position, the target window's DN array
is re-evaluated to give a correlation coefficient (Equation 3) over the search area and a
possible shift location (L). After all shifts of the target, the maximum cross-correlation
coefficient is marked, indicating the best fit in the search area. The only requirement for the
target and search areas is that the search area must be larger than the target window.
Sub-pixel precision can be achieved by interpolating the L by L correlation surface to estimate
the point of maximum correlation, which indicates the shift needed to register the two chips
to a precision where L is equivalent to 1.

In equation 3, the normalization prevents false correlation peaks arising from changes in the
image DN over the search area, giving the normalized cross-correlation as discussed in
Section 4.3 (Richards, 1995; Schowengerdt, 1997; Lewis, J.P., 1995; Dani, P. and
Chaudhuri, S., 1995).

r_{ij} = \frac{\sum_{m=1}^{M}\sum_{n=1}^{M}(T_{mn} - m_T)(S_{i+m,j+n} - m_S)}
              {\sqrt{\sum_{m=1}^{M}\sum_{n=1}^{M}(T_{mn} - m_T)^2 \;
                     \sum_{m=1}^{M}\sum_{n=1}^{M}(S_{i+m,j+n} - m_S)^2}}        (3)

where m_T is the mean of the target chip and m_S is the mean of the overlapped portion of
the search chip.

Below is the source code (Code 5) implementing the equation above.

// Code 5
statistic(m, 1, mode);   // computes the block means and variances (see APPENDIX D)
Smn = 0;
Tmn = 0;
double sumProduct = 0;
double co_variance = 0;
for (int ctr = 0; ctr < m * m; ctr++)
{
    Tmn = temp_target[ctr];   // target block DN ("before" image)
    Smn = temp_search[ctr];   // search block DN ("after" image)
    sumProduct = sumProduct + ((Smn - mean_af) * (Tmn - mean_b4));
} // for loop ctr
co_variance = sumProduct;
cross_cof[jum].cross_cof   = co_variance / sqrt(variance_b4 * variance_af);
cross_cof[jum].easting_af  = point_col_af;
cross_cof[jum].northing_af = point_row_af;
cross_cof[jum].easting_b4  = point_col_b4;
cross_cof[jum].northing_b4 = point_row_b4;
double te = cross_cof[jum].cross_cof;
te = sqrt(pow(te, 2));   // absolute value of the coefficient
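For reference, equation 3 and the sliding search it drives can be sketched as a small
self-contained routine. The function names below are hypothetical, and the code is a
simplified illustration of the method rather than the prototype's actual routine (which works
on the global arrays of Code 3).

```cpp
#include <cassert>
#include <cmath>

// Normalized cross-correlation (equation 3) between an M x M target chip T
// and the M x M window of the N x N search chip S whose top-left is (i, j).
double ncc(const unsigned char* T, int M,
           const unsigned char* S, int N, int i, int j)
{
    double mT = 0, mS = 0;
    for (int m = 0; m < M; ++m)
        for (int n = 0; n < M; ++n) {
            mT += T[m * M + n];
            mS += S[(i + m) * N + (j + n)];
        }
    mT /= M * M;
    mS /= M * M;

    double num = 0, varT = 0, varS = 0;
    for (int m = 0; m < M; ++m)
        for (int n = 0; n < M; ++n) {
            double dt = T[m * M + n] - mT;
            double ds = S[(i + m) * N + (j + n)] - mS;
            num  += dt * ds;
            varT += dt * dt;
            varS += ds * ds;
        }
    double den = std::sqrt(varT * varS);
    return den > 0 ? num / den : 0;
}

// Slide T over every valid position of S and return the shift of the peak.
void find_peak(const unsigned char* T, int M,
               const unsigned char* S, int N,
               int& best_i, int& best_j, double& best_r)
{
    best_r = -2.0;   // below the [-1, 1] range, so any r replaces it
    for (int i = 0; i + M <= N; ++i)
        for (int j = 0; j + M <= N; ++j) {
            double r = ncc(T, M, S, N, i, j);
            if (r > best_r) { best_r = r; best_i = i; best_j = j; }
        }
}
```

Interpolating the correlation surface around (best_i, best_j) would then give the sub-pixel
estimate described above.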


5.5. DISPLACEMENT MAP

The core computation of the whole approach is the ground-motion inference for a given
image pixel. Even for a relatively small image of 2050 x 2050 pixels, roughly four million
vectors must be computed, raising the computational demands by four orders of magnitude
above those required for a coarse-grained vector field. On workstation-class machines or
current advanced computers, the ground-motion vectors can be calculated in a matter of
days.

Figure 5.4: Inferring displacement maps between image pairs
(http://mishkin.jpl.nasa.gov/spacemicro/intro.html)

A vector is calculated at each node on a grid by the use of (for example) a 5 x 5 square of
data from each image. The "before" and "after" image scenes are first best-fit matched to the
nearest pixel, as determined by visual inspection (manual GCPs). The "before" image is used
as the mapping base and is held constant. For each vector determination, a moving M x M
target template from the "before" image is compared statistically for maximum correlation
against an N x N search area in the "after" image.

Figure 5.4 is a schematic illustration of the use of subpixel registration to infer displacement
maps between image pairs. Shown are the "before" (A) and "after" (C) images and the
displacement map generated by the prototype (B).

This step generates a vector field of inferred ground motions from a pair of satellite images.
The iterative procedure is terminated when sufficient accuracy is obtained; the resulting fault
outlines can then be registered as important events. Below is the source code fragment that
generates each vector point representing a matching point:

point (,%lf,%lf,-1,-1,-1,0).\n",(EQ.easting_b4+(2*b4cellsizex)),(EQ.northing_b4+(2*b4cellsizey))

and for the shift:


poly(,2,[%lf,%lf,%lf,%lf],1,2,0,0,0,0,255,128,128,0).\n",(EQ.easting_b4+(2*b4cellsizex)),(EQ.northing_
b4+(2*b4cellsizey)),(EQ.easting_af+(2*afcellsizex)),(EQ.northing_af+(2*afcellsizey))

Each iteration generates either a point or a line if the coefficient value is within the range; if it
is not, vector generation is skipped. A raw BIL dataset is written to a file under the name the
user supplies, and the ER Mapper vector header is created automatically according to the
dataset.

In principle, there are many types of correction that must be accounted for. For example,
sun- and view-angle differences can introduce spurious differences between scenes; in many
remote-sensing applications these effects are extremely important.

Radiometric differences are also possible as a result of vegetation growth and other events.
These are, in fact, often interesting processes in themselves, although they interfere with the
specific task of measuring fault motion. For the datasets used here, they are not a major
factor in any case. Yaw, pitch and roll can also vary during a spacecraft overflight of a
selected target, but again this turns out to be a negligible effect for our problem.

After the full array of vectors across the image is generated, trends that are clearly
attributable to differences in data collection between the before and after images are
characterized. If these trends are related to detector array distortion, they are distinctly
constant along the satellite path (and appear as differences among entire columns of pixels).
If they are related to attitude variation through time, they are distinctly constant across the
satellite path (and appear as differences among entire rows of pixels). If they are related to
static differences between scenes, such as a difference in scale, they form distinctive
two-dimensional patterns across the entire image. Thus, these trends are spatially distinct
and generally can be removed in order to isolate and reveal image differences that are due
solely to ground deformation.

5.6. DELTA X AND DELTA Y IMAGERY

After a point has been matched in each iteration using NCC, Delta X (Δx) and Delta Y (Δy)
datasets are created to visualise the actual differences. Delta X (Δx) and Delta Y (Δy) are the
possible pixel shifts accumulated in a dataset (*.ers). Histograms of these datasets can be
used to interpret the magnitude and the angle using equations 4, 5 and 6 below.

Δx = x1 - x2                (4)

Δy = y1 - y2                (5)

α = tan^{-1}(Δx / Δy)       (6)


where subscript 1 denotes the position in the "before" image and subscript 2 the matched
position in the "after" image; delta X is calculated using equation 4 and delta Y using
equation 5. The angle (α) of the shift can be calculated using equation 6. If no threshold has
been set, the process generates a histogram similar to Figure 5.5.
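As a small illustration of equations 4 to 6, the sketch below converts a pixel shift into a
magnitude in metres and an angle in degrees. The helper name is hypothetical, it uses the
atan2 form of the angle rather than the raw tan^{-1}(Δx/Δy), and the 15 m ASTER VNIR cell
size and sign convention are assumptions.

```cpp
#include <cassert>
#include <cmath>

// Convert a pixel shift (dx, dy) into a ground magnitude in metres and an
// angle in degrees, given the ground cell size in metres per pixel.
void shift_to_polar(int dx, int dy, double cell_size,
                    double& magnitude_m, double& angle_deg)
{
    const double PI = std::acos(-1.0);
    magnitude_m = cell_size * std::hypot((double)dx, (double)dy);
    angle_deg   = std::atan2((double)dy, (double)dx) * 180.0 / PI;
}
```

For a shift of 2 pixels east and 2 pixels south at 15 m per pixel, this gives a magnitude of
about 42.43 m at -45 degrees, comparable to the shifts discussed in Chapter 6.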

If the peak points to 0, this means that both images match at the exact coordinates. Peaks at
positive or negative values show that the shift has moved towards the left or the right along
the axis.

In practice, the histogram will look like Figure 5.5. If the coefficient value is restricted to a
range, the prototype may not generate a shift for every point; in that case the generated
histogram will not have a central peak. If the peak around -1 pixel is larger than the right
peak, there is more movement leftwards along the X axis or downwards along the Y axis; the
opposite holds if the larger peak is on the right.

[Histogram with x-axis from -30 to 15 metres.]

Figure 5.5: Histogram of Delta X and Delta Y.

5.7. COEFFICIENT IMAGE

A part from Delta X ( x) and Delta Y ( y), a coefficient histogram is also needed. Literally, the
coefficient values are between -1 and 1. For the purpose clear visual interpretation has been
ranged between 28 to 228. Threshold value which the user has specified will make the range
in the histogram above the threshold value as shown in the Figure 5.6. If the threshold value
is set at 0.70, then histogram curves will start at 58 or 198.

28 58 128 198 228

Figure 5.6 : Coefficient Histogram
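The rescaling implied by these numbers is linear, DN = 128 + 100r; this mapping is an
inference from the values quoted above (r = ±0.70 maps to 58 and 198), not a formula stated
in the prototype. A sketch:

```cpp
#include <cassert>
#include <cmath>

// Map a correlation coefficient r in [-1, 1] to a display DN in [28, 228],
// assuming the linear mapping DN = 128 + 100 * r inferred from the text.
int coefficient_to_dn(double r)
{
    return (int)std::lround(128.0 + 100.0 * r);
}
```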


CHAPTER 6 ANALYSIS AND RESULTS

In this chapter, the first sections show the initial tests performed to verify that the prototype
functions properly. Section 6.2 discusses in detail the implementation using a small portion of
the "before" and "after" images over the same area. The prototype has a few variables to be
set before it can be used, as discussed in Section 5.3; several of these (M, N, the threshold
and L) control the iteration. After the explanation of the areas of interest, the displacement
trend pattern is discussed in the results.

The main problem in building the prototype was reading the whole image (DN values) while
keeping track of position (coordinate setting). If the positions do not point to the right DN, the
vector generation will be severely mismatched. In a previous attempt, the DN values were
read in block by block for the target and search windows, the possible shift location L was the
size of the block, and the L value was fixed. This meant that the dataset was accessed many
times. The ER Mapper SDK uses multiple threads to allocate the DN values in a buffer, which
sometimes caused unhandled threads with unallocated buffer memory and crashed the
interface. Apart from that, each block only matched one point (the centre of the block) from
both images, and the image was limited to 600 x 600 pixels due to unallocated memory.

In the second attempt, two arrays or buffers are used, each allocating up to 6 million
elements (more can be allocated depending on the capability of the system itself). Currently,
6 million elements are allocated to keep all the DN values of an image of maximum size
2449 x 2449 pixels. The L value is changeable through the user's input, depending on the
accuracy requirements and the computer's capabilities.

In order to simplify the final results, the area of Kunlun has been divided into five parts,
marked in alphabetical order as shown in Table 1 and Figure 6.1. Figure 6.1 also shows the
fault produced by a colleague who observed and vectorised it manually over the Kunlun
area.

TABLE 1: ANALYSIS AREAS AND THEIR BOX COLOURS

AREA COLOUR
A GREEN
B ORANGE
C YELLOW
D PURPLE
E BLUE

CHAPTER 6 ANALYSIS AND RESULT 29

Figure 6.1 : RGB Kunlun pre-Earthquake overlay vector of fault.


6.1. STAGE 1: TESTING THE PROTOTYPE WITH THE SAME IMAGE

The same small portion of the image (for example 100 x 100 pixels) was used as both the
"before" and the "after" image in the prototype. Normalized cross-correlation coefficients
were calculated using equation 3. As discussed in Chapter 4, if both images match, the
coefficient values will be near -1 or 1. Since both images are the same, each calculated
coefficient value will be equal or close to 1, the highest correlation, matching each point to
the same exact location.

A vector displacement map was created, as shown in Figure 6.2, to represent the matching
points. The white dots on the image show the points matched, with coefficient values as high
as 0.99. If there were any shift, a red arrow would be drawn on the vector displacement map
to show the shift from point to point. Table 2 below is a sample of the values obtained using
the same images.

Figure 6.2: Vector map for the same image, using L = 1.

Table 2: Sample data after processing using the same dataset.
Easting Before   Northing Before   Easting After   Northing After   Coefficient
297684.006 4042232.203 297684.006 4042232.203 1
297909.006 4042232.203 297909.006 4042232.203 1
298134.006 4042232.203 298134.006 4042232.203 1
298359.006 4042232.203 298359.006 4042232.203 1
298584.006 4042232.203 298584.006 4042232.203 1
298809.006 4042232.203 298809.006 4042232.203 1
299034.006 4042232.203 299034.006 4042232.203 1
299259.006 4042232.203 299259.006 4042232.203 1
299484.006 4042232.203 299484.006 4042232.203 1
299709.006 4042232.203 299709.006 4042232.203 1
299934.006 4042232.203 299934.006 4042232.203 1
300159.006 4042232.203 300159.006 4042232.203 1
300384.006 4042232.203 300384.006 4042232.203 1
300609.006 4042232.203 300609.006 4042232.203 1
300834.006 4042232.203 300834.006 4042232.203 1
301059.006 4042232.203 301059.006 4042232.203 1
301284.006 4042232.203 301284.006 4042232.203 1
301509.006 4042232.203 301509.006 4042232.203 1
301734.006 4042232.203 301734.006 4042232.203 1


6.2. STAGE 2: TESTING THE PROTOTYPE WITH DIFFERENT IMAGES
(AREA D: PURPLE)

The same approach as in Section 6.1 was followed, but with different images of the same
area. A threshold was set in order to select the highest correlation coefficient value specified
by the user; currently a default value of 0.7 is set. Because the datasets have different sun
illumination and were acquired in different seasons, their variances differ greatly; this
difference can be observed clearly in Figure 6.3 and may push the coefficient value out of
range. Another threshold was inserted in the program to handle this: points failing it are
ignored. These features are shown in Figure 6.3, where the empty areas in the image
represent ignored points, meaning areas of low correlation or values out of range.

Figure 6.3: Area D, a sample area from the north-west mountains in Kunlun (left: before the
earthquake; right: after the earthquake).

After processing, the prototype generates the delta X, delta Y and coefficient histograms,
with vector displacements, for the next analysis. The processes were performed twice, with
rectified and unrectified versions of the "before" and "after" images of the same area. The
vectors are represented by the red arrows (Figure 6.4), which show the point shifts from the
"before" coordinates to the "after" coordinates.

Figure 6.4: A generated vector of the displacement (shift) overlaid on the RGB Kunlun image
before the earthquake.


[Two histograms of Delta X over the same area, x-axis from -50 to 40 m: top, unrectified
image; bottom, rectified image.]

Figure 6.5: Histograms of Delta X in metres, non-cumulative style.

[Two histograms of Delta Y, x-axis from -40 to 20 m: left, unrectified image; right, rectified
image.]

Figure 6.6: Histograms of Delta Y in metres, non-cumulative style.


Table 3: Sample data processed using different datasets.

Easting Before   Northing Before   Easting After   Northing After   Coefficient
297639.006 4042277.203 297594.006 4042322.203 -0.97526
297774.006 4042277.203 297819.006 4042337.203 -0.927716
298044.006 4042277.203 298044.006 4042322.203 -0.96821
298179.006 4042277.203 298209.006 4042322.203 -0.964162
298584.006 4042277.203 298599.006 4042322.203 -0.879086
298719.006 4042277.203 298674.006 4042277.203 -0.997689
298989.006 4042277.203 299004.006 4042262.203 -0.960686
299124.006 4042277.203 299109.006 4042292.203 -0.910279
299934.006 4042277.203 299904.006 4042337.203 -0.803248
300204.006 4042277.203 300249.006 4042337.203 -0.99411
297579.006 4042142.203 297579.006 4042127.203 -0.963296
297714.006 4042142.203 297729.006 4042202.203 -0.998884
298389.006 4042142.203 298359.006 4042172.203 -0.965549
298524.006 4042142.203 298494.006 4042217.203 -0.820675
298659.006 4042142.203 298629.006 4042187.203 -0.967297
297579.006 4042007.203 297549.006 4042067.203 -0.962212
297849.006 4042007.203 297849.006 4042067.203 -0.98034
297984.006 4042007.203 297924.006 4041992.203 -0.939198
298524.006 4042007.203 298464.006 4042037.203 -0.975356
298659.006 4042007.203 298674.006 4042022.203 -0.996231
298794.006 4042007.203 298779.006 4042037.203 -0.99857
298929.006 4042007.203 298899.006 4042007.203 -0.993774
299064.006 4042007.203 299049.006 4042052.203 -0.938738
299334.006 4042007.203 299349.006 4042007.203 -0.944509
299469.006 4042007.203 299499.006 4042067.203 -0.979039
299604.006 4042007.203 299634.006 4041992.203 -0.991578
299739.006 4042007.203 299724.006 4042007.203 -0.963339
302169.006 4042007.203 302184.006 4041992.203 -0.924618
297984.006 4041872.203 297969.006 4041947.203 -0.957445
298119.006 4041872.203 298089.006 4041857.203 -0.930046
298254.006 4041872.203 298239.006 4041932.203 -0.970763
298389.006 4041872.203 298419.006 4041857.203 -0.994268
300009.006 4041872.203 300054.006 4041857.203 -0.981058

[Histogram with x-axis marked 28, 128, 228.]

Figure 6.7: Histogram of the coefficient.

The calculated coefficient values are used to select the matching points, as shown in Table
3. The table is a sample of the selected coefficient values with the highest matches after
processing the "before" and "after" images. A histogram (Figure 6.7) was generated to show
the range used.

Figure 6.5 shows two histograms of delta X covering the same area, using the unrectified
and the rectified image. Obviously, the shift (around -45 m) on the left of the unrectified
histogram drops significantly after rectification. Here, the histogram shows the shifts to be
more towards the right along the X axis.

Delta Y (Figure 6.6) shows a different direction: the shifts are more towards the downward
direction of the Y axis. The average shift angle is around -45 to -53 degrees, towards the
south-east. The angles were calculated using equation 6, taking the lowest and highest shifts
corresponding to those discussed for Figures 6.5 and 6.6.

6.3. COMPUTATIONAL COST

NCC computation has to mask pixels and multiply them; the sum-of-products operation is the
main contributor to the computational cost and lengthens the processing of the image
dataset. Table 4 below compares the processing time for normalized cross-correlation using
different M and N.

TABLE 4: COMPARING COMPUTATIONAL COST, NCC (µs/pixel)

M N SPEED
5 9 0.10
9 15 0.27
15 30 0.35

An image of 2050 x 2050 pixels has 4,202,500 pixels to be calculated. For medium accuracy,
with block-by-block precision, M = 9, N = 15 and L = 9, therefore:
Computational cost = [(4202500 / (M × M)) × SPEED] / (60 sec × 60 min × 25 hour)
= 1.56 days
based on an AMD Duron 1 GHz with 384 MB of RAM.


In order to obtain high precision and accuracy on a pixel-by-pixel basis, the prototype would
need up to 8 days of processing time for 2050 x 2050 pixels. Due to the limited time, the next
section discusses the results based on medium, block-by-block accuracy.

6.4. RESULTS
Each area was processed according to its location as shown in Table 1 and Figure 6.1. Since
the prototype only measures the shift pixel by pixel or block by block, the trend pattern would
be too small to observe directly. The angle and magnitude are therefore calculated from the
maximum individual shifts in the delta X and delta Y histograms, for visibility of the pattern
and to reduce the processing time for the whole image. To obtain the final result, the results
for all of the areas are combined, and the earthquake trend patterns are then drawn manually
to show the shifting, as illustrated in Figure 6.9.

6.4.1. AREA A (GREEN)

The area extends from 283420E, 4042385N to 314094E, 4011553N, near the lake and the
epicentre of the Kunlun earthquake, China (Appendix C). The process took 30 hours (on a
machine with a 1 GHz CPU and 384 MB of memory) for each of the unrectified and rectified
"before" and "after" datasets, using M = 9, N = 15 and L = 9.

[Two histograms, x-axes from -120 to 130 m.]

Figure 6.8: Delta Y in green and Delta X in red, representing a false colour composite; the
same colours are used in the histograms, for the unrectified dataset.

Figure 6.9: Displacement of area A, overlaid on the unrectified dataset.

[Two histograms, x-axes from -75 to 50 m.]

Figure 6.10: Delta Y in green and Delta X in red, representing a false colour composite; the
same colours are used in the histograms, for the rectified dataset.


After rectification, the FCC image of delta X and delta Y (Figure 6.8) shows a reduced
amount of pixel change, as can be observed in Figure 6.10. Estimating delta X and delta Y
from the histograms (Figure 6.10), the maximum possible shifts are -78 m and -56 m
respectively, based on the rectified datasets. The rectified dataset's delta X and delta Y
histograms were used because they have less distortion and the shifts are much clearer.
Therefore the possible shift angle is 126° with a magnitude of 96.02 m (about 6 pixels), as
shown in Figure 6.9.

The remaining areas are discussed using only the rectified images for the analysis of angle
and magnitude.

6.4.2. AREA B: - ORANGE (ALLUVIAL FAN)


The areas cover at between 281098E and 4027517N to 291606 E and 4018226 N, near the
alluvial fan south of the lake. There are lots of shiftings and changes in the alluvial fan itself
as shown in Figure 6.9.

-60 0 75
Meter

Figure 6.11 :- Delta Y in Green and Delta X in Red


representing False Colour Composite and for the colour
in histograms for rectified dataset.
-60 0 75
Meter

Estimating delta X and delta Y from the histograms in Figure 6.11, the maximum possible
shifts are -30 m and 30 m respectively. Therefore the possible shift angle is 315° with a
magnitude of 42.42 m (about 2.8 pixels). These shifts can be seen in Figure 6.12.

Figure 6.12: Displacement of area B, overlaid on the dataset in pseudocolour.


6.4.3. AREA C: YELLOW (ALLUVIAL FAN)

[Two histograms, x-axes from -60 to 60 m.]

Figure 6.13: Delta Y in green and Delta X in red in the histograms, and RGB FCC 321.

This study area has very low correlation due to saturation, which prevented the method from
detecting the shifting clearly. However, the method was successful on the top of the alluvial
fan. Estimating delta X and delta Y from the histograms, the maximum possible shifts are
30 m or -30 m for delta X and -30 m for delta Y. Therefore the possible shift angles are
between 225° and 315° with a magnitude of 42.42 m (about 2.8 pixels), as shown in Figure
6.13.


6.4.4. AREA D: PURPLE

[Two histograms, x-axes from -60 to 75 m.]

Figure 6.14: Delta Y in green and Delta X in red, representing a false colour composite
(FCC); the same colours are used in the histograms, for the rectified dataset.

In Figure 6.14, area D shows a lot of movement, as both colours (red and green) appear in
the image, which may represent more than one direction of motion. Estimating delta X and
delta Y from the histograms, the maximum possible shifts are 30 m and -30 m respectively.
Therefore the possible shift angle is 315° with a magnitude of 42.42 m (about 2.8 pixels).
These shifts can be seen in Figure 6.15.

Figure 6.15: Displacement of area D, overlaid on the dataset in pseudocolour.


6.4.5. AREA E :- BLUE

Figure 6.16 :- Delta Y in Green and Delta X in Red representing the False Colour
Composite (FCC) and the colours in the histograms for the rectified dataset.

Figure 6.17 :- Displacement of area E and the overlaid dataset in pseudocolour.


By estimating delta X and delta Y from the histogram, the maximum possible shifts are
-30 m and 30 m respectively, as shown in Figure 6.16. The possible shift angle is therefore
315° with a magnitude of 42.42 m (2.8 pixels). These shifts can be seen in Figure 6.17.

6.4.6 SUMMARY

Based on observations made by another MSc colleague, the trend pattern produced here and
his research show similarities. The prototype could be improved to calculate the average
angle and magnitude automatically and then draw an average trend pattern for each window
size, giving a more accurate trend pattern (Figure 6.18). Because some of the areas have
moved in many directions and angles, the actual displacements could not be represented
very accurately.

Figure 6.18 : Displacement Pattern
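If the averaging suggested above were automated, it would have to be done on the (delta X, delta Y) components rather than on the angles themselves, since angles wrap at 360° and a naive mean of, say, 350° and 10° would give 180° instead of 0°. A minimal sketch (the function name is illustrative, not from the prototype):

```c
#include <stddef.h>

/* Average displacement over n per-pixel shift estimates.  The mean
   is taken on the (delta X, delta Y) components; the average angle
   and magnitude then follow from the mean vector. */
void mean_shift(const double *dx, const double *dy, size_t n,
                double *mean_dx, double *mean_dy)
{
    double sx = 0.0, sy = 0.0;
    size_t i;
    for (i = 0; i < n; i++) {
        sx += dx[i];
        sy += dy[i];
    }
    *mean_dx = (n > 0) ? sx / (double)n : 0.0;
    *mean_dy = (n > 0) ? sy / (double)n : 0.0;
}
```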


6.4.7. FAULT DETECTIONS

NCC is very sensitive to shapes and edges, and because of this property, faults are easily
distinguished. One obvious process of great physical interest is the motion of surface
faults during the course of the earthquake. This detection can be clearly seen in Figure 6.19.

Figure 6.19 :- Fault detection (the observed fault is drawn in blue; the red arrows are
the vectors showing the fault lines).

The observed analysis (blue vector) is quite rigid and straight compared to the generated
displacement vectors, which follow the pixel-by-pixel changes; the detections are therefore
noticeably more accurate. However, the final results still need interpretation by a
geologist, or ground truthing, for a better understanding of the feature changes.


6.5. DISCUSSION

Normalized Cross Correlation (NCC) has been widely used because it can cope with noisy
images and with uniform changes in brightness. However, when the scene involves partial
occlusion of objects or saturation (highlights), this sum-of-products correlation
technique sometimes fails to capture the true position, even though it is good at
measuring slight changes in brightness (Kaneko et al., 2002). For effective measurement
of earthquake displacement at the sub-pixel level, sun-angle and view-angle differences
between images must be avoided. Shadows will not match between scenes acquired under
different sun positions. Likewise, the radial distortion of topographic features will
differ between images with different view angles. The other major source of geometric
noise arises from differences in sensor attitude and altitude.

These techniques provide useful tools for measuring earthquake displacement. At present,
the detectable shift or movement is 15 m, equivalent to 1 pixel. The vectors provide
statistically significant results that can be modelled to remove geometric noise and
thereby obtain precise displacement directions over a wide area. Fault breaks in remote
locations might also be located from the vectors.

From the analysis, most displacements in the area are within 1 pixel. However, due to the
mismatch of the “before” and “after” imagery, the final results are a spatially biased
correlation pattern. The measured displacement is not necessarily the displacement itself;
it could also reflect geometric distortion or mismatched points. The images should have
been rectified from the beginning for accurate matching to be achieved; unfortunately,
rectified images are degraded by resampling. This problem could be handled by image
differencing, with the affected areas removed from the correlation procedure by image
masking (Crippen, 1992). In areas where radiometric change is pervasive, such as
agricultural fields, sub-resolution matching of the edges that form boundaries in the
image may be a possible alternative procedure, but it could not be implemented here due
to time constraints.

In terms of overall performance, the method used in this project is considered successful
in measuring the angle of the displacement. The displacement, however, needs ground truth
in order to assess and understand the change and movement. Apart from that, the prototype
could be improved to extract more accurate information automatically, since in this
project the final stage of deriving the pattern trends was still done manually; the
prototype could also use pixel (DN) resampling to increase the precision.
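One common way to realise the sub-pixel refinement suggested here, without resampling the imagery itself, is to fit a parabola through the correlation peak and its two neighbours and take the vertex as the refined position. The sketch below is a suggested extension under that assumption, not part of the delivered prototype.

```c
/* Given the correlation at the integer peak (c0) and at its left and
   right neighbours (cm, cp), return the sub-pixel offset of the true
   maximum in (-0.5, 0.5), taken from the vertex of the fitted
   parabola.  The same formula applies independently in x and y. */
double subpixel_peak(double cm, double c0, double cp)
{
    double denom = cm - 2.0 * c0 + cp;   /* curvature of the fit */
    if (denom >= 0.0)
        return 0.0;   /* not a strict maximum: no refinement */
    return 0.5 * (cm - cp) / denom;
}
```

A symmetric peak gives an offset of zero; a peak whose right neighbour is higher than its left is pulled towards the right, as expected.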


CHAPTER 7 CONCLUSION

With the implementation of this method, optical data can be used to obtain Earth-surface
displacement as a substitute for InSAR. However, the method needs more analysis for
stability and improvement. Realistically, it could be applied to any Earth-surface
displacement and is able to measure the possible shift. In fact, the method has an
advantage over InSAR because of its higher resolution. Moreover, it is able to measure a
horizontal or vertical displacement as small as the size of a pixel. With the availability
of along-track band 3B stereo on ASTER, the method could also be implemented to measure
displacement in height (the z axis) for realistic 3D motion. For higher demands on
accuracy and visualization, the computational cost will also increase.


REFERENCES:

Crippen, R.E. and Blom, R.G., 1992. Mapping of the Horizontal Strain Field of the 1992
Landers Earthquake by Imageodesy. EOS Transactions of the American Geophysical Union,
73(43):374.

Crippen, R.E., 1992. Measurement of subresolution terrain displacements using SPOT
panchromatic imagery. EOS Transactions of the American Geophysical Union, 15(1):56-61.

Crippen, R.E. and Blom, R.G., 1996. Detection, measurement, visualization, and analysis
of seismic crustal deformation. In: Proc. 11th International Thematic Conference on
Geologic Remote Sensing (ERIM, Ann Arbor, Michigan), 1:I-298.

Dani, P. and Chaudhuri, S., 1995. Automated Assembling of Images: Image Montage
Preparation. Pattern Recognition, 25(3):431-445.

Dury, S.A., 1993. Image Interpretation in Geology, 2nd edition. Chapman and Hall,
London. 290 pages.

ER Mapper 6.0, 2000. Software Development Kit. Earth Resource Mapping Pty Ltd.

Lewis, J.P., 1995. Fast Template Matching. Vision Interface, pp. 120-123.

Lillesand, T.M. and Kiefer, R.W., 2000. Remote Sensing and Image Interpretation, Fourth
Edition. John Wiley & Sons, New York. 724 pages.

Liu, J.G., 2000. Smoothing Filter Based Intensity Modulation: a Spectral Preserve Image
Fusion Technique for Improving Spatial Details. International Journal of Remote Sensing,
21(18):3461-3472.

Jensen, J.R., 1986. Introductory Digital Image Processing: A Remote Sensing Perspective.
Prentice Hall, Englewood Cliffs, NJ, pp. 102-105.

Hanssen, R.F., 2001. Radar Interferometry: Data Interpretation and Error Analysis.
Optima Grafische Communicatie. 308 pages.

Herbert, S., 1987. Artificial Intelligence: Using C. McGraw-Hill, Inc. 411 pages.

Kaneko, S. and Satoh, Y.I.S., 2002. Using selective correlation coefficient for robust
image registration. Pattern Recognition, Pergamon, pp. 1-9.

Tierney, K., 1998. Improving Earthquake Loss Estimation: Review, Assessment and Extension
of Loss Estimation Methodologies. University of Delaware, pp. 1-14.

David, K., 1998. Programming Microsoft Visual C++, Fifth Edition. Microsoft Press.
1153 pages.

David, W., 2000. Mathematics: Simple Tools for Geologists, Second Edition. Chapman and
Hall, Inc. 201 pages.

Richards, J.A., 1995. Remote Sensing Digital Image Analysis. Springer-Verlag, Berlin,
pp. 54-57.

Schowengerdt, R.A., 1997. Remote Sensing: Models and Methods for Image Processing,
Second Edition. Academic Press. 522 pages.

Tsuneji, S.R.R. and Yukoo, H., 1987. Applied Mathematics for Earth Scientists. Terra
Scientific Pub. Co., Tokyo. 435 pages.

Tom, A., 1992. Earthquake Prediction: Science on Shaky Ground? The Scientist, 6(14):15.


WEB REFERENCES:

Paul, S., Dean, C., Crippen, R. and Blom, R., 1999. Photographing Earthquakes from Space,
Earth Sciences. http://www.cacr.caltech.edu/Publication/annreps/annrep95/earth1.htm
(last accessed 15 July 2002)

Paul, S., Dean, C., Crippen, R. and Blom, R., 1999. Scalable Scientific Datamining on
Massively Parallel Computer. http://mishkin.jpl.nasa.gov/spacemicro/SCALABLE_PAPER
(last accessed 2 June 2002)

QUAKEFINDER: Photographing Earthquakes from Space.
http://www-aig.jpl.nasa.gov/public/mls/quakefinder/ (last accessed 2 June 2002)

Garson
http://www2.chass.ncsu.edu/garson/pa765/garson.htm

Ebmj
http://bmj.com/collections/statsbk/index.shtml

USGS National Earthquake Information Center
http://neic.usgs.gov/neis/plate_tectonics/rift.html


APPENDIX A: FLOWCHART OF THE SOURCE CODE

[Flowchart, redrawn as a list of steps:]

1. START: input the processing limit.
2. Read an M x M window from image 1 into array 1.
3. Calculate the mean (μT) and variance (σT) of the target window.
4. Locate the possible search area and read an N x N window from image 2 into array 2.
5. Read an M x M sub-window from array 2 into array 3 and calculate its mean (μS) and
   variance (σS).
6. Calculate the correlation coefficient rij; shift to the next column, then the next
   row, looping over N.
7. Find the maximum rij and calculate the shift.
8. Output delta x, delta y and the displacement map.
9. Check the limit and loop until the end, then finish.


APPENDIX B: ERMAPPER DATA STRUCTURE

An ER Mapper raster data file contains binary data in Band-Interleaved-by-Line (BIL)
format. The type and amount of data are specified in the header file. The figure above
illustrates the ordering of data in an ER Mapper raster data file.
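As an illustration of that ordering, the byte offset of a single cell in a BIL file can be computed as below. This is a generic BIL sketch under the stated assumption of one byte per cell (8-bit data, as used in this project); it is not code from the ER Mapper SDK.

```c
#include <stddef.h>

/* Byte offset of (line, band, column) in a Band-Interleaved-by-Line
   file with 8-bit cells: all of band 0's line is stored, then band
   1's line, and so on, before the next line begins. */
size_t bil_offset(size_t line, size_t band, size_t column,
                  size_t nr_bands, size_t nr_columns)
{
    return (line * nr_bands + band) * nr_columns + column;
}
```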


APPENDIX C: AREA A GREEN


APPENDIX D: SOURCE CODE


// Date September 2002
// Programme made by Sheikh Nasir Kamarudin
// for Master Remote Sensing UCL
// The basic technique of Imageodesy is to compare the "before" and "after"
// image at each grid cell by iteratively interpolating one and testing its
// correlation with the other.

#include <stdio.h>
#include <iostream.h>

#include "stdafx.h"
#include "Tasking.h"
#include "math.h"
#include <ERS.h>

#include "TaskingDoc.h"
#include "TaskingView.h"

#ifdef _DEBUG
#define new DEBUG_NEW
#undef THIS_FILE
static char THIS_FILE[] = __FILE__;
#endif

double total_b4=0,
mean_b4=0,
variance_b4=0,
total_af=0,
mean_af=0,
variance_af=0;
char *p_b4Name,*p_afName;
INT32 nr_rows_b4,nr_rows_af;
INT32 nr_columns_b4, nr_columns_af;
Algorithm *p_alg_b4, *p_alg_af;
LPTSTR lpsz1,lpsz2;
UINT8 target[6000000],temp_target[900];
UINT8 search[6000000],temp_search[900];

INT deltaX[1];
INT deltaY[1];
INT cof[1];

char *p_errmsg_b4 =NULL;


INT32 row_b4,column_b4;
Correlation cross_cof[200];
CorrelationHeader EQ;

int part_x, part_y, balance_x, balance_y, mid_x, mid_y;


FILE *testfile; // a pointer to the output file
FILE *outputfile1; // a pointer to the output file

double afeasting ;
double afnorthing;
double afcellsizex;
double afcellsizey;

double b4easting ;
double b4northing;
double b4cellsizex;
double b4cellsizey;
double ll_col_end=0, ll_row_end=0;

int mode = 1,stop=0,nr_x,nr_y;


CString m_save1,m_delta1,m_deltaY1,m_coff1;


CString dirsave;
int m_nTimer;
int m_nCount;
enum {nMaxCount =1000};

char
*datum,*projection,*b_order="LSBFirst",*CellTy="Unsigned8BitInteger",*CellTySign="Signed8BitInteger";
double rotate;
CoordSysType CoordType;

//variable declaration ....end

// Internal (private) support functions //


static void *allocate_memory(size_t size);
static void handle_error(char *format,...);

// Internal (private) support functions // ermapper


void updatelpsz1(CString m_edit1);
void updatelpsz2(CString m_edit2);

void SetupAlgorithmDefaults1(LPTSTR lpsz,DatasetHeader *p_dsh, int alg_height, int alg_width,int mode);


void read1 (Algorithm *p_alg,int nr_rows, int nr_columns,int a, int b,int shiftm,int shiftn, int mode);

void statistic(int m,int dec, int mode);


int conv_ll_xy(double a, double a1,int cellinfo);
double conv_xy_ll(double a, int x , int cellinfo); //double b,

/////////////////////////////////////////////////////////////////////////////
// CTaskingView

IMPLEMENT_DYNCREATE(CTaskingView, CFormView)

BEGIN_MESSAGE_MAP(CTaskingView, CFormView)
//{{AFX_MSG_MAP(CTaskingView)
ON_BN_CLICKED(IDC_CBONIDLE1, OnCbonidle)
ON_BN_CLICKED(IDC_CBTHREAD1, OnCbthread1)
ON_BN_CLICKED(IDC_CBTHREAD2, OnCbthread2)
ON_WM_DESTROY()
ON_BN_CLICKED(ID_BROWSE1, OnBrowse1)
ON_BN_CLICKED(ID_BROWSE, OnBrowse)
ON_BN_CLICKED(IDC_SAVE, OnSave)
ON_BN_CLICKED(IDC_CHECK1, OnCheck1)
ON_BN_CLICKED(IDCANCEL2, OnCancel2)
ON_BN_CLICKED(IDOK2, OnOk)
ON_EN_CHANGE(IDC_EDIT3, OnChangeEdit3)
ON_EN_CHANGE(IDC_EDIT2, OnChangeEdit2)
ON_EN_CHANGE(IDC_EDIT4, OnChangeEdit4)
ON_EN_CHANGE(IDC_EDIT5, OnChangeEdit5)
ON_EN_CHANGE(IDC_EDIT6, OnChangeEdit6)
ON_EN_CHANGE(IDC_EDIT1, OnChangeEdit1)
ON_BN_CLICKED(IDC_SAVE2, OnDeltaSave)
ON_BN_CLICKED(IDC_SAVE3, OnSave3)
ON_EN_CHANGE(IDC_EDIT7, OnChangeEdit7)
ON_EN_CHANGE(IDC_EDIT8, OnChangeEdit8)
ON_EN_CHANGE(IDC_EDIT9, OnChangeEdit9)
ON_EN_CHANGE(IDC_EDIT10, OnChangeEdit10)
ON_EN_CHANGE(IDC_EDIT12, OnChangeEdit12)
ON_EN_CHANGE(IDC_EDIT11, OnChangeEdit11)
ON_BN_CLICKED(IDC_SAVE4, OnCoff)
ON_BN_CLICKED(IDC_CBONIDLE2, OnCbonidle)
ON_EN_CHANGE(IDC_EDIT13, OnChangeShift)
//}}AFX_MSG_MAP
// Standard printing commands
ON_COMMAND(ID_FILE_PRINT, CFormView::OnFilePrint)
ON_COMMAND(ID_FILE_PRINT_DIRECT, CFormView::OnFilePrint)
ON_COMMAND(ID_FILE_PRINT_PREVIEW, CFormView::OnFilePrintPreview)
END_MESSAGE_MAP()


/////////////////////////////////////////////////////////////////////////////
// CTaskingView construction/destruction

CTaskingView::CTaskingView()
: CFormView(CTaskingView::IDD)
{
//{{AFX_DATA_INIT(CTaskingView)
m_bOnIdle1 = FALSE;
m_bOnIdle2 = FALSE;
m_bThread1 = FALSE;
m_bThread2 = FALSE;
m_edit1 = _T("");
m_endLat = 0.0;
m_edit2 = _T("");
m_edit3 = 9;
m_edit4 = 15;
m_edit5 = 0.7;
m_save = _T("");
m_startLong = 0.0;
m_startLat = 0.0;
m_endLong = 0.0;
m_checklimit = FALSE;
m_delta = _T("");
m_deltaY = _T("");
SHIFTED = 5;
m_coffiecient = _T("");
//}}AFX_DATA_INIT
	// TODO: add construction code here
}
CTaskingView::~CTaskingView()
{
}

void CTaskingView::DoDataExchange(CDataExchange* pDX)


{
CFormView::DoDataExchange(pDX);
//{{AFX_DATA_MAP(CTaskingView)
DDX_Control(pDX, IDC_PROGRESS2, m_progress2);
DDX_Check(pDX, IDC_CBONIDLE1, m_bOnIdle1);
DDX_Check(pDX, IDC_CBONIDLE2, m_bOnIdle2);
DDX_Check(pDX, IDC_CBTHREAD1, m_bThread1);
DDX_Check(pDX, IDC_CBTHREAD2, m_bThread2);
DDX_Text(pDX, IDC_EDIT1, m_edit1);
DDX_Text(pDX, IDC_EDIT10, m_endLat);
DDX_Text(pDX, IDC_EDIT2, m_edit2);
DDX_Text(pDX, IDC_EDIT3, m_edit3);
DDV_MinMaxInt(pDX, m_edit3, 1, 20);
DDX_Text(pDX, IDC_EDIT4, m_edit4);
DDV_MinMaxInt(pDX, m_edit4, 1, 30);
DDX_Text(pDX, IDC_EDIT5, m_edit5);
DDX_Text(pDX, IDC_EDIT6, m_save);
DDX_Text(pDX, IDC_EDIT7, m_startLong);
DDX_Text(pDX, IDC_EDIT8, m_startLat);
DDX_Text(pDX, IDC_EDIT9, m_endLong);
DDX_Check(pDX, IDC_CHECK1, m_checklimit);
DDX_Text(pDX, IDC_EDIT11, m_delta);
DDX_Text(pDX, IDC_EDIT12, m_deltaY);
DDX_Text(pDX, IDC_EDIT13, SHIFTED);
DDV_MinMaxInt(pDX, SHIFTED, 1, 15);
DDX_Text(pDX, IDC_EDIT14, m_coffiecient);
//}}AFX_DATA_MAP
}

BOOL CTaskingView::PreCreateWindow(CREATESTRUCT& cs)


{
// TODO: Modify the Window class or styles here by modifying
// the CREATESTRUCT cs

return CFormView::PreCreateWindow(cs);
}

void CTaskingView::OnInitialUpdate()
{


CFormView::OnInitialUpdate();
GetParentFrame()->RecalcLayout();
ResizeParentToFit();
}

/////////////////////////////////////////////////////////////////////////////
// CTaskingView printing

BOOL CTaskingView::OnPreparePrinting(CPrintInfo* pInfo)


{
// default preparation
return DoPreparePrinting(pInfo);
}

void CTaskingView::OnBeginPrinting(CDC* /*pDC*/, CPrintInfo* /*pInfo*/)


{
// TODO: add extra initialization before printing
}

void CTaskingView::OnEndPrinting(CDC* /*pDC*/, CPrintInfo* /*pInfo*/)


{
// TODO: add cleanup after printing
}

void CTaskingView::OnPrint(CDC* pDC, CPrintInfo* /*pInfo*/)


{
// TODO: add customized printing code here
}

/////////////////////////////////////////////////////////////////////////////
// CTaskingView diagnostics

#ifdef _DEBUG
void CTaskingView::AssertValid() const
{
CFormView::AssertValid();
}

void CTaskingView::Dump(CDumpContext& dc) const


{
CFormView::Dump(dc);
}

CTaskingDoc* CTaskingView::GetDocument() // non-debug version is inline


{
ASSERT(m_pDocument->IsKindOf(RUNTIME_CLASS(CTaskingDoc)));
return (CTaskingDoc*)m_pDocument;
}
#endif //_DEBUG

/////////////////////////////////////////////////////////////////////////////
// CTaskingView message handlers

void CTaskingView::OnCbonidle()
{
// TODO: Add your control notification handler code here

///////////////////////
// MY CODE STARTS HERE
///////////////////////

// Sync the variables with the dialog


UpdateData(TRUE);

///////////////////////
// MY CODE ENDS HERE
///////////////////////
}


void CTaskingView::OnCbthread1()
{
// TODO: Add your control notification handler code here

///////////////////////
// MY CODE STARTS HERE
///////////////////////

// Sync the variables with the dialog


UpdateData(TRUE);

// Get a pointer to the document


CTaskingDoc* pDocWnd = (CTaskingDoc*)GetDocument();
// Did we get a valid pointer?
ASSERT_VALID(pDocWnd);

// Suspend or start the spinner thread


pDocWnd->SuspendSpinner(0, m_bThread1);

///////////////////////
// MY CODE ENDS HERE
///////////////////////
}

void CTaskingView::OnCbthread2()
{
// TODO: Add your control notification handler code here

///////////////////////
// MY CODE STARTS HERE
///////////////////////

// Sync the variables with the dialog


UpdateData(TRUE);

// Get a pointer to the document


CTaskingDoc* pDocWnd = (CTaskingDoc*)GetDocument();
// Did we get a valid pointer?
ASSERT_VALID(pDocWnd);

// Suspend or start the spinner thread


pDocWnd->SuspendSpinner(1, m_bThread2);

///////////////////////
// MY CODE ENDS HERE
///////////////////////
}

void CTaskingView::OnDestroy()
{
CFormView::OnDestroy();

// TODO: Add your message handler code here

///////////////////////
// MY CODE STARTS HERE
///////////////////////

// Is the first thread running?


if (m_bThread1)
{
// Specify to stop the first thread
m_bThread1 = FALSE;
// Get a pointer to the document
CTaskingDoc* pDocWnd = (CTaskingDoc*)GetDocument();
// Did we get a valid pointer?
ASSERT_VALID(pDocWnd);

// Suspend the spinner thread


pDocWnd->SuspendSpinner(0, m_bThread1);
}
// Is the second thread running?
if (m_bThread2)
{
// Specify to stop the second thread


m_bThread2 = FALSE;
// Get a pointer to the document
CTaskingDoc* pDocWnd = (CTaskingDoc*)GetDocument();
// Did we get a valid pointer?
ASSERT_VALID(pDocWnd);

// Suspend the spinner thread


pDocWnd->SuspendSpinner(1, m_bThread2);
}

///////////////////////
// MY CODE ENDS HERE
///////////////////////
}

static void handle_error(char *format, ...)
{
	/* error-reporting body not reproduced in this listing */
}
static void *allocate_memory(size_t size)


{
void *p_memory=malloc(size);
if (p_memory==NULL){
handle_error ("Could not allocate %d bytes", size);
}
return (p_memory);
}

void read1 (Algorithm *p_alg, int nr_rows, int nr_columns,int col_quit,


int row_quit,int col_x,int row_x, int mode)
{

VDataset *pVds = (VDataset *)NULL;


int iRowIndex;
INT32 iColIndex;
char *pBufArray = (char *)NULL;
RasterLineSeg **ppRlsegArray = (RasterLineSeg **)NULL;
BOOLEAN process_vectors=TRUE;
char *pErrMsg = (char *)NULL;
UINT8 *pBuf = (UINT8 *)NULL;
UINT32 *p_buffer = NULL;

int cy;
int rval;
int band=1;
int tempx=col_x;
int tempy=row_x;
UINT8 TEMO;

if (p_alg == (Algorithm *)NULL) return;

rval = PE_initialise("Quake Finder 1.0");


if ( rval != ERS_SUCCESS) {
//exit(1);
handle_error ("Could not initialise bytes");
} // if //
pVds = vds_open_alg_rgbh(p_alg, &pErrMsg);
ASSERT(pVds != NULL);

if (pVds == (VDataset *)NULL) {


//exit(1);
handle_error ("Data set is NULL bytes");
} else {
// vds_read() will return a buffer of RGBh values which represent
// the output generated by the currently loaded algorithm, Read
// algorithm line by line


cy=0;
p_buffer=(UINT32 *)allocate_memory(nr_columns * sizeof(UINT32));
for (iRowIndex = 0;((iRowIndex<nr_rows)); ++iRowIndex) {

if (vds_read (pVds,(char **) &p_buffer,(RasterLineSeg **)NULL,FALSE) )


{
handle_error ("Data set is null 1 bytes");
} else {

for (iColIndex = 0;((iColIndex<nr_columns)); ++iColIndex) {


TEMO=p_buffer[iColIndex]>>DFLT_RED_SHIFT &0xff;

if ((iRowIndex==row_x)&&(iColIndex==col_x))
{

if (mode==1)

target[cy]=p_buffer[iColIndex]>>DFLT_RED_SHIFT &0xff;
else if (mode==2)

search[cy]=p_buffer[iColIndex]>>DFLT_RED_SHIFT &0xff;

cy=cy+1;
col_x++;

if ((col_x>=nr_columns)||(col_x>=col_quit))
{row_x++;
col_x=tempx;
if ((row_x>=row_quit))
return;
}
else if ((row_x>=row_quit))
return;

}
} //end for else if for Vds Open
}//end for loop icolindex
} // end for loop irowIndex

} // if pVds not null //


free(p_buffer);
vds_close(&pVds);

//**********************************************************
//********************statistic*****************************
//**********************************************************

}
void SetupAlgorithmDefaults1( LPTSTR lpsz,DatasetHeader *p_dsh, int alg_height, int alg_width, int mode)
///////////////////////////////////////////////////////////////////////////////////
// FUNCTION: SetupAlgorithmDefaults
//
// PURPOSE:
// This function sets up basic algorithm defaults. Most importantly it sets the
// dimensions (in pixels) of the output to be displayed by the algorithm.
///////////////////////////////////////////////////////////////////////////////////
{

Algorithm *m_pAlgorithm;

	//******************************************************************
	//read the algorithm file and check the algorithm file for the after image
	//******************************************************************
if (EXTENT_IS_ALG(lpsz)) {

if (eg_algorithm (&m_pAlgorithm, lpsz) !=ERS_SUCCESS) {


handle_error("Error read %s", lpsz);
}//end if
}
	else { // create an algorithm based on the image
double limit_clip_percentage;
double transform_clip_percentage;


if (p_dsh->u.ri.celltype==CT_UINT8) {
limit_clip_percentage =0.0;
transform_clip_percentage=0.0;
}
else
{//rescale
limit_clip_percentage=100.0;
//apply a 99% clip
transform_clip_percentage =99.0;
}//end if //else

//create the algorithms

if (ERM_create_algorithm (&m_pAlgorithm,lpsz,
limit_clip_percentage,

transform_clip_percentage)!=ERS_SUCCESS)

{
handle_error ("Unable to create algorithms: %s",ERS_error_text());
}
}

//m_pAlgorithm->canvas_width = (INT32)alg_width;
//m_pAlgorithm->canvas_height= (INT32)alg_height;

m_pAlgorithm->output_width = (INT32)alg_width;
m_pAlgorithm->output_height= (INT32)alg_height;
m_pAlgorithm->x_dpi = 72.0;
m_pAlgorithm->y_dpi = 72.0;
m_pAlgorithm->colour24flag = TRUE;
setup_color24(m_pAlgorithm);
m_pAlgorithm->enforce_aspect_ratio=1;
adjust_for_aspect(m_pAlgorithm,m_pAlgorithm->x_dpi,m_pAlgorithm->y_dpi);
m_pAlgorithm->do_compilation=FALSE;
m_pAlgorithm->processvectors=PV_NONE;

if (mode== 1)
p_alg_b4=m_pAlgorithm;
else if (mode ==2)
p_alg_af=m_pAlgorithm;

return;
}

void statistic(int m, int dec, int mode)
{
	double variance=0;
	double total=0;
	double max=0, min=0;
	if (mode==1)
	{
		max=target[0];
		min=target[0];
		for (int i=0;i <(m*m);i++)
			total=temp_target[i]+total;
		mean_b4 = total/(m*m);
	}
	else if (mode ==2)
	{
		max=search[0];
		min=search[0];
		for (int i=0;i <(m*m);i++)
			total=temp_search[i]+total;
		mean_af = total/(m*m);
	}

for (int i=0;i<m*m;i++)


{
if(mode ==1)
variance = variance + pow((temp_target[i]-mean_b4),2);
else if (mode ==2)
variance = variance + pow((temp_search[i]-mean_af),2);
}

if(mode ==1)
variance_b4 = variance;
else if (mode ==2)
variance_af = variance;
}

// a for the topleft coordinate,


// b bottom right coordinate
// x for the x or y coordinate needs to convert to ll
// cellinfo for pixel size eg. 15 metres double b,

int conv_ll_xy(double a, double a1, int cellinfo)


{
return (( a1-a)/cellinfo);
}

// a for the topleft coordinate, b bottom right coordinate


// x for the x or y coordinate needs to convert to ll
// cellinfo for pixel size eg. 15 metres
double conv_xy_ll(double a, int x , int cellinfo)
{
return (a + (x*cellinfo));
}

void CTaskingView::OnBrowse1()
{
// TODO: Add your control notification handler code here

UpdateData();
CFileDialog dlg(TRUE, "ers","*.ers");
if (dlg.DoModal() ==IDOK){
m_edit1=dlg.GetPathName();
CFile file;
updatelpsz1(m_edit1);

m_startLong=b4easting;
m_startLat=b4northing;
m_endLong=b4easting + (b4cellsizex*100);
m_endLat=b4northing+ (b4cellsizey*100);
}

UpdateData(FALSE);
}

void CTaskingView::OnBrowse()
{
// TODO: Add your control notification handler code here
UpdateData();
CFileDialog dlg(TRUE, "ers","*.ers");
if (dlg.DoModal() ==IDOK){
m_edit2=dlg.GetPathName();
CFile file;
updatelpsz2(m_edit2);
}


UpdateData(FALSE);
}

void updatelpsz1(CString m_edit1)


{
DatasetHeader *p_b4Dsh ;

//Get the header info

//**********************************************************
//**********************************************************

//... modify lpsz as much as you want


lpsz1 = new TCHAR[m_edit1.GetLength()+1];
_tcscpy(lpsz1, m_edit1);
//before images
if (eg_dshdr (&p_b4Dsh, lpsz1) !=ERS_SUCCESS){
handle_error ("Unable to get the image header before EQ:%s",
ERS_error_text());
}//end if
//before images
nr_rows_b4=p_b4Dsh->u.ri.nr_lines;
nr_columns_b4=p_b4Dsh->u.ri.cells_per_line;
b4easting =p_b4Dsh->u.ri.origin.en.eastings;
b4northing=p_b4Dsh->u.ri.origin.en.northings;
b4cellsizex= p_b4Dsh->u.ri.cellinfo.x;
b4cellsizey= p_b4Dsh->u.ri.cellinfo.y*-1;
SetupAlgorithmDefaults1(lpsz1,p_b4Dsh,nr_rows_b4,nr_columns_b4,1);
}

void updatelpsz2(CString m_edit2)


{
DatasetHeader *p_afDsh;
//... modify lpsz as much as you want
lpsz2 = new TCHAR[m_edit2.GetLength()+1];
_tcscpy(lpsz2, m_edit2);
//after images
if (eg_dshdr (&p_afDsh, lpsz2) !=ERS_SUCCESS){
handle_error ("Unable to get the image header after EQ:%s",
ERS_error_text());
}//end if

nr_columns_af=p_afDsh->u.ri.cells_per_line;
nr_rows_af=p_afDsh->u.ri.nr_lines;
afeasting =p_afDsh->u.ri.origin.en.eastings;
afnorthing=p_afDsh->u.ri.origin.en.northings;
afcellsizex= p_afDsh->u.ri.cellinfo.x;
afcellsizey= p_afDsh->u.ri.cellinfo.y*-1;
datum=p_afDsh->cspace.datum;
projection=p_afDsh->cspace.projection;
rotate=p_afDsh->cspace.rotation;

CoordSysType CoordType=p_afDsh->cspace.coordsystype;
SetupAlgorithmDefaults1(lpsz2,p_afDsh,nr_rows_af,nr_columns_af,2);
}

void CTaskingView::OnSave()
{
// TODO: Add your control notification handler code here
UpdateData();
CFileDialog dlg(FALSE, "erv","*.erv");
if (dlg.DoModal() ==IDOK){
m_save=dlg.GetPathName();
int x=m_save.GetLength();
m_save1=m_save.Left(x-4);
CFile file;
}
UpdateData(FALSE);
}


void CTaskingView::OnCheck1()
{
// TODO: Add your control notification handler code here

if (m_checklimit==0)
{
GetDlgItem(IDC_EDIT7)->EnableWindow(TRUE);
GetDlgItem(IDC_EDIT8)->EnableWindow(TRUE);
GetDlgItem(IDC_EDIT9)->EnableWindow(TRUE);
GetDlgItem(IDC_EDIT10)->EnableWindow(TRUE);
GetDlgItem(IDC_STATICSL)->EnableWindow(TRUE);
GetDlgItem(IDC_STATICSLA)->EnableWindow(TRUE);
GetDlgItem(IDC_STATICELA)->EnableWindow(TRUE);
GetDlgItem(IDC_STATICEL)->EnableWindow(TRUE);

}
if (m_checklimit==1)
{

GetDlgItem(IDC_EDIT7)->EnableWindow(FALSE);
GetDlgItem(IDC_EDIT8)->EnableWindow(FALSE);
GetDlgItem(IDC_EDIT9)->EnableWindow(FALSE);
GetDlgItem(IDC_EDIT10)->EnableWindow(FALSE);
GetDlgItem(IDC_STATICSL)->EnableWindow(FALSE);
GetDlgItem(IDC_STATICSLA)->EnableWindow(FALSE);
GetDlgItem(IDC_STATICELA)->EnableWindow(FALSE);
GetDlgItem(IDC_STATICEL)->EnableWindow(FALSE);
}
// TODO: Add your control notification handler code here
UpdateData();
//UpdateData(FALSE);
}

void CTaskingView::OnCancel2()
{
// TODO: Add your control notification handler code here

exit(0);
}

void CTaskingView::OnOk()
{

//*************The starting of all the proccess*******************


//***************that check for all the information need are *****
//******************enough to start the current process***********
//****************************************************************

int shift_col_limit=0;
int shift_row_limit=0;

int test_value=0;
MSG Message;

if (SHIFTED>m_edit3)
{
UpdateData(TRUE);
MessageBox("Invalid Shift Value Enter","ERROR",MB_OK);
test_value=1;
}

	if ((m_edit1=="")&&(m_edit2=="")&&(m_save=="")&&(m_delta=="")&&(m_deltaY=="")&&(test_value==0)&&(m_coffiecient==""))
{
MessageBox("Please insert the file name","ERROR",MB_OK);
}
else
{


stop=1;
m_nTimer=SetTimer(1,100,NULL);
ASSERT(m_nTimer!=0);

UpdateData();

updatelpsz1(m_edit1);
updatelpsz2(m_edit2);

GetDlgItem(IDOK2)->EnableWindow(FALSE);
GetDlgItem(ID_BROWSE1)->EnableWindow(FALSE);
GetDlgItem(IDC_SAVE)->EnableWindow(FALSE);

GetDlgItem(ID_BROWSE)->EnableWindow(FALSE);
GetDlgItem(IDC_EDIT1)->EnableWindow(FALSE);
GetDlgItem(IDC_EDIT2)->EnableWindow(FALSE);
GetDlgItem(IDC_EDIT3)->EnableWindow(FALSE);
GetDlgItem(IDC_EDIT4)->EnableWindow(FALSE);
GetDlgItem(IDC_EDIT5)->EnableWindow(FALSE);
GetDlgItem(IDC_EDIT6)->EnableWindow(FALSE);
GetDlgItem(IDC_EDIT11)->EnableWindow(FALSE);
GetDlgItem(IDC_EDIT12)->EnableWindow(FALSE);
GetDlgItem(IDC_EDIT14)->EnableWindow(FALSE);

//**********************************************************
//**********************************************************
double b4easting_end=b4easting+(nr_columns_b4*b4cellsizex);
double afeasting_end=afeasting+(nr_columns_af*afcellsizex);
double b4northing_end=b4northing+(nr_rows_b4*b4cellsizey);
double afnorthing_end=afnorthing+(nr_rows_af*afcellsizey);

	if (b4easting_end>=afeasting_end)	// i.e. the before-image easting sets shift_x_col
	{
		// i.e. shift_x_col ends at the after-image easting
		if (b4northing_end<=afnorthing_end)
		{
			ll_col_end=afeasting_end;	// after image easting
			ll_row_end=afnorthing_end;	// after image northing
		}
		else
		{
			ll_col_end=afeasting_end;	// after image easting
			ll_row_end=b4northing_end;	// before image northing
		}
	}
	else
	{
		if (b4northing_end<=afnorthing_end)
		{
			ll_col_end=b4easting_end;	// before image easting
			ll_row_end=afnorthing_end;	// after image northing
		}
		else
		{
			ll_col_end=b4easting_end;	// before image easting
			ll_row_end=b4northing_end;	// before image northing
		}
	}

//**************The source below control the *********************


//***************out put file for all the data *******************
//******************that needs to be process**********************
//****************************************************************
FILE *outputfile; // a pointer to the output file
outputfile = fopen(m_save, "w"); // open text2.txt
fprintf(outputfile, "DatasetHeader Begin \n");
fprintf(outputfile, "\tVersion = \"6.2\" \n ");
fprintf(outputfile, "\tName = \"test2.erv\" \n ");
fprintf(outputfile, "\tLastUpdated = Sun Jul 21 17:38:54 GMT 2002\n");
fprintf(outputfile, "\tDataSetType = None \n");
fprintf(outputfile, "\tDataType = Vector \n");


fprintf(outputfile, "\tByteOrder = LSBFirst \n");


fprintf(outputfile, "\tCoordinateSpace Begin\n");
fprintf(outputfile, "\t\t Datum = \"%s\"\n",datum);
fprintf(outputfile, "\t\t Projection= \"%s\"\n",projection);
fprintf(outputfile, "\t\t CoordinateType = EN \n");
fprintf(outputfile, "\t\t Units = \"METERS\" \n");
fprintf(outputfile, "\t\t Rotation = 0:0:0.0\n");
fprintf(outputfile, "\tCoordinateSpace End\n");

fprintf(outputfile, "\tVectorInfo Begin\n");


fprintf(outputfile, "\t\t Type = ERVEC \n");
fprintf(outputfile, "\t\t FileFormat = ASCII\n");
fprintf(outputfile, "\t\t Extents Begin\n");
fprintf(outputfile, "\t\t\t TopLeftCorner Begin \n");
fprintf(outputfile, "\t\t\t\t Eastings = %lf \n",b4easting);
fprintf(outputfile, "\t\t\t\t Northings = %lf \n",b4northing);
fprintf(outputfile, "\t\t\tTopLeftCorner End\n");
fprintf(outputfile, "\t\t\tBottomRightCorner Begin\n");
fprintf(outputfile, "\t\t\t\tEastings = %lf\n",ll_col_end);
fprintf(outputfile, "\t\t\t\t Northings = %lf\n",ll_row_end);
fprintf(outputfile, "\t\t\tBottomRightCorner End\n");
fprintf(outputfile, "\t\tExtents End\n");
fprintf(outputfile, "\tVectorInfo End\n");
fprintf(outputfile, "DatasetHeader End\n");
fclose(outputfile);

outputfile1 = fopen(m_save1, "w"); // open the vector data output file
testfile = fopen("c:\\crossvalue.txt","w"); // open the cross-correlation log file

CString pszFileName = m_delta1,pszFileNameY = m_deltaY1,pszFileNameCof = m_coff1;


CFile myFile,myFileY,myFileCof;
CFileException fileException,fileExceptionY,fileExceptionCof;

if ( !myFileCof.Open( pszFileNameCof, CFile::modeCreate |
CFile::modeReadWrite, &fileExceptionCof ) )
{
TRACE( "Can't open file %s, error = %u\n",
pszFileNameCof, fileExceptionCof.m_cause );
}

if ( !myFile.Open( pszFileName, CFile::modeCreate |
CFile::modeReadWrite, &fileException ) )
{
TRACE( "Can't open file %s, error = %u\n",
pszFileName, fileException.m_cause );
}

if ( !myFileY.Open( pszFileNameY, CFile::modeCreate |
CFile::modeReadWrite, &fileExceptionY ) )
{
TRACE( "Can't open file %s, error = %u\n",
pszFileNameY, fileExceptionY.m_cause );
}

int n=m_edit4, shift_col_af, shift_row_af;


int m=m_edit3, shift_col_b4, shift_row_b4;
double t=m_edit5;

//*********This if block computes the pixel offset of the*********
//*********before image relative to the after image***************
//****************************************************************

if ((b4easting>=afeasting)&&(b4easting<=afeasting_end))
{
if ((b4northing<=afnorthing)&&(b4northing<=afnorthing_end))
{
shift_col_b4=0; // before-image easting offset
shift_row_b4=0; // before-image northing offset
}
else
{
shift_col_b4=0; // before-image easting offset
shift_row_b4=conv_ll_xy(b4northing,afnorthing,b4cellsizey);
}
}
else
{
if ((b4northing<=afnorthing)&&(b4northing<=afnorthing_end))
{

shift_col_b4=fabs(conv_ll_xy(b4easting,afeasting, b4cellsizex));
shift_row_b4=0; //before-image northing offset

}
else
{

shift_col_b4=fabs(conv_ll_xy(b4easting,afeasting, b4cellsizex));
shift_row_b4=fabs(conv_ll_xy(afnorthing,b4northing, b4cellsizey));

}
}

//****************************************************************

mid_x=m;
mid_y=m;

balance_x=(nr_rows_b4 % mid_x)/2;
balance_y=(nr_columns_b4 % mid_y)/2;
int nr_rows_limit,nr_cols_limit;

//*********This if block applies the user-specified***************
//*********search window limits, reducing the search area*********
//*********and the memory needed for fast data access*************
if (m_checklimit==1)
{

shift_row_limit=(m_startLong-b4easting)/b4cellsizex;
shift_col_limit=(m_startLat-b4northing)/b4cellsizey;
nr_rows_limit=(m_endLong-b4easting)/b4cellsizex;
nr_cols_limit=(m_endLat-b4northing)/b4cellsizey;
}
else
{
nr_rows_limit=nr_rows_b4;
nr_cols_limit=nr_columns_b4;
}

shift_row_b4=shift_row_b4+mid_x;
shift_col_b4=shift_col_b4+mid_y;

part_x=shift_row_b4;
part_y=shift_col_b4;

//****************************************************************
int a=0,b=0;
int mark;
double max,x,y;

//***********Start reading the image data and keep****************
//***********the DN values in memory for fast access**************

int counti=0;

int max_col=(ll_col_end-b4easting)/b4cellsizex;
int max_row=(ll_row_end-b4northing)/b4cellsizey;

shift_row_b4=0;
shift_col_b4=0;
shift_row_af=0;
shift_col_af=0;

int m1=n;
int ctr_tt=0;
int ctr_tt_af=0;

int point_row_b4=shift_row_b4;
int point_col_b4=shift_col_b4;
int point_row_af=0,point_col_af=0;
int ctr_a=0 ,ctr_b=0;
int yh=0;
int s_p=0;
int s_p1=0;

{
mode =1; // set mode to the target (before-image) array
int cx = shift_col_b4+nr_columns_b4; // column index at which reading stops
int cy = shift_row_b4+nr_rows_b4;    // row index at which reading stops
read1(p_alg_b4,nr_rows_b4,nr_columns_b4,cx,cy,shift_col_b4, shift_row_b4,mode);

mode=2;
int cx1 = shift_col_af +nr_columns_af;
int cy1 = shift_row_af +nr_rows_af;
read1(p_alg_af,nr_rows_af,nr_columns_af,cx1,cy1,shift_col_af, shift_row_af,mode);

ctr_b=ctr_b+m1;
s_p=m/2;
s_p1=m/2;

//****************************************************************
//*******************************Before image*********************
//*********Start reading the image data for the target block******
//****************************************************************
for (int z_b4=0; (z_b4<(nr_columns_b4*nr_rows_b4)
&&point_col_b4<nr_columns_b4&&point_row_b4+m<=nr_rows_b4); z_b4++)
{

if (point_row_af+m>=nr_rows_af)
break;

ctr_tt=0;
for (int i=0; i<=nr_rows_b4;i++)

{
for (int j=0;j<=nr_columns_b4;j++)
{
if ((j>=0+s_p)&&(j<m+s_p)&&(i>0+s_p1)&&(i<=m+s_p1)&&(ctr_tt!=(m*m)))
{

temp_target[ctr_tt]=target[j+((i-1)*nr_columns_b4)];
ctr_tt++;
}

}// for loop j

}// for loop i


point_col_b4=shift_col_b4+s_p;
point_row_b4=shift_row_b4+s_p1;
mode=1;
double Tmn=0;
double Smn=0;
statistic(m,1,mode);

//****************************************************************
//**************************After Image***************************
//*********Start reading the image data for the search block******
//****************************************************************
mode =2;
shift_col_af=0;
shift_row_af=0;

//*****************************************************************************
//********************set the actual coordinates for the after image***********
x=(((point_row_b4-(mid_x/2))*b4cellsizex)+b4easting);
y=(((point_col_b4-(mid_x/2))*b4cellsizey)+b4northing);
shift_row_af=conv_ll_xy(afeasting,x,afcellsizex);
shift_col_af=conv_ll_xy(afnorthing,y,afcellsizey);

//*****************************************************************************
//*****************************************************************************
int s_p_af=shift_col_af;
int s_p1_af=shift_row_af;
int l_c=0,l_r=0;
int jum=0;
int fc=0;
int harap=0;

for (int z_af=0; (z_af<(nr_columns_af*nr_rows_af)


&&point_col_af<nr_columns_af
&&point_row_af+m<=nr_rows_af)
&&(harap<(m1-m+1));
z_af++)
{
ctr_tt=0;
CProgressCtrl* pBar = (CProgressCtrl*) GetDlgItem(IDC_PROGRESS2);
pBar->SetPos(m_nCount * 100/nMaxCount);

for (int i=0; i<=nr_rows_af;i++)


{
for (int j=0;j<=nr_columns_af;j++)
{
if
((j>=0+s_p_af)&&(j<m+s_p_af)&&(i>0+s_p1_af)&&(i<=m+s_p1_af)&&(ctr_tt!=(m*m)))
{

temp_search[ctr_tt]=search[j+((i-1)*nr_columns_af)];
ctr_tt++;
}

}// for loop j

}// for loop i

point_col_af=s_p_af;
point_row_af=s_p1_af;

///////////////////////////////////////////////////////
//////////////////////begin of//////////////////////////
////////////the actual calculation is here /////////////
///////////////////////////////////////////////////////

if (::PeekMessage(&Message,NULL,0,0,PM_REMOVE))
{

::TranslateMessage(&Message);

::DispatchMessage(&Message);
}

mode=2;
statistic(m,1,mode);
Smn=0;
Tmn=0;
double sumProduct=0;
double co_variance=0;

for (int ctr=0;ctr<m*m;ctr++)


{
Tmn=(temp_target[ctr]);
Smn=(temp_search[ctr]);
sumProduct=sumProduct+((Tmn-mean_b4)*(Smn-mean_af)); // accumulate the covariance
}//for loop ctr

co_variance=sumProduct;//(m*m)-mean_af*mean_b4;

cross_cof[jum].cross_cof =((co_variance)/(sqrt(variance_b4)*sqrt(variance_af)));
cross_cof[jum].easting_af =point_col_af;
cross_cof[jum].northing_af =point_row_af;
cross_cof[jum].easting_b4 = point_col_b4;
cross_cof[jum].northing_b4 =point_row_b4;

double te=cross_cof[jum].cross_cof;
te=fabs(te);

if (te==1)
{
mark=jum;
}

jum++;
///////////////////////////////////////////////////////
//////////////////////end of////////////////////////////
////////////the actual calculation is here /////////////
///////////////////////////////////////////////////////
s_p_af=s_p_af+1;

if ((s_p_af>(m1-m+1+shift_col_af))||(s_p_af==nr_columns_af)||((nr_columns_af-s_p_af)<m))
{
s_p1_af++;
s_p_af=shift_col_af;
harap++;
}
}// for loop z.....
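The quantity accumulated in the loop above is the normalised cross-correlation coefficient: the covariance of the two windows divided by the product of their standard deviations. A self-contained sketch of the same computation (computing the means and variances inline rather than via the program's statistic() routine; the function name is illustrative):

```cpp
#include <cmath>
#include <cstddef>

// Sketch of the normalised cross-correlation computed above: r lies in
// [-1, 1] and equals 1 when the two windows are identical up to a linear
// gain and offset in brightness.
double normalisedCrossCorrelation(const double* t, const double* s, std::size_t n)
{
    double meanT = 0.0, meanS = 0.0;
    for (std::size_t i = 0; i < n; ++i) { meanT += t[i]; meanS += s[i]; }
    meanT /= n; meanS /= n;

    double cov = 0.0, varT = 0.0, varS = 0.0;
    for (std::size_t i = 0; i < n; ++i) {
        cov  += (t[i] - meanT) * (s[i] - meanS);
        varT += (t[i] - meanT) * (t[i] - meanT);
        varS += (s[i] - meanS) * (s[i] - meanS);
    }
    return cov / (std::sqrt(varT) * std::sqrt(varS));
}
```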

int hg=0;
mark=-2;
max=0;
for (hg=1;hg <(m*m-1);hg++)
{
double semen=fabs(cross_cof[hg].cross_cof);
if (semen<=1.0000009)
{
if ((semen>=max)&&(cross_cof[hg].cross_cof>=-1)
&&(cross_cof[hg].cross_cof<=1.1)
&&(semen>=m_edit5))
{
max=semen;
mark =hg;
}
if (semen==1)
{
mark=hg;
break;
}
}
}
if (mark>-1)
{
EQ.cross_cof=cross_cof[mark].cross_cof;
EQ.easting_b4=(cross_cof[mark].easting_b4*b4cellsizex)+b4easting;
EQ.easting_af=(cross_cof[mark].easting_af*afcellsizex)+afeasting;
EQ.northing_b4=(cross_cof[mark].northing_b4*b4cellsizey)+b4northing;
EQ.northing_af=(cross_cof[mark].northing_af*afcellsizey)+afnorthing;
EQ.shift_x =EQ.easting_b4-EQ.easting_af;
EQ.shift_y =EQ.northing_b4-EQ.northing_af;

if ((EQ.shift_x==0)&&(EQ.shift_y==0))
fprintf(outputfile1,"point(,%lf,%lf,-1,-1,-1,0).\n",
(EQ.easting_b4+(2*b4cellsizex)+b4cellsizex/2),(EQ.northing_b4+(2*b4cellsizey)+b4cellsizey/2));
else if ((EQ.shift_x<b4cellsizex*20)&&(EQ.shift_y<b4cellsizey*20*-1))

fprintf(outputfile1,"poly(,2,[%lf,%lf,%lf,%lf],1,2,0,0,0,0,255,128,128,0).\n",
(EQ.easting_b4+(2*b4cellsizex)+b4cellsizex/2),(EQ.northing_b4+(2*b4cellsizey)+b4cellsizey/2),
(EQ.easting_af+(2*afcellsizex)+afcellsizex/2),(EQ.northing_af+(2*afcellsizey)+afcellsizey/2));

fprintf(testfile, " %lf\t\t\t%lf\t\t%lf\t\t%lf\t\t%lf\n",


EQ.easting_b4,EQ.northing_b4,EQ.easting_af,EQ.northing_af,EQ.cross_cof );

deltaX[0]=EQ.shift_x;
deltaY[0]=EQ.shift_y;

cof[0]= (EQ.cross_cof*100)+128; // map the coefficient into byte range
myFile.GetPosition();
myFile.Write(deltaX,1);
myFileY.GetPosition();
myFileY.Write(deltaY,1);
myFileCof.GetPosition();
myFileCof.Write(cof,1);
}
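The pixel-to-map conversion and the before-minus-after displacement used in the block above can be sketched in isolation (function names here are illustrative, not the program's):

```cpp
// Illustrative sketch: a column (or row) index is converted to an easting
// (or northing) using the cell size and the grid origin; the terrain
// displacement is then the before-minus-after difference in map units.
double pixelToMap(int index, double cellSize, double origin)
{
    return index * cellSize + origin;
}

double displacement(int idxBefore, int idxAfter, double cellSize, double origin)
{
    return pixelToMap(idxBefore, cellSize, origin)
         - pixelToMap(idxAfter,  cellSize, origin);
}
```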
else
{

deltaX[0]=0; // null cell value for points with no acceptable match
deltaY[0]=0;
cof[0]=0;

myFile.GetPosition();
myFile.Write(deltaX,1);
myFileY.GetPosition();
myFileY.Write(deltaY,1);
myFileCof.GetPosition();
myFileCof.Write(cof,1);

}
if (::PeekMessage(&Message,NULL,0,0,PM_REMOVE))
{

::TranslateMessage(&Message);

::DispatchMessage(&Message);
}

nr_x++;
s_p=s_p+SHIFTED;
if ((s_p==nr_columns_b4)||((nr_columns_b4-s_p)<m))
{
s_p=0;
nr_y++;
s_p1=s_p1+SHIFTED;
}
malloc_or_die(1);
m_nCount=(point_row_b4*1000/(nr_rows_b4));
}// for loop z

}//for loop ....shift_col_b4//for loop ...for before image row...

shift_row_b4+=m1;

}// loop for before image col...

fclose(outputfile1);
fclose(testfile);

nr_x=nr_x/nr_y;
nr_x--;
nr_y--;

FILE *headerdeltax; // a pointer to the output file


headerdeltax = fopen(m_delta, "w"); // open the delta-X header file
fprintf(headerdeltax, "DatasetHeader Begin \n");
fprintf(headerdeltax, "\tVersion = \"6.2\" \n ");
fprintf(headerdeltax, "\tName = \"%s\" \n ",m_delta);
fprintf(headerdeltax, "\tLastUpdated = Sun Jul 21 17:38:54 GMT 2002\n");
fprintf(headerdeltax, "\tDataSetType = ERStorage \n");
fprintf(headerdeltax, "\tDataType = Raster \n");
fprintf(headerdeltax, "\tByteOrder = %s \n",b_order );
fprintf(headerdeltax, "\tCoordinateSpace Begin\n");
fprintf(headerdeltax, "\t\t Datum = \"%s\"\n",datum);
fprintf(headerdeltax, "\t\t Projection= \"%s\"\n",projection);
fprintf(headerdeltax, "\t\t CoordinateType = EN \n");
fprintf(headerdeltax, "\t\t Rotation = 0:0:0.0\n");
fprintf(headerdeltax, "\tCoordinateSpace End\n");
fprintf(headerdeltax, "\tRasterInfo Begin\n");
fprintf(headerdeltax, "\t\t CellType = %s \n",CellTySign);
fprintf(headerdeltax, "\t\t NullCellValue = 0\n");
fprintf(headerdeltax, "\t\t CellInfo Begin\n");
fprintf(headerdeltax, "\t\t\t Xdimension = %lf\n", b4cellsizex);
fprintf(headerdeltax, "\t\t\t Ydimension = %lf\n", b4cellsizey*-1);
fprintf(headerdeltax, "\t\t CellInfo End\n");
fprintf(headerdeltax, "\t\t NrOfLines = %d\n",nr_x);
fprintf(headerdeltax, "\t\t NrOfCellsPerLine = %d \n",nr_y);
fprintf(headerdeltax, "\t\t RegistrationCoord Begin \n");
fprintf(headerdeltax, "\t\t\t\t Eastings = %lf \n",b4easting);
fprintf(headerdeltax, "\t\t\t\t Northings = %lf \n",b4northing);
fprintf(headerdeltax, "\t\t RegistrationCoord End \n");
fprintf(headerdeltax, "\t\t NrOfBands = 1 \n");
fprintf(headerdeltax, "\t\t BandId Begin \n");
fprintf(headerdeltax, "\t\t\t\tValue = \" Delta X \" \n");
fprintf(headerdeltax, "\t\t BandId End\n");
fprintf(headerdeltax, "\t\t\tRegionInfo Begin\n");
fprintf(headerdeltax, "\t\t\t\tType = Polygon \n");
fprintf(headerdeltax, "\t\t\t\tRegionName = \"All\"\n ");
fprintf(headerdeltax, "\t\t\t\tSourceDataset = \"%s\"\n",m_delta1);
fprintf(headerdeltax, "\t\t\t\tRGBcolour Begin\n");
fprintf(headerdeltax, "\t\t\t\t\tRed = 65535\n");
fprintf(headerdeltax, "\t\t\t\t\tGreen = 65535\n");

fprintf(headerdeltax, "\t\t\t\t\tBlue = 65535\n");


fprintf(headerdeltax, "\t\t\t\tRGBcolour End\n");
fprintf(headerdeltax, "\t\t\tRegionInfo End\n");
fprintf(headerdeltax, "\tRasterInfo End\n");
fprintf(headerdeltax, "DatasetHeader End\n");
fclose(headerdeltax);

FILE *headercoff; // a pointer to the output file


headercoff = fopen(m_coffiecient, "w"); // open the coefficient header file
fprintf(headercoff, "DatasetHeader Begin \n");
fprintf(headercoff, "\tVersion = \"6.2\" \n ");
fprintf(headercoff, "\tName = \"%s\" \n ",m_coffiecient);
fprintf(headercoff, "\tLastUpdated = Sun Jul 21 17:38:54 GMT 2002\n");
fprintf(headercoff, "\tDataSetType = ERStorage \n");
fprintf(headercoff, "\tDataType = Raster \n");
fprintf(headercoff, "\tByteOrder = %s \n",b_order );
fprintf(headercoff, "\tCoordinateSpace Begin\n");
fprintf(headercoff, "\t\t Datum = \"%s\"\n",datum);
fprintf(headercoff, "\t\t Projection= \"%s\"\n",projection);
fprintf(headercoff, "\t\t CoordinateType = EN \n");
fprintf(headercoff, "\t\t Rotation = 0:0:0.0\n");
fprintf(headercoff, "\tCoordinateSpace End\n");
fprintf(headercoff, "\tRasterInfo Begin\n");
fprintf(headercoff, "\t\t CellType = %s \n",CellTy);
fprintf(headercoff, "\t\t NullCellValue = 0\n");
fprintf(headercoff, "\t\t CellInfo Begin\n");
fprintf(headercoff, "\t\t\t Xdimension = %lf\n", b4cellsizex);
fprintf(headercoff, "\t\t\t Ydimension = %lf\n", b4cellsizey*-1);
fprintf(headercoff, "\t\t CellInfo End\n");
fprintf(headercoff, "\t\t NrOfLines = %d\n",nr_x);
fprintf(headercoff, "\t\t NrOfCellsPerLine = %d \n",nr_y);
fprintf(headercoff, "\t\t RegistrationCoord Begin \n");
fprintf(headercoff, "\t\t\t\t Eastings = %lf \n",b4easting);
fprintf(headercoff, "\t\t\t\t Northings = %lf \n",b4northing);
fprintf(headercoff, "\t\t RegistrationCoord End \n");
fprintf(headercoff, "\t\t NrOfBands = 1 \n");
fprintf(headercoff, "\t\t BandId Begin \n");
fprintf(headercoff, "\t\t\t\tValue = \"Coefficient\" \n");
fprintf(headercoff, "\t\t BandId End\n");
fprintf(headercoff, "\t\t\tRegionInfo Begin\n");
fprintf(headercoff, "\t\t\t\tType = Polygon \n");
fprintf(headercoff, "\t\t\t\tRegionName = \"All\"\n ");
fprintf(headercoff, "\t\t\t\tSourceDataset = \"%s\"\n",m_coff1);
fprintf(headercoff, "\t\t\t\tRGBcolour Begin\n");
fprintf(headercoff, "\t\t\t\t\tRed = 65535\n");
fprintf(headercoff, "\t\t\t\t\tGreen = 65535\n");
fprintf(headercoff, "\t\t\t\t\tBlue = 65535\n");
fprintf(headercoff, "\t\t\t\tRGBcolour End\n");
fprintf(headercoff, "\t\t\tRegionInfo End\n");
fprintf(headercoff, "\tRasterInfo End\n");
fprintf(headercoff, "DatasetHeader End\n");
fclose(headercoff);

FILE *headerdeltay; // a pointer to the output file


headerdeltay = fopen(m_deltaY, "w"); // open the delta-Y header file
fprintf(headerdeltay, "DatasetHeader Begin \n");
fprintf(headerdeltay, "\tVersion = \"6.2\" \n ");
fprintf(headerdeltay, "\tName = \"%s\" \n ",m_deltaY);
fprintf(headerdeltay, "\tLastUpdated = Sun Jul 21 17:38:54 GMT 2002\n");
fprintf(headerdeltay, "\tDataSetType = ERStorage \n");
fprintf(headerdeltay, "\tDataType = Raster \n");
fprintf(headerdeltay, "\tByteOrder = %s \n",b_order );
fprintf(headerdeltay, "\tCoordinateSpace Begin\n");
fprintf(headerdeltay, "\t\t Datum = \"%s\"\n",datum);
fprintf(headerdeltay, "\t\t Projection= \"%s\"\n",projection);
fprintf(headerdeltay, "\t\t CoordinateType = EN \n");
fprintf(headerdeltay, "\t\t Rotation = 0:0:0.0\n");
fprintf(headerdeltay, "\tCoordinateSpace End\n");
fprintf(headerdeltay, "\tRasterInfo Begin\n");
fprintf(headerdeltay, "\t\t CellType = %s \n",CellTySign);
fprintf(headerdeltay, "\t\t NullCellValue = 0\n");
fprintf(headerdeltay, "\t\t CellInfo Begin\n");
fprintf(headerdeltay, "\t\t\t Xdimension = %lf\n", b4cellsizex);

fprintf(headerdeltay, "\t\t\t Ydimension = %lf\n", b4cellsizey*-1);


fprintf(headerdeltay, "\t\t CellInfo End\n");
fprintf(headerdeltay, "\t\t NrOfLines = %d\n",nr_x);
fprintf(headerdeltay, "\t\t NrOfCellsPerLine = %d \n",nr_y);
fprintf(headerdeltay, "\t\t RegistrationCoord Begin \n");
fprintf(headerdeltay, "\t\t\t\t Eastings = %lf \n",b4easting);
fprintf(headerdeltay, "\t\t\t\t Northings = %lf \n",b4northing);
fprintf(headerdeltay, "\t\t RegistrationCoord End \n");
fprintf(headerdeltay, "\t\t NrOfBands = 1 \n");
fprintf(headerdeltay, "\t\t BandId Begin \n");
fprintf(headerdeltay, "\t\t\t\tValue = \"Delta Y \" \n");
fprintf(headerdeltay, "\t\t BandId End\n");
fprintf(headerdeltay, "\t\t\tRegionInfo Begin\n");
fprintf(headerdeltay, "\t\t\t\tType = Polygon \n");
fprintf(headerdeltay, "\t\t\t\tRegionName = \"All\"\n ");
fprintf(headerdeltay, "\t\t\t\tSourceDataset = \"%s\"\n",m_deltaY1);
fprintf(headerdeltay, "\t\t\t\tRGBcolour Begin\n");
fprintf(headerdeltay, "\t\t\t\t\tRed = 65535\n");
fprintf(headerdeltay, "\t\t\t\t\tGreen = 65535\n");
fprintf(headerdeltay, "\t\t\t\t\tBlue = 65535\n");
fprintf(headerdeltay, "\t\t\t\tRGBcolour End\n");
fprintf(headerdeltay, "\t\t\tRegionInfo End\n");
fprintf(headerdeltay, "\tRasterInfo End\n");
fprintf(headerdeltay, "DatasetHeader End\n");
fclose(headerdeltay);

//****************************************************************
//****************************************************************
//****************************************************************

CProgressCtrl* pBar = (CProgressCtrl*) GetDlgItem(IDC_PROGRESS2);


pBar->SetPos(1000);
GetDlgItem(IDOK2)->EnableWindow(TRUE);
GetDlgItem(ID_BROWSE1)->EnableWindow(TRUE);
GetDlgItem(IDC_SAVE)->EnableWindow(TRUE);

GetDlgItem(ID_BROWSE)->EnableWindow(TRUE);

GetDlgItem(IDC_EDIT1)->EnableWindow(TRUE);
GetDlgItem(IDC_EDIT2)->EnableWindow(TRUE);
GetDlgItem(IDC_EDIT3)->EnableWindow(TRUE);
GetDlgItem(IDC_EDIT4)->EnableWindow(TRUE);
GetDlgItem(IDC_EDIT5)->EnableWindow(TRUE);
GetDlgItem(IDC_EDIT6)->EnableWindow(TRUE);
GetDlgItem(IDC_EDIT11)->EnableWindow(TRUE);
GetDlgItem(IDC_EDIT12)->EnableWindow(TRUE);
GetDlgItem(IDC_EDIT14)->EnableWindow(TRUE);

UpdateData();
m_edit1="";
m_edit2="";
m_save="";
m_coffiecient="";
m_delta="";
m_deltaY="";
m_startLong=0;
m_startLat=0;
m_endLat=0;
m_endLong=0;

UpdateData(FALSE);
MessageBox("Processing Complete!","FINISH",MB_OK);

}// end of else branch: runs only when m_edit1/2/3 are not empty

void read_af_array(int shift_row_af, int shift_col_af)


{
int m=100;

mode =2; // set mode to the search (after-image) array

int cx1 = shift_col_af+m; // column index at which reading stops
int cy1 = shift_row_af+m; // row index at which reading stops
read1(p_alg_af,nr_rows_af,nr_columns_af,cx1,cy1,shift_col_af, shift_row_af,mode);
}

void CTaskingView::OnChangeEdit3()
{
// TODO: If this is a RICHEDIT control, the control will not
// send this notification unless you override the CFormView::OnInitDialog()
// function and call CRichEditCtrl().SetEventMask()
// with the ENM_CHANGE flag ORed into the mask.

// TODO: Add your control notification handler code here


UpdateData();
}

void CTaskingView::OnChangeEdit2()
{
// TODO: If this is a RICHEDIT control, the control will not
// send this notification unless you override the CFormView::OnInitDialog()
// function and call CRichEditCtrl().SetEventMask()
// with the ENM_CHANGE flag ORed into the mask.

// TODO: Add your control notification handler code here


UpdateData();
}

void CTaskingView::OnChangeEdit4()
{
// TODO: If this is a RICHEDIT control, the control will not
// send this notification unless you override the CFormView::OnInitDialog()
// function and call CRichEditCtrl().SetEventMask()
// with the ENM_CHANGE flag ORed into the mask.

// TODO: Add your control notification handler code here


UpdateData();
}

void CTaskingView::OnChangeEdit5()
{
// TODO: If this is a RICHEDIT control, the control will not
// send this notification unless you override the CFormView::OnInitDialog()
// function and call CRichEditCtrl().SetEventMask()
// with the ENM_CHANGE flag ORed into the mask.

// TODO: Add your control notification handler code here


UpdateData();
}

void CTaskingView::OnChangeEdit6()
{
// TODO: If this is a RICHEDIT control, the control will not
// send this notification unless you override the CFormView::OnInitDialog()
// function and call CRichEditCtrl().SetEventMask()
// with the ENM_CHANGE flag ORed into the mask.

// TODO: Add your control notification handler code here


UpdateData();
}

void CTaskingView::OnChangeEdit1()
{
// TODO: If this is a RICHEDIT control, the control will not
// send this notification unless you override the CFormView::OnInitDialog()
// function and call CRichEditCtrl().SetEventMask()
// with the ENM_CHANGE flag ORed into the mask.

// TODO: Add your control notification handler code here


UpdateData();
}

void CTaskingView::OnDeltaSave()
{

// TODO: Add your control notification handler code here


// TODO: Add your control notification handler code here
UpdateData();
CFileDialog dlg(FALSE, "ers","*.ers");
if (dlg.DoModal() ==IDOK){

//dirsave=dlg.GetFileExt();

m_delta=dlg.GetPathName();
int x=m_delta.GetLength();
m_delta1=m_delta.Left(x-4);

CFile file;
}
UpdateData(FALSE);
}

void CTaskingView::OnSave3()
{
// TODO: Add your control notification handler code here
// TODO: Add your control notification handler code here
UpdateData();
CFileDialog dlg(FALSE, "ers","*.ers");
if (dlg.DoModal() ==IDOK){

//dirsave=dlg.GetFileExt();

m_deltaY=dlg.GetPathName();
int x=m_deltaY.GetLength();
m_deltaY1=m_deltaY.Left(x-4);

CFile file;
}
UpdateData(FALSE);
}

void CTaskingView::OnChangeEdit7()
{
// TODO: If this is a RICHEDIT control, the control will not
// send this notification unless you override the CFormView::OnInitDialog()
// function and call CRichEditCtrl().SetEventMask()
// with the ENM_CHANGE flag ORed into the mask.

// TODO: Add your control notification handler code here

UpdateData();
}
void CTaskingView::OnChangeEdit8()
{
// TODO: If this is a RICHEDIT control, the control will not
// send this notification unless you override the CFormView::OnInitDialog()
// function and call CRichEditCtrl().SetEventMask()
// with the ENM_CHANGE flag ORed into the mask.

// TODO: Add your control notification handler code here
UpdateData();
}
void CTaskingView::OnChangeEdit9()
{
// TODO: If this is a RICHEDIT control, the control will not
// send this notification unless you override the CFormView::OnInitDialog()
// function and call CRichEditCtrl().SetEventMask()
// with the ENM_CHANGE flag ORed into the mask.

// TODO: Add your control notification handler code here
UpdateData();
}
void CTaskingView::OnChangeEdit10()
{
// TODO: If this is a RICHEDIT control, the control will not
// send this notification unless you override the CFormView::OnInitDialog()
// function and call CRichEditCtrl().SetEventMask()
// with the ENM_CHANGE flag ORed into the mask.

// TODO: Add your control notification handler code here
UpdateData();
}
void CTaskingView::OnChangeEdit12()
{
// TODO: If this is a RICHEDIT control, the control will not
// send this notification unless you override the CFormView::OnInitDialog()
// function and call CRichEditCtrl().SetEventMask()
// with the ENM_CHANGE flag ORed into the mask.

// TODO: Add your control notification handler code here
UpdateData();
}
void CTaskingView::OnChangeEdit11()
{
// TODO: If this is a RICHEDIT control, the control will not
// send this notification unless you override the CFormView::OnInitDialog()
// function and call CRichEditCtrl().SetEventMask()
// with the ENM_CHANGE flag ORed into the mask.

// TODO: Add your control notification handler code here
UpdateData();
}
void CTaskingView::OnCoff()
{
// TODO: Add your control notification handler code here
UpdateData();
CFileDialog dlg(FALSE, "ers","*.ers");
if (dlg.DoModal() ==IDOK){

//dirsave=dlg.GetFileExt();

m_coffiecient=dlg.GetPathName();
int x=m_coffiecient.GetLength();
m_coff1=m_coffiecient.Left(x-4);

CFile file;
}
UpdateData(FALSE);
}

void CTaskingView::OnChangeShift()
{
// TODO: If this is a RICHEDIT control, the control will not
// send this notification unless you override the CFormView::OnInitDialog()
// function and call CRichEditCtrl().SetEventMask()
// with the ENM_CHANGE flag ORed into the mask.

// TODO: Add your control notification handler code here


UpdateData(TRUE);
}
