Change detection techniques aim to detect and locate areas which have changed between two or more observations of the same scene. These changes can be of different types, with different origins and of different temporal lengths. This allows us to distinguish different kinds of applications:
From the point of view of the observed phenomena, one can distinguish two types of changes whose natures are rather different: abrupt changes and progressive changes, which may possibly be periodic. From the data point of view, one can have:
From this classification of the different types of problems, one can infer four cases for which algorithms can be sought as a function of the available data:
In this section we discuss the damage assessment techniques which can be applied when only two images (before/after) are available.
As shown in recent review works [30, 89, 113, 115], a relatively high number of methods exist, but most of them have been developed for optical and infrared sensors. Only a few recent works on change detection with radar images exist [126, 16, 101, 68, 38, 11, 70]. However, the intrinsic limits of passive sensors, mainly related to their dependence on meteorological and illumination conditions, impose severe constraints for operational applications. The principal difficulties related to change detection are of four types:
The problem of detecting abrupt changes between a pair of images is the following: let I1 and I2 be two images acquired at different dates t1 and t2; we aim at producing a thematic map which shows the areas where changes have taken place.
Three main categories of methods exist:
The principle of this approach [34] is to obtain two land-use maps independently, one for each date, and to compare them.
This method consists in producing the change map directly from a joint classification of both images.
The last approach consists in producing an image of change likelihood (by differences, ratios or any other approach) and thresholding it in order to produce the change map.
Because of its simplicity and its low computation overhead, the third strategy is the one which has been chosen for the processing presented here.
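The third strategy can be sketched in a few lines. The following is a minimal, hypothetical illustration (the function name and the use of 1-D pixel buffers are assumptions, not the OTB API): a change-likelihood image is built as the absolute difference of the two inputs and then thresholded into a binary change map.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical sketch of the "likelihood image + threshold" strategy:
// pixels whose absolute difference exceeds the threshold are marked 1.
std::vector<int> ThresholdedDifference(const std::vector<double>& im1,
                                       const std::vector<double>& im2,
                                       double threshold)
{
  std::vector<int> changeMap(im1.size(), 0);
  for (std::size_t i = 0; i < im1.size(); ++i)
  {
    changeMap[i] = (std::fabs(im2[i] - im1[i]) > threshold) ? 1 : 0;
  }
  return changeMap;
}
```

The choice of the threshold is the delicate point of this strategy; the filters presented below only produce the likelihood image and leave the thresholding to the user.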
The source code for this example can be found in the file
Examples/ChangeDetection/ChangeDetectionFrameworkExample.cxx.
This example illustrates the Change Detector framework implemented in OTB. This framework uses the generic programming approach. All change detection filters are otb::BinaryFunctorNeighborhoodImageFilter s, that is, they are filters taking two images as input and providing one image as output. The change detection computation itself is performed on the neighborhood of each pixel of the input images.
The first step required to build a change detection filter is to include the header of the parent class.
The change detection operation itself is one of the templates of the change detection filters and takes the form of a function, that is, something accepting the syntax foo(). This can be implemented using classical C/C++ functions, but it is preferable to implement it using C++ functors. These are classical C++ classes which overload the () operator. This allows using them with the same syntax as C/C++ functions.
Since change detectors operate on neighborhoods, the functor call will take 2 arguments which are itk::ConstNeighborhoodIterator s.
The change detector functor is templated over the types of the input iterators and the output result type. The core of the change detection is implemented in the operator() section.
The interest of using functors is that complex operations can be performed using internal protected class methods and that class variables can be used to store information, so different pixel locations can access the results of previous computations.
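The functor idea can be illustrated with a minimal standalone example (a hypothetical class, not an OTB functor, and operating on plain doubles rather than neighborhood iterators): a C++ class overloads the () operator so that its instances can be called with function syntax, while member variables keep state between calls.

```cpp
// Hypothetical functor sketch: callable like a function, but with state.
class DifferenceFunctor
{
public:
  // Called with function syntax: functor(a, b).
  double operator()(double a, double b)
  {
    ++m_CallCount; // class variable storing information across calls
    return b - a;
  }

  int GetCallCount() const { return m_CallCount; }

private:
  int m_CallCount = 0;
};
```

An OTB change detection functor follows the same pattern, except that its operator() receives neighborhood iterators instead of scalar values.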
The next step is the definition of the change detector filter. As stated above, this filter will inherit from otb::BinaryFunctorNeighborhoodImageFilter which is templated over the 2 input image types, the output image type and the functor used to perform the change detection operation.
Inside the class only a few typedefs and the constructors and destructors have to be declared.
Pay attention to the fact that no .txx file is needed, since the filtering operation is implemented in the otb::BinaryFunctorNeighborhoodImageFilter class. So all the algorithmic details reside inside the functor.
We can now write a program using the change detector.
As usual, we start by defining the image types. The internal computations will be performed with floating point precision, while the output image will be stored using one byte per pixel.
We declare the readers, the writer, but also the itk::RescaleIntensityImageFilter which will be used to rescale the result before writing it to a file.
The next step is declaring the filter for the change detection.
We connect the pipeline.
And that is all.
The simplest change detector is based on the pixel-wise differencing of image values:
$$ I_D(i,j) = I_2(i,j) - I_1(i,j) \qquad (22.1) $$
In order to make the algorithm robust to noise, one actually uses local means instead of pixel values.
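The computation of the indicator for one pixel can be sketched as follows (a hypothetical helper working on a 1-D pixel buffer, not the otb::MeanDifferenceImageFilter implementation): the values inside a (2·radius+1)-wide window are averaged on each image and the difference of the two local means is returned.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical sketch of the mean-difference indicator: difference of
// the local means computed over a window centered on the current pixel.
double MeanDifference(const std::vector<double>& im1,
                      const std::vector<double>& im2,
                      std::size_t center, std::size_t radius)
{
  double sum1 = 0.0, sum2 = 0.0;
  std::size_t count = 0;
  for (std::size_t i = center - radius; i <= center + radius; ++i)
  {
    sum1 += im1[i];
    sum2 += im2[i];
    ++count;
  }
  return sum2 / count - sum1 / count; // difference of local means
}
```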
The source code for this example can be found in the file
Examples/ChangeDetection/DiffChDet.cxx.
This example illustrates the class otb::MeanDifferenceImageFilter for detecting changes between pairs of images. This filter computes the mean intensity in the neighborhood of each pixel of the pair of images to be compared and uses the difference of means as a change indicator. This example will use the images shown in figure 22.1. These correspond to the near infrared band of two Spot acquisitions before and during a flood.
We start by including the corresponding header file.
We start by declaring the types for the two input images, the change image and the image to be stored in a file for visualization.
We can now declare the types for the readers and the writer.
The change detector will give positive and negative values depending on the sign of the difference. We are usually interested only in the absolute value of the difference. For this purpose, we will use the itk::AbsImageFilter . Also, before saving the image to a file in, for instance, PNG format, we will rescale the results of the change detection in order to use all the output pixel type range of values.
The otb::MeanDifferenceImageFilter is templated over the types of the two input images and the type of the generated change image.
The different elements of the pipeline can now be instantiated.
We set the parameters of the different elements of the pipeline.
The only parameter for this change detector is the radius of the window used for computing the mean of the intensities.
We build the pipeline by plugging all the elements together.
Since the processing time of large images can be long, it is interesting to monitor the evolution of the computation. In order to do so, the change detectors can use the command/observer design pattern. This is easily done by attaching an observer to the filter.
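The command/observer idea can be illustrated outside of ITK with a minimal standalone sketch (hypothetical classes, not the ITK/OTB API): observers register callbacks with the processing object, which notifies them as the computation advances.

```cpp
#include <functional>
#include <vector>

// Hypothetical sketch of the command/observer pattern used for progress
// monitoring: registered callbacks are invoked on every progress event.
class ProgressReporter
{
public:
  void AddObserver(std::function<void(double)> observer)
  {
    m_Observers.push_back(observer);
  }

  // Called by the processing object at regular intervals.
  void NotifyProgress(double progress)
  {
    for (auto& obs : m_Observers)
      obs(progress);
  }

private:
  std::vector<std::function<void(double)>> m_Observers;
};
```

In ITK/OTB the same idea is implemented through itk::Command objects attached to the filter, which are invoked on progress events.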
Figure 22.2 shows the result of the change detection by difference of local means.
This detector is similar to the previous one except that it uses a ratio instead of the difference:
$$ I_R(i,j) = \frac{I_2(i,j)}{I_1(i,j)} \qquad (22.2) $$
The use of the ratio makes this detector robust to multiplicative noise which is a good model for the speckle phenomenon which is present in radar images.
In order to have a bounded and normalized detector the following expression is actually used:
$$ r(i,j) = 1 - \min\left(\frac{I_1(i,j)}{I_2(i,j)}, \frac{I_2(i,j)}{I_1(i,j)}\right) \qquad (22.3) $$
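The bounded indicator is a one-liner; the following hypothetical helper (not the OTB implementation) applies it to the local means of the two images and shows its two useful properties: it is symmetric in its arguments and lies in [0, 1], with 0 meaning no change.

```cpp
#include <algorithm>

// Hypothetical sketch of the bounded, normalized ratio indicator:
// 1 - min(muA/muB, muB/muA), symmetric and in [0, 1].
double NormalizedRatio(double muA, double muB)
{
  return 1.0 - std::min(muA / muB, muB / muA);
}
```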
The source code for this example can be found in the file
Examples/ChangeDetection/RatioChDet.cxx.
This example illustrates the class otb::MeanRatioImageFilter for detecting changes between pairs of images. This filter computes the mean intensity in the neighborhood of each pixel of the pair of images to be compared and uses the ratio of means as a change indicator. This change indicator is then normalized between 0 and 1 by using the classical
$$ r = 1 - \min\left(\frac{\mu_A}{\mu_B}, \frac{\mu_B}{\mu_A}\right) \qquad (22.4) $$
where μA and μB are the local means. This example will use the images shown in figure 22.3. These correspond to two Radarsat fine-mode acquisitions before and after a lava flow resulting from a volcanic eruption.
We start by including the corresponding header file.
We start by declaring the types for the two input images, the change image and the image to be stored in a file for visualization.
We can now declare the types for the readers. Since the images can be very large, we will force the pipeline to use streaming. For this purpose, the file writer will be streamed. This is achieved by using the otb::ImageFileWriter class.
The change detector will give a normalized result between 0 and 1. In order to store the result in PNG format we will rescale the results of the change detection in order to use all the output pixel type range of values.
The otb::MeanRatioImageFilter is templated over the types of the two input images and the type of the generated change image.
The different elements of the pipeline can now be instantiated.
We set the parameters of the different elements of the pipeline.
The only parameter for this change detector is the radius of the window used for computing the mean of the intensities.
We build the pipeline by plugging all the elements together.
Figure 22.4 shows the result of the change detection by ratio of local means.
This detector is similar to the ratio of means detector (seen in the previous section). Nevertheless, instead of the comparison of means, the comparison is performed on the complete distributions of the two Random Variables (RVs) [68].
The detector is based on the Kullback-Leibler distance between probability density functions (pdfs). In the neighborhood of each pixel of the pair of images I1 and I2 to be compared, the distance between local pdfs f1 and f2 of RVs X1 and X2 is evaluated by:
$$ \mathcal{D}(X_1, X_2) = K(X_1|X_2) + K(X_2|X_1) \qquad (22.5) $$

with

$$ K(X_j|X_i) = \int_{\mathbb{R}} \log\left(\frac{f_{X_i}(x)}{f_{X_j}(x)}\right) f_{X_i}(x)\, dx \qquad (22.6) $$

The local pdfs are approximated by an Edgeworth series expansion:

$$ f_X(x) \simeq \left(1 + \frac{\kappa_{X;3}}{6} H_3(x) + \frac{\kappa_{X;4}}{24} H_4(x) + \frac{\kappa_{X;5}}{120} H_5(x) + \frac{\kappa_{X;6} + 10\,\kappa_{X;3}^2}{720} H_6(x)\right) \mathcal{G}_X(x) \qquad (22.7) $$
In eq. (22.7), 𝒢_X stands for the Gaussian pdf which has the same mean and variance as the RV X. The κ_{X;k} coefficients are the cumulants of order k, and H_k(x) are the Chebyshev-Hermite polynomials of order k (see [70] for deeper explanations).
The source code for this example can be found in the file
Examples/ChangeDetection/KullbackLeiblerDistanceChDet.cxx.
This example illustrates the class otb::KullbackLeiblerDistanceImageFilter for detecting changes between pairs of images. This filter computes the Kullback-Leibler distance between probability density functions (pdfs). In fact, the Kullback-Leibler distance is itself approximated through a cumulant-based expansion, since the pdfs are approximated through an Edgeworth series. The Kullback-Leibler distance is evaluated by:
$$ K_{\mathrm{Edgeworth}}(X_1|X_2) = \frac{1}{12}\frac{\kappa_{X_1;3}^2}{\kappa_{X_1;2}^2} + \frac{1}{2}\left(\log\frac{\kappa_{X_2;2}}{\kappa_{X_1;2}} - 1 + \frac{1}{\kappa_{X_2;2}}\left(\kappa_{X_1;1} - \kappa_{X_2;1} + \kappa_{X_1;2}^{1/2}\right)^2\right) - \left(\kappa_{X_2;3}\,\frac{a_1}{6} + \kappa_{X_2;4}\,\frac{a_2}{24} + \kappa_{X_2;3}^2\,\frac{a_3}{72}\right) - \frac{1}{2}\,\frac{\kappa_{X_2;3}^2}{36}\left(c_6 - 6\,\frac{c_4}{\kappa_{X_1;2}} + 9\,\frac{c_2}{\kappa_{X_2;2}^2}\right) - 10\,\frac{\kappa_{X_1;3}\,\kappa_{X_2;3}\,\left(\kappa_{X_1;1} - \kappa_{X_2;1}\right)\left(\kappa_{X_1;2} - \kappa_{X_2;2}\right)}{\kappa_{X_2;2}^6} \qquad (22.8) $$
where

$$ \begin{aligned}
a_1 &= c_3 - 3\,\frac{\alpha}{\kappa_{X_2;2}} \\
a_2 &= c_4 - 6\,\frac{c_2}{\kappa_{X_2;2}} + \frac{3}{\kappa_{X_2;2}^2} \\
a_3 &= c_6 - 15\,\frac{c_4}{\kappa_{X_2;2}} + 45\,\frac{c_2}{\kappa_{X_2;2}^2} - \frac{15}{\kappa_{X_2;2}^3} \\
c_2 &= \alpha^2 + \beta^2 \\
c_3 &= \alpha^3 + 3\,\alpha\beta^2 \\
c_4 &= \alpha^4 + 6\,\alpha^2\beta^2 + 3\,\beta^4 \\
c_6 &= \alpha^6 + 15\,\alpha^4\beta^2 + 45\,\alpha^2\beta^4 + 15\,\beta^6 \\
\alpha &= \frac{\kappa_{X_1;1} - \kappa_{X_2;1}}{\kappa_{X_2;2}} \qquad
\beta = \frac{\kappa_{X_1;2}^{1/2}}{\kappa_{X_2;2}}
\end{aligned} $$
The program itself is very similar to the ratio of means detector, implemented in otb::MeanRatioImageFilter , in section 22.3.2. Nevertheless the corresponding header file has to be used instead.
The otb::KullbackLeiblerDistanceImageFilter is templated over the types of the two input images and the type of the generated change image, in a similar way as the otb::MeanRatioImageFilter . It is the only line to be changed from the ratio of means change detection example to perform a change detection through a distance between distributions...
The different elements of the pipeline can now be instantiated. Follow the ratio of means change detector example.
The only parameter for this change detector is the radius of the window used for computing the cumulants.
The pipeline is built by plugging all the elements together.
Figure 22.5 shows the result of the change detection by computing the Kullback-Leibler distance between local pdf through an Edgeworth approximation.
The correlation coefficient measures the likelihood of a linear relationship between two random variables:
$$ \rho = \frac{1}{N}\sum_{i,j}\frac{\left(I_1(i,j) - m_{I_1}\right)\left(I_2(i,j) - m_{I_2}\right)}{\sigma_{I_1}\,\sigma_{I_2}} \qquad (22.9) $$
where I1(i,j) and I2(i,j) are the pixel values of the two images and pij is the joint probability density. This is like using a linear model:
$$ I_2(i,j) = \frac{\sigma_{I_2}}{\sigma_{I_1}}\left(I_1(i,j) - m_{I_1}\right) + m_{I_2} \qquad (22.10) $$
for which we evaluate the likelihood with pij.
With respect to the difference detector, this one will be robust to illumination changes.
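The local computation can be sketched with a hypothetical helper (not the otb::CorrelationChangeDetector implementation) that evaluates the empirical correlation coefficient of two neighborhood samples; a value near 1 indicates a strong linear relationship (no change), a value near 0 indicates change.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical sketch of the correlation indicator of eq. (22.9):
// empirical correlation coefficient of two neighborhood samples.
double Correlation(const std::vector<double>& a, const std::vector<double>& b)
{
  const double n = static_cast<double>(a.size());
  double ma = 0.0, mb = 0.0;
  for (std::size_t i = 0; i < a.size(); ++i) { ma += a[i]; mb += b[i]; }
  ma /= n; mb /= n;

  double cov = 0.0, va = 0.0, vb = 0.0;
  for (std::size_t i = 0; i < a.size(); ++i)
  {
    cov += (a[i] - ma) * (b[i] - mb);
    va  += (a[i] - ma) * (a[i] - ma);
    vb  += (b[i] - mb) * (b[i] - mb);
  }
  return cov / std::sqrt(va * vb);
}
```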
The source code for this example can be found in the file
Examples/ChangeDetection/CorrelChDet.cxx.
This example illustrates the class otb::CorrelationChangeDetector for detecting changes between pairs of images. This filter computes the correlation coefficient in the neighborhood of each pixel of the pair of images to be compared. This example will use the images shown in figure 22.6. These correspond to two ERS acquisitions before and during a flood.
We start by including the corresponding header file.
We start by declaring the types for the two input images, the change image and the image to be stored in a file for visualization.
We can now declare the types for the readers. Since the images can be very large, we will force the pipeline to use streaming. For this purpose, the file writer will be streamed. This is achieved by using the otb::ImageFileWriter class.
The change detector will give a response which is normalized between 0 and 1. Before saving the image to a file in, for instance, PNG format, we will rescale the results of the change detection in order to use all the output pixel type range of values.
The otb::CorrelationChangeDetector is templated over the types of the two input images and the type of the generated change image.
The different elements of the pipeline can now be instantiated.
We set the parameters of the different elements of the pipeline.
The only parameter for this change detector is the radius of the window used for computing the correlation coefficient.
We build the pipeline by plugging all the elements together.
Since the processing time of large images can be long, it is interesting to monitor the evolution of the computation. In order to do so, the change detectors can use the command/observer design pattern. This is easily done by attaching an observer to the filter.
Figure 22.7 shows the result of the change detection by local correlation.
This technique is an extension of the distance between distributions change detector presented in section 22.4.1. Since this kind of detector is based on cumulant estimation through a sliding window, the idea is simply to update the estimation of the cumulants by considering new samples as the sliding window increases in size.
Let’s consider the following problem: how to update the moments when an (N+1)-th observation x_{N+1} is added to a set of observations {x_1, x_2, …, x_N} already considered. The evolution of the central moments may be characterized by:
$$ \begin{aligned}
\mu_{1,[N+1]} &= \frac{1}{N+1}\left(N\,\mu_{1,[N]} + x_{N+1}\right) \\
\mu_{r,[N+1]} &= \frac{1}{N+1}\left(N \sum_{\ell=0}^{r} \binom{r}{\ell} \left(\mu_{1,[N]} - \mu_{1,[N+1]}\right)^{\ell} \mu_{r-\ell,[N]} + \left(x_{N+1} - \mu_{1,[N+1]}\right)^{r}\right)
\end{aligned} \qquad (22.11) $$

where the cumulants are obtained from the central moments by:

$$ \kappa_{X;1} = \mu_{1}, \qquad \kappa_{X;2} = \mu_{2}, \qquad \kappa_{X;3} = \mu_{3}, \qquad \kappa_{X;4} = \mu_{4} - 3\,\mu_{2}^2 \qquad (22.12) $$
It yields a set of images that represent the change measure according to an increasing size of the analysis window.
The source code for this example can be found in the file
Examples/ChangeDetection/KullbackLeiblerProfileChDet.cxx.
This example illustrates the class otb::KullbackLeiblerProfileImageFilter for detecting changes between pairs of images, according to a range of window size. This example is very similar, in its principle, to all of the change detection examples, especially the distance between distributions one (section 22.4.1) which uses a fixed window size.
The main differences are:
Then, the program begins with the otb::VectorImage and the otb::KullbackLeiblerProfileImageFilter header files in addition to those already detailed in the otb::MeanRatioImageFilter example.
The otb::KullbackLeiblerProfileImageFilter is templated over the types of the two input images and the type of the generated change image (which is now multi-component), in a similar way as the otb::KullbackLeiblerDistanceImageFilter .
The different elements of the pipeline can now be instantiated in the same way as the ratio of means change detector example.
Two parameters are now required to give the minimum and the maximum size of the analysis window. The program will begin by performing change detection with the smallest window size and then apply the moments update of eq. (22.11) by incrementing the radius of the analysis window (i.e. adding a ring of width 1 pixel around the current neighborhood shape). The process is repeated until the largest window size is reached.
Figure 22.8 shows the result of the change detection by computing the Kullback-Leibler distance between local pdf through an Edgeworth approximation.
The source code for this example can be found in the file
Examples/ChangeDetection/MultivariateAlterationDetector.cxx.
This example illustrates the class otb::MultivariateAlterationChangeDetectorImageFilter , which implements the Multivariate Alteration Detector algorithm [99]. This algorithm allows performing change detection from a pair of multi-band images, including images with different numbers of bands or modalities. Its output is a multi-band image of change maps, each one being uncorrelated with the remaining ones. The number of bands of the output image is the minimum number of bands between the two input images.
The algorithm works as follows. It tries to find two linear combinations of bands (one for each input image) which maximize correlation, and subtracts these two linear combinations, leading to the first change map. Then, it looks for a second pair of linear combinations, orthogonal to the first ones, which again maximize correlation, and uses their difference as the second change map. This process is iterated until no more orthogonal linear combinations can be found.
This algorithm has numerous advantages, such as invariance to radiometric scaling and shifting and the absence of parameters, but it cannot be used on a pair of single-band images (in this case the output is simply the difference between the two images).
We start by including the corresponding header file.
We then define the types for the input images and for the change image.
We can now declare the types for the reader. Since the images can be very large, we will force the pipeline to use streaming. For this purpose, the file writer will be streamed. This is achieved by using the otb::ImageFileWriter class.
The different elements of the pipeline can now be instantiated.
We set the parameters of the different elements of the pipeline.
We build the pipeline by plugging all the elements together.
And then we can trigger the pipeline update, as usual.
Figure 22.9 shows the results of Multivariate Alteration Detector applied to a pair of SPOT5 images before and after a flooding event.