HD-06: Recent Advances in Registration, Integration and Fusion of Remotely Sensed Data

Wojciech Czaja, Jacqueline Le Moigne

Abstract:

In recent years, sophisticated mathematical techniques have been successfully applied to the field of remote sensing to produce significant advances in applications such as registration, integration and fusion of remotely sensed data. Registration, integration and fusion of multiple-source imagery are among the most important issues in Earth Science remote sensing, where information from multiple sensors, exhibiting various resolutions, must be combined. Differences in sensor geometry, spectral response, illumination conditions, season, and noise levels must all be addressed when designing an image registration, integration or fusion method. This tutorial will first define the problems and challenges associated with these applications and then review some mathematical techniques that have been successfully utilized to solve them. In particular, we will cover topics on geometric multiscale representations, redundant representations and fusion frames, graph operators, diffusion wavelets, as well as spatial-spectral and operator-based data fusion. All the algorithms will be illustrated using remotely sensed data, with an emphasis on current and operational instruments.

Outline:
1. Introduction to registration, integration and fusion of remotely sensed data
a. Overview of Issues and Challenges
Compared to other domains, remote sensing presents specific challenges, such as the variety in acquisition sources and conditions, the size of the data and the lack of known image models and fiducial points (or ground truth). All these challenges will be reviewed using appropriate remote sensing data.

b. Brief survey of registration, integration and fusion methods
In this section, we will briefly survey the most common approaches for registration, integration and fusion, focusing on the most recent advances in those fields.

2. Selected Topics in mathematical techniques recently applied to registration, integration and fusion in the remote sensing domain
a. Geometric multiscale representations
Since the introduction of wavelets, significant attention has turned toward finding new ways to further impact image processing by means of multiscale-like representations. Many generalizations, emphasizing the role of directionality and the need to detect edges, have been proposed. These include, among others, wavelet packets, curvelets, contourlets, wedgelets, and shearlets.

In this module we shall give an overview of these and related techniques, and will present some of their applications in remote sensing with a particular emphasis on dimensionality reduction.
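As a minimal baseline for the multiscale idea, a single level of the 2D Haar wavelet transform already separates an image into one coarse approximation and three directional detail subbands; the numpy sketch below is only an illustration of this principle, against which the directional representations above improve:

```python
import numpy as np

def haar2d(img):
    """One level of the 2D Haar wavelet transform.

    Returns a coarse approximation and three detail subbands:
    column differences respond to vertical edges, row differences
    to horizontal edges, and the doubly-differenced band to
    diagonal structure."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0     # averages along rows
    d = (img[0::2, :] - img[1::2, :]) / 2.0     # differences along rows
    approx   = (a[:, 0::2] + a[:, 1::2]) / 2.0  # coarse approximation
    detail_v = (a[:, 0::2] - a[:, 1::2]) / 2.0  # vertical edges
    detail_h = (d[:, 0::2] + d[:, 1::2]) / 2.0  # horizontal edges
    detail_d = (d[:, 0::2] - d[:, 1::2]) / 2.0  # diagonal structure
    return approx, detail_v, detail_h, detail_d

# A toy "image" with one vertical edge: only the vertical-detail
# subband responds; the horizontal and diagonal subbands stay zero.
img = np.zeros((8, 8))
img[:, 3:] = 1.0
approx, detail_v, detail_h, detail_d = haar2d(img)
```

Repeating the transform on the approximation subband yields the familiar multiscale pyramid.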

b. Redundant representations and fusion frames
Frame-theoretic techniques are relatively new in remote sensing data processing. However, they are gaining ground thanks to a direct connection with dictionary learning. One of the main advantages of using overcomplete representation systems is the ease of constructing frames with many built-in features. Fusion frames are an example of such a construction, used to integrate information arising from distributed sensor networks.

We shall introduce the fundamentals of frame theory and show examples of how frames can be constructed in practical applications, both from first principles and by means of learning.
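As a small concrete example (not one of the learned constructions above), the classical Mercedes-Benz frame in R^2 shows the basic machinery: three redundant coefficients still allow exact, stable reconstruction, because the frame is tight with frame bound A = 3/2:

```python
import numpy as np

# The "Mercedes-Benz" frame: three unit vectors in R^2, an
# overcomplete but tight frame with frame bound A = 3/2.
angles = np.pi / 2 + np.array([0.0, 2 * np.pi / 3, 4 * np.pi / 3])
F = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # 3x2 analysis matrix

x = np.array([1.7, -0.4])
coeffs = F @ x                       # three redundant coefficients for a 2-d vector
x_rec = (2.0 / 3.0) * F.T @ coeffs   # tight-frame reconstruction: x = (1/A) F* F x

# For a tight frame the frame operator S = F^T F equals A * I,
# which is exactly what makes reconstruction this simple.
S = F.T @ F
```

Redundancy is the point: losing one coefficient still leaves a (non-tight) frame, so reconstruction degrades gracefully, which is the property fusion frames exploit for distributed sensing.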

c. Wavelets and redundant representations for image registration
Utilizing wavelets and redundant representations for image registration not only identifies the type of features to be matched, but also provides the matching process with a multiresolution search framework. This improves computational speed as well as final registration accuracy.

We will consider orthogonal wavelets, spline wavelets and a steerable decomposition combined with different similarity metrics and search strategies and will present a comparative analysis of the results using a variety of remotely sensed data, from synthetic datasets to Landsat, MODIS, and EO-1 datasets.
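The multiresolution search idea can be sketched independently of the particular decomposition. The toy example below uses a plain averaging pyramid (not the orthogonal, spline, or steerable wavelets discussed in the module) and sum-of-squared-differences as the similarity metric, recovering an integer translation coarse-to-fine:

```python
import numpy as np

def downsample(img):
    """2x2 block averaging: one pyramid level coarser."""
    return (img[0::2, 0::2] + img[1::2, 0::2] +
            img[0::2, 1::2] + img[1::2, 1::2]) / 4.0

def best_shift(ref, tgt, center, radius):
    """Exhaustive search for the integer translation (dy, dx) within
    `radius` of `center` minimizing the SSD between ref and the
    (circularly) shifted target."""
    best, best_err = center, np.inf
    for dy in range(center[0] - radius, center[0] + radius + 1):
        for dx in range(center[1] - radius, center[1] + radius + 1):
            err = np.sum((ref - np.roll(tgt, (dy, dx), axis=(0, 1))) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

def register(ref, tgt, levels=3):
    """Coarse-to-fine search: estimate the shift at the coarsest
    level, then double the estimate and refine at each finer level,
    so only a small window is searched per level."""
    pyr = [(ref, tgt)]
    for _ in range(levels - 1):
        r, t = pyr[-1]
        pyr.append((downsample(r), downsample(t)))
    shift = (0, 0)
    for r, t in reversed(pyr):
        shift = best_shift(r, t, (2 * shift[0], 2 * shift[1]), 1)
    return shift

# Synthetic test: a circularly shifted copy of a random image.
rng = np.random.default_rng(0)
ref = rng.random((32, 32))
tgt = np.roll(ref, (-4, 4), axis=(0, 1))
```

In the wavelet-based variants, the pyramid levels are replaced by wavelet subbands, and SSD by correlation or mutual information.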

d. Operators on graphs
Sophisticated mathematical techniques for nonlinear dimension reduction have been successfully applied in recent years to analyze remote sensing imagery. Generally speaking, these methods represent high-dimensional data in the form of a graph, with nodes formed by the data points treated as vectors, and with edges that represent the distances between pairs of such vectors. This has led to the analysis of associated kernels, including Laplace and Schroedinger graph operators. They have been employed for dimensionality reduction, segmentation, and data organization and integration.

We shall give a concise introduction to spectral graph analysis, with an overview of the recent developments in the field of nonlinear dimension reduction and data organization.
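A minimal sketch of the graph construction described above, using a Gaussian kernel and the unnormalized Laplacian L = D - W (choices of kernel, neighborhood, and normalization vary across methods):

```python
import numpy as np

def laplacian_embedding(X, sigma=1.0, dim=1):
    """Laplacian eigenmaps sketch: data points become graph nodes,
    edge weights decay with pairwise distance, and the smallest
    nontrivial eigenvectors of L = D - W provide the
    low-dimensional coordinates."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    D = np.diag(W.sum(axis=1))
    L = D - W
    vals, vecs = np.linalg.eigh(L)   # ascending eigenvalues
    # vecs[:, 0] is the constant eigenvector (eigenvalue 0); skip it.
    return vecs[:, 1:1 + dim]

# Two well-separated clusters: the first nontrivial eigenvector
# separates them by sign (the classic spectral bipartition).
X = np.vstack([np.random.default_rng(1).normal(0.0, 0.1, (5, 2)),
               np.random.default_rng(2).normal(5.0, 0.1, (5, 2))])
y = laplacian_embedding(X)[:, 0]
```

Replacing L with a Schroedinger operator L + V adds a potential term V that can encode labels or barriers, which is one of the extensions covered in this module.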

e. Data organization by means of diffusion wavelets
Spectral graph theory has given rise to many intriguing new developments in machine learning. Among them, a prominent place belongs to diffusion-based techniques for data organization, segmentation and classification. These techniques were modeled on traditional multiscale representations.

We shall introduce the audience to Diffusion Maps and Diffusion Wavelets, and we will show how these techniques can be used for data self-organization and integration.
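A compact sketch of Diffusion Maps under the standard construction (Gaussian affinities, row-stochastic normalization); the kernel width and the diffusion time t are free parameters chosen here for illustration:

```python
import numpy as np

def diffusion_map(X, sigma=1.0, t=2, dim=1):
    """Diffusion Maps sketch: normalize a Gaussian affinity into a
    row-stochastic transition matrix P = D^{-1} W, then embed each
    point with the leading nontrivial eigenvectors of P scaled by
    their eigenvalues to the power t (the diffusion time)."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    d = W.sum(axis=1)
    # Symmetric conjugate D^{-1/2} W D^{-1/2}, so eigh applies.
    A = W / np.sqrt(np.outer(d, d))
    vals, vecs = np.linalg.eigh(A)
    vals, vecs = vals[::-1], vecs[:, ::-1]   # descending order
    psi = vecs / np.sqrt(d)[:, None]         # right eigenvectors of P
    return (vals[1:1 + dim] ** t) * psi[:, 1:1 + dim]

# Two well-separated clusters: the first diffusion coordinate
# separates them by sign, as in the Laplacian case.
X = np.vstack([np.random.default_rng(1).normal(0.0, 0.1, (5, 2)),
               np.random.default_rng(2).normal(5.0, 0.1, (5, 2))])
phi = diffusion_map(X)[:, 0]
```

Diffusion Wavelets go further and construct a full multiscale hierarchy from dyadic powers of P, rather than a single eigenbasis.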

f. Fusion using cokriging
Kriging and cokriging are interpolation methods traditionally used in the fields of mining and geostatistics. They represent generalized forms of univariate and multivariate linear regression models over an area; similarly to other interpolation methods, they minimize the variance of the estimation error, with estimates chosen as weighted linear combinations of known values, under the constraint that the mean error be zero. The weights depend not only on the distance, but also on the direction and orientation of the known data relative to the location being interpolated. Unlike many other image fusion methods, cokriging does not require prior resampling of the various data sets, and is applicable to the vision of future sensor networks, where many small sensors are located at scattered locations.

We will explore using cokriging for pan-sharpening, i.e., improving the spatial resolution of multi-spectral imagery using high-resolution panchromatic data, as well as for performing fusion in the spectral dimension. These methods will be illustrated using Landsat and EO-1 data.
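Cokriging extends the kriging linear system with cross-covariances between variables, so the univariate (ordinary kriging) case already shows the structure: weights solve a linear system built from a covariance model, with a Lagrange multiplier enforcing the zero-mean-error constraint. The 1D exponential covariance below is an assumed model, for illustration only:

```python
import numpy as np

def ordinary_kriging(xs, zs, x0, cov):
    """Ordinary kriging sketch: estimate z(x0) as a weighted sum of
    known values z(xs), with weights solving the kriging system
    built from the covariance model `cov`; the last row/column is
    the Lagrange multiplier forcing the weights to sum to one."""
    n = len(xs)
    K = np.ones((n + 1, n + 1))
    K[n, n] = 0.0
    for i in range(n):
        for j in range(n):
            K[i, j] = cov(abs(xs[i] - xs[j]))
    rhs = np.ones(n + 1)
    rhs[:n] = [cov(abs(x - x0)) for x in xs]
    w = np.linalg.solve(K, rhs)[:n]
    return float(w @ zs), w

# Assumed exponential covariance model with range parameter 2.
cov = lambda h: np.exp(-h / 2.0)
xs = np.array([0.0, 1.0, 3.0])   # sample locations
zs = np.array([1.0, 2.0, 4.0])   # sampled values
z_hat, w = ordinary_kriging(xs, zs, 2.0, cov)
```

Note that kriging is an exact interpolator: querying a sampled location returns the sampled value. In cokriging, K and the right-hand side grow blockwise to hold the covariances of, and between, the primary and secondary variables, which is what lets a panchromatic band inform multispectral estimates without resampling.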

g. Spatial-spectral fusion
Current strategies to integrate spatial and spectral information include the use of wavelet packets, modified distance metrics for graph construction, combining spectral classification with image segmentation results, and adding spatial information in spectral band grouping. Other approaches include iterative information spreading on data-dependent graphs combined with SVMs to label unknown classes, using morphological profiles to add spatial information, creating spatially motivated endmember sets, multilevel segmentation, and combining segmentation with pixel classification maps.

We shall give an overview of some of the recent methods for integration of spectral and spatial information, with an emphasis on techniques arising in harmonic analysis and machine learning.
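One of the simplest strategies above, a modified distance metric for graph construction, can be sketched as the product of a spectral and a spatial Gaussian kernel (the bandwidths below are illustrative assumptions):

```python
import numpy as np

def spatial_spectral_affinity(cube, sigma_s=1.0, sigma_f=1.0):
    """Sketch of a fused affinity for graph construction: edge
    weights are the product of a spectral Gaussian kernel and a
    spatial one, so two pixels are strongly connected only if they
    are close both in the spectral domain and in the image plane."""
    h, w, b = cube.shape
    spectra = cube.reshape(-1, b)
    yy, xx = np.mgrid[0:h, 0:w]
    coords = np.stack([yy.ravel(), xx.ravel()], axis=1).astype(float)
    df = np.sum((spectra[:, None] - spectra[None, :]) ** 2, axis=-1)
    ds = np.sum((coords[:, None] - coords[None, :]) ** 2, axis=-1)
    return np.exp(-df / (2 * sigma_f ** 2)) * np.exp(-ds / (2 * sigma_s ** 2))

# Toy 2x2 "cube" with 3 bands: identical spectra everywhere, so the
# affinity is governed purely by spatial proximity.
cube = np.ones((2, 2, 3))
W = spatial_spectral_affinity(cube)
```

Any of the graph-based embeddings from the earlier modules can then be run on W unchanged, which is what makes this fusion strategy attractive.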

h. Operator-based data fusion
Data fusion is the process of assembling multiple heterogeneous or homogeneous data sources into a coherent product that improves knowledge relative to any single sensor. In the field of remote sensing, a given platform can carry a combination of hyperspectral, synthetic aperture radar, LIDAR, panchromatic, multi-angle, and thermal sensors, each with varying capture rates, capture times, and pixel dimensions. It is within the dimension reduction process that we see opportunities, at the graph level, the operator level, and the feature space level, to fuse multiple data sources.

In this module we shall show how data fusion can be interpreted as a representation problem, through the use of graph-based, data-dependent representations. Applications will include HSI-LIDAR and HSI-SAR integration.
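A minimal sketch of operator-level fusion: build one Gaussian affinity per modality on co-registered pixels, blend them with a convex combination, and let the spectrum of the single fused Laplacian drive joint dimensionality reduction (the weight alpha and the toy data are illustrative assumptions):

```python
import numpy as np

def fused_laplacian(features_a, features_b, alpha=0.5, sigma=1.0):
    """Operator-level fusion sketch: one Gaussian affinity per
    modality over the same pixels, blended by a convex combination;
    returns the graph Laplacian L = D - W of the fused graph."""
    def affinity(X):
        d2 = np.sum((X[:, None] - X[None, :]) ** 2, axis=-1)
        return np.exp(-d2 / (2 * sigma ** 2))
    W = alpha * affinity(features_a) + (1 - alpha) * affinity(features_b)
    return np.diag(W.sum(axis=1)) - W

# Hypothetical co-registered pixels: 4-band "HSI" spectra and
# 1-band "LIDAR" heights for the same five locations.
hsi = np.random.default_rng(0).random((5, 4))
lidar = np.random.default_rng(1).random((5, 1))
L = fused_laplacian(hsi, lidar)
```

The fused L inherits the properties of a graph Laplacian (symmetric, zero row sums, positive semidefinite), so all the spectral machinery from the single-modality case applies directly.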

Biography:

Wojciech Czaja received an M.Sc. in Mathematics from the University of Wroclaw, Poland, and a Ph.D. in Mathematics from Washington University in St. Louis. He is currently a Professor of Mathematics at the University of Maryland, College Park, a member of the Norbert Wiener Center, and a Marie Curie Fellow. Dr. Czaja's publications span theoretical and applied mathematics, remote sensing, biomedical imaging, and computational biology. He is a co-author of the books "Integration and Modern Analysis" (Birkhauser, 2009) and "Excursions in Harmonic Analysis" (Birkhauser, 2013). His current research interests include mathematical techniques for analysis and fusion of remote sensing data, as well as applications of computational mathematics to systems biology.

Jacqueline Le Moigne received a B.S. degree in Mathematics, an M.S. degree in Mathematics, and a Ph.D. in Computer Science (specialty: Computer Vision) from the University Pierre and Marie Curie, Paris, France. She is currently the Assistant Chief for Technology in the Software Engineering Division at NASA Goddard Space Flight Center. Her research activities include registration of multi-sensor/multi-scale satellite image data, for which she has been studying wavelets and their implementation for high performance and on-board computing. Dr. Le Moigne has published over 110 journal, conference, and book chapter articles, including 20 journal papers. She is a co-author of an edited book on “Image Registration for Remote Sensing”. She has been an Associate Editor for the IEEE Transactions on Geoscience and Remote Sensing, is a NASA Goddard Senior Fellow, an IEEE Senior Member and a Program Evaluator for the Accreditation Board in Engineering and Technology (ABET). In 2012, she received the NASA Exceptional Service Medal and the Goddard Information Science and Technology Award.