Graph Regularized Nonnegative Matrix Factorization for Hyperspectral Data Unmixing

Reading time: 5 minutes

📝 Original Info

  • Title: Graph Regularized Nonnegative Matrix Factorization for Hyperspectral Data Unmixing
  • ArXiv ID: 1111.0885
  • Date: 2011-11-04
  • Authors: Author information is not included in the source text, so the authors could not be confirmed.

📝 Abstract

Spectral unmixing is an important tool in hyperspectral data analysis for estimating endmembers and abundance fractions in a mixed pixel. This paper examines the applicability of a recently developed algorithm called graph regularized nonnegative matrix factorization (GNMF) for this aim. The proposed approach exploits the intrinsic geometrical structure of the data besides considering positivity and full additivity constraints. Simulated data based on the measured spectral signatures is used for evaluating the proposed algorithm. Results in terms of abundance angle distance (AAD) and spectral angle distance (SAD) show that this method can effectively unmix hyperspectral data.

💡 Deep Analysis

[Figure 1: a mixed pixel containing more than one distinct material]

📄 Full Content

In hyperspectral imagery, mixed pixels contain more than one distinct material, as illustrated in Fig. 1. There are two reasons for the existence of mixed pixels. The first is the low spatial resolution of hyperspectral sensors. The second is that different materials can combine into a homogeneous mixture, which is independent of the spatial resolution of the sensors.

Spectral mixture analysis (or spectral unmixing) refers to the decomposition of mixed pixels into endmembers and abundance fractions. Endmembers are the extracted spectra of distinct materials, and abundance fractions are defined as the proportions of the extracted endmembers in mixed pixels [1].

Mixing can be modeled in two different ways: linear and nonlinear. In the linear mixing model (LMM), the measured spectrum is a linear combination of the endmember spectra plus observation noise. The nonlinear mixing model arises from intimate mixtures of materials. Most unmixing methods are based on the LMM [2].

Many methods based on the LMM have been proposed in the literature. These methods can be categorized as geometrical or statistical algorithms [3]. Examples of geometrical methods are the pixel purity index (PPI) [4], N-FINDR [5], vertex component analysis (VCA) [6], convex cone analysis (CCA) [7], and nonnegative matrix factorization (NMF) [8]. Statistical methods, such as Bayesian analysis of spectral mixture data using Markov chain Monte Carlo methods [9], are based on a Bayesian framework.

There are also semi-supervised methods for spectral unmixing. These methods search for the best collection of signatures within a relatively large known spectral library to optimally model each mixed pixel [3]. Sparse regression methods have been used for this purpose in the literature [10].

In this paper, graph regularized nonnegative matrix factorization (GNMF) is examined for hyperspectral unmixing. The proposed method is applied to simulated data generated using the USGS spectral library [14]. The results show that this method performs better than NMF.
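The evaluation relies on the spectral angle distance (SAD) and abundance angle distance (AAD) named in the abstract. Below is a minimal sketch of the angle computation that both metrics are commonly built on; the function name and the example vectors are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def angle_distance(a, b, eps=1e-12):
    """Angle (in radians) between two nonnegative vectors.

    Serves as SAD when a, b are spectral signatures (length L) and as
    AAD when they are abundance vectors (length P).
    """
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + eps)
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

# Example: SAD between an estimated and a reference endmember signature
s_est = np.array([0.10, 0.35, 0.52, 0.48])
s_ref = np.array([0.12, 0.33, 0.55, 0.45])
print(angle_distance(s_est, s_ref))
```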

Section 2 presents a brief description of the linear mixing model. In Section 3, NMF and GNMF are discussed. A description of the database used for evaluating the proposed algorithm is provided in Section 4, together with the simulation results. Section 5 summarizes and concludes the paper.

The linear spectral mixture model is a widely used model for spectral unmixing of hyperspectral data, on which many methods are based. Assume that L is the number of spectral bands. The measured spectrum X can then be expressed by equation (1).

X = S A + W    (1)

In this equation, S is the spectral signature matrix of the endmembers, A is the abundance fraction matrix, and W is an additive noise matrix. Each column of X (x_n) is a linear combination of the spectral signatures in S, as formulated in (2):

x_n = S a_n + w_n    (2)

In (2), P is the number of endmembers, N is the number of pixels, and a_n is the vector of abundance fractions for the nth pixel. Two physical constraints on the abundance fraction values should be considered: the abundance non-negativity constraint (ANC) and the abundance sum-to-one constraint (ASC) [2].
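As a concrete illustration of the linear mixing model in (1)-(2), the following sketch generates synthetic mixed pixels from a signature matrix S under the ANC and ASC constraints. The signatures, noise level, and dimensions are made-up example values, not the USGS data used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

L, P, N = 50, 3, 1000                      # bands, endmembers, pixels
S = rng.uniform(0.0, 1.0, (L, P))          # illustrative endmember signatures (columns)

# Abundances satisfying ANC (a >= 0) and ASC (each column sums to one),
# drawn from a flat Dirichlet distribution.
A = rng.dirichlet(np.ones(P), size=N).T    # shape (P, N)

W = 0.01 * rng.standard_normal((L, N))     # additive observation noise
X = S @ A + W                              # equation (1): X = SA + W

x_0 = S @ A[:, 0] + W[:, 0]                # equation (2) for a single pixel
```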

For a given nonnegative matrix X, NMF finds nonnegative matrix factors U and V whose product approximates X (X ≈ U Vᵀ in the notation of [12]) (3).

For quantifying the quality of the approximation, cost functions based on the Euclidean distance or the Kullback-Leibler divergence can be used. With the Euclidean distance, the NMF cost function of (4) is the squared reconstruction error ||X − U Vᵀ||². Minimizing this cost function with respect to U and V, subject to U, V ≥ 0, leads to the NMF method [11].
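A minimal sketch of the classic multiplicative updates [11] that minimize this Euclidean cost for X ≈ U Vᵀ is given below; the initialization, iteration count, and the small eps added for numerical safety are choices made for the example, not prescribed by the paper.

```python
import numpy as np

def nmf_euclidean(X, K, n_iter=200, eps=1e-9, seed=0):
    """Multiplicative-update NMF minimizing ||X - U V^T||^2 with U, V >= 0."""
    rng = np.random.default_rng(seed)
    M, N = X.shape
    U = rng.uniform(0.1, 1.0, (M, K))
    V = rng.uniform(0.1, 1.0, (N, K))
    for _ in range(n_iter):
        U *= (X @ V) / (U @ (V.T @ V) + eps)      # update basis (endmember) matrix
        V *= (X.T @ U) / (V @ (U.T @ U) + eps)    # update coefficient (abundance) matrix
    return U, V
```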

GNMF is based on the local invariance assumption: if two points are close in the intrinsic geometry of the data, their representations in the new basis should also be close to each other. GNMF uses a geometrically based regularizer to preserve this Riemannian structure.

To model the geometric structure of the data, consider a nearest-neighbor graph over the scatter of data points. There is an edge between two data points (x_j, x_l) if they are neighbors.

Different neighborhood systems are applicable, such as 4-neighbourhood or 8-neighbourhood systems. The weight matrix W of the graph can be defined in many ways. The simplest is 0-1 weighting: if two pixels are connected by an edge, the weight is 1 (W_jl = 1). Other weighting methods include the heat kernel, dot-product weighting, etc.
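The following sketch shows one way such a graph and its weight matrix might be built, with both the 0-1 and heat-kernel weightings mentioned above. Using k nearest spectral neighbors (rather than a spatial 4- or 8-neighbourhood) and the particular k and kernel width are assumptions made for the example.

```python
import numpy as np

def knn_graph(X, k=5, weighting="binary", sigma=1.0):
    """Symmetric weight matrix W of a k-nearest-neighbor graph.

    X holds one data point (pixel spectrum) per column, shape (L, N).
    """
    N = X.shape[1]
    sq = np.sum(X**2, axis=0)
    D2 = sq[:, None] + sq[None, :] - 2.0 * (X.T @ X)   # pairwise squared distances

    W = np.zeros((N, N))
    for j in range(N):
        neighbors = np.argsort(D2[j])[1:k + 1]          # skip the point itself
        if weighting == "binary":
            W[j, neighbors] = 1.0                        # 0-1 weighting: W_jl = 1
        else:
            W[j, neighbors] = np.exp(-D2[j, neighbors] / (2.0 * sigma**2))  # heat kernel
    return np.maximum(W, W.T)                            # symmetrize
```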

Using the Euclidean distance, the cost function accounting for the geometrical properties is defined by equation (5), R = ½ Σ_{j,l} ||z_j − z_l||² W_jl = Tr(Vᵀ L V), where z_j is the low-dimensional representation of x_j (a point in the original basis), Tr(·) denotes the trace of a matrix, and L = D − W is the graph Laplacian with degree matrix D. Combining the cost function of (5) with the original NMF cost function of (4) and minimizing it with respect to U and V leads to the GNMF method. Using the Euclidean distance, the overall cost function of (7) is ||X − U Vᵀ||² + λ Tr(Vᵀ L V) [12]. Implementations of GNMF are available online from the authors of [12].
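As a rough illustration of the combined cost (7), here is a minimal GNMF sketch with the multiplicative updates described in [12]; this is an illustrative reimplementation under those assumptions, not the authors' released code, and the regularization weight and iteration count are arbitrary example values.

```python
import numpy as np

def gnmf(X, W_graph, K, lam=1.0, n_iter=200, eps=1e-9, seed=0):
    """Multiplicative updates for min ||X - U V^T||^2 + lam * Tr(V^T L V), U, V >= 0."""
    rng = np.random.default_rng(seed)
    M, N = X.shape
    D = np.diag(W_graph.sum(axis=1))      # degree matrix; graph Laplacian is L = D - W
    U = rng.uniform(0.1, 1.0, (M, K))
    V = rng.uniform(0.1, 1.0, (N, K))
    for _ in range(n_iter):
        U *= (X @ V) / (U @ (V.T @ V) + eps)
        V *= (X.T @ U + lam * (W_graph @ V)) / (V @ (U.T @ U) + lam * (D @ V) + eps)
    return U, V
```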

First, the number of endmembers should be determined. Dimension reduction meth…


Reference

This content is AI-processed based on open access ArXiv data.
