TY - GEN
T1 - The K-LLE algorithm for nonlinear dimensionality reduction of large-scale hyperspectral data
AU - Hong, Danfeng
AU - Yokoya, Naoto
AU - Zhu, Xiao Xiang
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2016/6/28
Y1 - 2016/6/28
N2 - This work addresses nonlinear dimensionality reduction of large-scale hyperspectral data by means of locally linear embedding (LLE). The LLE algorithm relies heavily on spectral decomposition, which incurs high computational and storage costs when computing the low-dimensional embedding, particularly for large-scale hyperspectral data. As a result, LLE is impractical for reducing the dimensionality of large-scale hyperspectral data on ordinary personal computers. In this paper, we present a novel method, named K-LLE, which introduces K-means clustering into LLE to address this issue. We first use the K cluster centers, rather than all data points, to represent the manifold structure of the data, and then treat the cluster centers as a bridge between the manifold structure and the full dataset, obtaining a low-dimensional representation for each data point without performing the costly spectral decomposition. Finally, classification is explored as a potential application to validate the proposed algorithm. Experimental results on two hyperspectral datasets demonstrate the effectiveness and superiority of the proposed algorithm.
AB - This work addresses nonlinear dimensionality reduction of large-scale hyperspectral data by means of locally linear embedding (LLE). The LLE algorithm relies heavily on spectral decomposition, which incurs high computational and storage costs when computing the low-dimensional embedding, particularly for large-scale hyperspectral data. As a result, LLE is impractical for reducing the dimensionality of large-scale hyperspectral data on ordinary personal computers. In this paper, we present a novel method, named K-LLE, which introduces K-means clustering into LLE to address this issue. We first use the K cluster centers, rather than all data points, to represent the manifold structure of the data, and then treat the cluster centers as a bridge between the manifold structure and the full dataset, obtaining a low-dimensional representation for each data point without performing the costly spectral decomposition. Finally, classification is explored as a potential application to validate the proposed algorithm. Experimental results on two hyperspectral datasets demonstrate the effectiveness and superiority of the proposed algorithm.
KW - Hyperspectral dimensionality reduction
KW - K-means clustering
KW - Large-scale
KW - Manifold learning
UR - http://www.scopus.com/inward/record.url?scp=85037545314&partnerID=8YFLogxK
U2 - 10.1109/WHISPERS.2016.8071754
DO - 10.1109/WHISPERS.2016.8071754
M3 - Conference contribution
AN - SCOPUS:85037545314
T3 - Workshop on Hyperspectral Image and Signal Processing, Evolution in Remote Sensing
BT - 2016 8th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS)
PB - IEEE Computer Society
T2 - 8th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing, WHISPERS 2016
Y2 - 21 August 2016 through 24 August 2016
ER -