A Similarity Subspace Embedding Algorithm
Journal of Shandong University (Engineering Science), 2018, Vol. 48, Issue (1): 8-14. DOI: 10.6040/j.issn.1672-3961.0.2017.401

### Cite this article

QIAN Wenguang, LI Huimin. A similarity subspace embedding algorithm[J]. Journal of Shandong University (Engineering Science), 2018, 48(1): 8-14. DOI: 10.6040/j.issn.1672-3961.0.2017.401.


A similarity subspace embedding algorithm
QIAN Wenguang, LI Huimin
School of Computer and Remote Sensing Information Technology, North China Institute of Aerospace Engineering, Langfang 065000, Hebei, China
Abstract: Through an analysis of the classical Linear Discriminant Analysis (LDA) and Maximum Margin Criterion (MMC) methods, a supervised dimensionality reduction method called Similarity Subspace Embedding (SSE) was proposed, based on in-depth learning of the within-class scatter. The within-class scatter matrix was studied in depth, and the divergence of each class's subspace was obtained by subspace learning. This approach extracted richer information from the between-class scatter matrices and thereby obtained a better low-dimensional space. Compared with the MMC method, the SSE method learned the class structure of the data more adequately, while avoiding the small-sample-size problem of the LDA method. Experimental results on the AR face images, the Coil dataset, and handwritten digits showed that the proposed method achieved a higher recognition rate than three other classic methods, which demonstrated its effectiveness.
Key words: dimensionality reduction; LDA; MMC; scatter matrix; subspace; small-sample-size problem
0 Introduction

1 The SSE method

1.1 Analysis of related supervised dimensionality reduction methods

MMC starts from a different idea than LDA, but the resulting technique is similar. The distance between two different classes is

$d\left( \boldsymbol{c}_i, \boldsymbol{c}_j \right) = d\left( \boldsymbol{\mu}_i, \boldsymbol{\mu}_j \right) - \left( S\left( \boldsymbol{c}_i \right) + S\left( \boldsymbol{c}_j \right) \right),$ (1)
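As a minimal sketch, Eq. (1) can be computed directly from the class samples. The excerpt does not fix the choices of $d(\boldsymbol{\mu}_i, \boldsymbol{\mu}_j)$ and $S(\boldsymbol{c})$; the code below assumes the usual MMC choices, the squared Euclidean distance between class means and the trace of the class scatter matrix, so both should be read as assumptions rather than the authors' exact definition.

```python
import numpy as np

def class_distance(X_i, X_j):
    """Eq. (1): d(c_i, c_j) = d(mu_i, mu_j) - (S(c_i) + S(c_j)).

    X_i, X_j: (d, N) arrays whose columns are the samples of each class.
    Assumes d(mu_i, mu_j) = ||mu_i - mu_j||^2 and S(c) = tr(scatter),
    as in the standard MMC formulation.
    """
    mu_i = X_i.mean(axis=1, keepdims=True)
    mu_j = X_j.mean(axis=1, keepdims=True)
    d_means = float(np.sum((mu_i - mu_j) ** 2))        # distance of class means
    s_i = float(np.trace((X_i - mu_i) @ (X_i - mu_i).T))  # scatter of class i
    s_j = float(np.trace((X_j - mu_j) @ (X_j - mu_j).T))  # scatter of class j
    return d_means - (s_i + s_j)
```

Under these choices, two compact classes with well-separated means yield a large positive distance, while overlapping, spread-out classes drive it negative.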

1.2 The SSE algorithm

$\boldsymbol{S}_{\rm{w}}^i = \sum\limits_{j = 1}^{N_i} \left( \boldsymbol{x}_{ij} - \boldsymbol{\mu}_i \right) \left( \boldsymbol{x}_{ij} - \boldsymbol{\mu}_i \right)^{\rm{T}} = \boldsymbol{X}_{\rm{w}}^i \left( \boldsymbol{X}_{\rm{w}}^i \right)^{\rm{T}},$ (2)

Algorithm 1 Similarity subspace embedding (SSE)

Input: the original high-dimensional data $\boldsymbol{X}$;

Output: the optimal low-dimensional subspace $\boldsymbol{W}$ after dimensionality reduction;

Step 1 Initialize the within-class scatter matrix $\boldsymbol{S}_{\text{w}}$;

Step 2 Using Eq. (2), compute the eigenvectors $\boldsymbol{U}_{i}$ corresponding to the $\bar{d}$ smallest eigenvalues of each $\boldsymbol{S}^{i}_{\text{w}}$;

Step 3 Use the $\boldsymbol{U}_{i}$ to compute the between-class scatter $\bar{S}_{\text{b}}$;

Step 4 Solve optimization problem (3) to obtain the optimal low-dimensional subspace $\boldsymbol{W}$ of SSE.
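Steps 1 and 2 above are fully specified by Eq. (2) and can be sketched as follows; Steps 3 and 4 depend on the definition of $\bar{S}_{\text{b}}$ and on optimization problem (3), which this excerpt does not reproduce, so they are omitted here.

```python
import numpy as np

def within_class_scatter(X_i):
    """Eq. (2): S_w^i = X_w^i (X_w^i)^T for one class.

    X_i: (d, N_i) array whose columns are the samples of class i.
    """
    mu_i = X_i.mean(axis=1, keepdims=True)  # class mean mu_i
    X_w = X_i - mu_i                        # centered data X_w^i
    return X_w @ X_w.T                      # d x d scatter matrix

def class_subspace(S_wi, d_bar):
    """Step 2: eigenvectors U_i for the d_bar smallest eigenvalues of S_w^i.

    np.linalg.eigh returns eigenvalues in ascending order, so the first
    d_bar columns span the least-scattered directions of class i.
    """
    _, vecs = np.linalg.eigh(S_wi)
    return vecs[:, :d_bar]
```

The columns of each `U_i` are orthonormal, which is what allows them to serve as a per-class subspace basis in Step 3.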

2 Experimental results and analysis

2.1 Image recognition experiments

Figure 1 Experiment results of AR dataset
Figure 2 Experiment results of Coil-20 dataset
Figure 3 Experiment results of MNIST dataset

2.2 Analysis of experiments

3 Conclusion
