Journal of Shandong University (Engineering Science) ›› 2020, Vol. 50 ›› Issue (3): 58-65. doi: 10.6040/j.issn.1672-3961.0.2019.295

• Machine Learning & Data Mining •

Label distribution learning based on kernel extreme learning machine auto-encoder

Yibin WANG1,2, Tianli LI1, Yusheng CHENG1,2,*, Kun QIAN1

  1. School of Computer and Information, Anqing Normal University, Anqing 246133, Anhui, China
    2. The University Key Laboratory of Intelligent Perception and Computing of Anhui Province, Anqing 246133, Anhui, China
  • Received: 2019-06-10  Online: 2020-06-01  Published: 2020-06-16
  • Contact: Yusheng CHENG  E-mail: wangyb07@mail.ustc.edu.cn; chengyshaq@163.com
  • Supported by:
    Key Natural Science Foundation Project of Anhui Provincial Universities (KJ2017A352); Key Laboratory Foundation Project of Anhui Provincial Universities (ACAIM160102)

Abstract:

In the label distribution learning framework, each example is associated with the degree to which every label describes it. However, most existing algorithms are designed for complete, clean data and do not account for noise. Combining the noise-reduction property of the auto-encoder with the stability of the kernel extreme learning machine, a label distribution learning algorithm based on a kernel extreme learning machine auto-encoder (AKELM-LDL) was proposed. First, a kernel extreme learning machine auto-encoder mapped the original feature space to a more robust feature representation. Second, a kernel extreme learning machine model adapted to label distribution learning was constructed as the classifier to improve classification efficiency and performance. Finally, experimental results showed that the proposed algorithm had clear advantages over other label distribution learning algorithms, and hypothesis tests further confirmed its effectiveness.
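The abstract outlines a two-stage pipeline: a kernel extreme learning machine (KELM) auto-encoder first reconstructs the possibly noisy inputs to obtain a more robust feature representation, and a second KELM, adapted to label distribution learning, then regresses the description degrees. The sketch below is a minimal rendering of that pipeline; the RBF kernel, the regularization constant C, and the clip-and-renormalize step that turns the regression outputs into a distribution are illustrative assumptions rather than the paper's exact settings.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel between the rows of A and the rows of B.
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

class KELM:
    """Kernel extreme learning machine: beta = (K + I/C)^(-1) T."""
    def __init__(self, C=100.0, gamma=1.0):
        self.C, self.gamma = C, gamma

    def fit(self, X, T):
        self.X = X
        K = rbf_kernel(X, X, self.gamma)
        self.beta = np.linalg.solve(K + np.eye(len(X)) / self.C, T)
        return self

    def predict(self, X_new):
        return rbf_kernel(X_new, self.X, self.gamma) @ self.beta

def akelm_ldl_fit_predict(X_train, D_train, X_test, C=100.0, gamma=1.0):
    # Stage 1: KELM auto-encoder (targets = inputs) yields a denoised representation.
    ae = KELM(C, gamma).fit(X_train, X_train)
    Z_train, Z_test = ae.predict(X_train), ae.predict(X_test)
    # Stage 2: a second KELM regresses the label distributions on the new features.
    clf = KELM(C, gamma).fit(Z_train, D_train)
    D_hat = np.clip(clf.predict(Z_test), 1e-6, None)   # keep description degrees positive
    return D_hat / D_hat.sum(axis=1, keepdims=True)    # each row sums to one
```

Solving a single regularized linear system per stage, rather than iteratively training hidden-layer weights, is what gives the kernel extreme learning machine the stability and efficiency the abstract refers to.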

Key words: label distribution learning, Gaussian noise, auto-encoder, kernel extreme learning machine, robustness

CLC Number: 

  • TP181

Fig.1

Label distribution learning in a landscape image

Fig.2

AKELM-LDL algorithm modeling process

Table 1

Label distribution learning datasets

Dataset Examples Features Labels
Yeast-alpha 2 465 24 18
Yeast-cdc 2 465 24 15
Yeast-diau 2 465 24 7
Yeast-heat 2 465 24 6
Yeast-spo 2 465 24 6
Yeast-cold 2 465 24 4
Yeast-dtt 2 465 24 4
Yeast-spo5 2 465 24 3
Yeast-elu 2 465 24 14
Human Gene 30 542 36 68
SDU_3DFE 2 500 243 6
Movie 7 755 1 869 5

Table 2

Evaluation measures for label distribution learning

Measure Formula
Chebyshev↓ $\mathrm{Dis}_1(D,\hat{D})=\max_j\left|d_j-\hat{d}_j\right|$
Clark↓ $\mathrm{Dis}_2(D,\hat{D})=\sqrt{\sum\limits_{j=1}^{c}\frac{(d_j-\hat{d}_j)^2}{(d_j+\hat{d}_j)^2}}$
Canberra↓ $\mathrm{Dis}_3(D,\hat{D})=\sum\limits_{j=1}^{c}\frac{\left|d_j-\hat{d}_j\right|}{d_j+\hat{d}_j}$
Kullback-Leibler↓ $\mathrm{Dis}_4(D,\hat{D})=\sum\limits_{j=1}^{c}d_j\ln\frac{d_j}{\hat{d}_j}$
Cosine↑ $\mathrm{Sim}_1(D,\hat{D})=\frac{\sum\limits_{j=1}^{c}d_j\hat{d}_j}{\sqrt{\sum\limits_{j=1}^{c}d_j^2}\sqrt{\sum\limits_{j=1}^{c}\hat{d}_j^2}}$
Intersection↑ $\mathrm{Sim}_2(D,\hat{D})=\sum\limits_{j=1}^{c}\min\left(d_j,\hat{d}_j\right)$
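Under the definitions above, each measure maps directly onto a few NumPy operations. The sketch below evaluates all six per example and averages them over a test set; the small epsilon guard against zero description degrees is an implementation assumption.

```python
import numpy as np

def ldl_measures(D, D_hat, eps=1e-12):
    """Six evaluation measures of Table 2 for true (D) and predicted (D_hat)
    label distributions, both of shape (n_examples, n_labels)."""
    D, D_hat = np.clip(D, eps, None), np.clip(D_hat, eps, None)
    diff = np.abs(D - D_hat)
    return {
        "Chebyshev":        np.mean(diff.max(axis=1)),
        "Clark":            np.mean(np.sqrt((((D - D_hat) ** 2) / ((D + D_hat) ** 2)).sum(axis=1))),
        "Canberra":         np.mean((diff / (D + D_hat)).sum(axis=1)),
        "Kullback-Leibler": np.mean((D * np.log(D / D_hat)).sum(axis=1)),
        "Cosine":           np.mean((D * D_hat).sum(axis=1) /
                                    (np.linalg.norm(D, axis=1) * np.linalg.norm(D_hat, axis=1))),
        "Intersection":     np.mean(np.minimum(D, D_hat).sum(axis=1)),
    }
```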

Table 3

The results in Chebyshev (↓)

Datasets PT-Bayes AA-KNN AA-BP SA-IIS SA-BFGS AKELM-LDL
Yeast-alpha 0.099 6±0.005 1 0.014 6±0.000 4 0.036 7±0.001 3 0.017 0±0.000 5 0.013 5±0.000 3 0.013 4±0.000 1
Yeast-cdc 0.108 7±0.010 1 0.017 5±0.000 4 0.038 8±0.002 1 0.020 0±0.000 5 0.016 3±0.000 4 0.016 1±0.000 1
Yeast-elu 0.112 1±0.006 1 0.017 6±0.000 4 0.038 9±0.001 8 0.020 3±0.000 4 0.016 3±0.000 4 0.016 2±0.000 1
Yeast-diau 0.157 7±0.006 9 0.039 1±0.001 3 0.050 1±0.002 0 0.041 2±0.000 9 0.036 7±0.001 2 0.036 6±0.001 0
Yeast-heat 0.174 1±0.010 5 0.044 7±0.001 7 0.053 0±0.002 7 0.046 6±0.001 1 0.042 4±0.001 4 0.041 4±0.001 1
Yeast-spo 0.175 4±0.009 0 0.062 9±0.002 7 0.067 1±0.003 5 0.061 3±0.001 5 0.058 2±0.002 7 0.057 4±0.001 5
Yeast-cold 0.183 1±0.013 3 0.055 0±0.001 7 0.059 1±0.002 6 0.056 6±0.001 6 0.051 1±0.002 1 0.051 0±0.001 3
Yeast-dtt 0.181 8±0.013 4 0.039 7±0.001 7 0.043 3±0.001 5 0.043 6±0.001 3 0.035 9±0.001 4 0.035 8±0.001 4
Yeast-spo5 0.201 3±0.013 3 0.097 0±0.005 4 0.094 2±0.003 5 0.095 0±0.002 3 0.091 4±0.002 2 0.089 7±0.004 6
Movie 0.201 4±0.002 8 0.124 0±0.002 6 0.139 1±0.003 4 0.147 3±0.002 0 0.126 4±0.003 3 0.114 4±0.001 5
SDU_3DFE 0.138 6±0.003 7 0.127 6±0.003 5 0.142 9±0.006 4 0.134 4±0.005 0 0.104 5±0.002 5 0.133 9±0.003 2
Human Gene 0.195 1±0.034 0 0.065 2±0.001 5 0.063 3±0.001 8 0.054 0±0.002 8 0.053 8±0.002 6 0.053 7±0.002 0
Average 5.911 6 3.250 0 4.666 6 4.000 0 2.000 0 1.166 6

Table 4

The results in Clark (↓)

Datasets PT-Bayes AA-KNN AA-BP SA-IIS SA-BFGS AKELM-LDL
Yeast-alpha 1.172 9±0.039 5 0.230 5±0.004 1 0.734 9±0.031 1 0.261 4±0.007 1 0.210 1±0.003 9 0.209 5±0.006 8
Yeast-cdc 1.080 1±0.061 9 0.235 4±0.004 9 0.595 7±0.036 2 0.257 8±0.006 3 0.216 2±0.005 2 0.213 7±0.004 5
Yeast-elu 1.034 3±0.049 6 0.217 1±0.005 7 0.544 9±0.027 9 0.240 4±0.003 5 0.199 2±0.003 5 0.198 5±0.004 2
Yeast-diau 0.753 1±0.033 3 0.211 0±0.006 2 0.275 3±0.010 4 0.221 5±0.005 2 0.199 1±0.006 4 0.198 3±0.005 4
Yeast-heat 0.683 8±0.038 6 0.193 9±0.007 0 0.232 1±0.013 2 0.201 0±0.004 7 0.184 3±0.005 8 0.179 3±0.004 3
Yeast-spo 0.684 3±0.030 2 0.266 8±0.010 3 0.290 7±0.016 4 0.262 4±0.007 4 0.249 5±0.012 5 0.246 2±0.004 6
Yeast-cold 0.495 4±0.035 1 0.149 7±0.004 7 0.160 9±0.006 9 0.153 2±0.004 9 0.139 7±0.006 2 0.139 1±0.003 1
Yeast-dtt 0.498 6±0.035 8 0.107 5±0.005 1 0.118 3±0.004 2 0.117 2±0.003 9 0.097 8±0.003 9 0.097 7±0.003 6
Yeast-spo5 0.418 0±0.027 8 0.195 8±0.011 3 0.189 5±0.007 3 0.191 4±0.005 1 0.184 4±0.003 6 0.181 2±0.010 4
Movie 0.806 5±0.008 6 0.548 8±0.010 2 0.640 5±0.016 5 0.582 4±0.007 6 0.551 9±0.012 9 0.527 2±0.034 8
SDU_3DFE 0.412 5±0.007 0 0.403 1±0.008 7 0.465 4±0.019 9 0.413 8±0.007 2 0.349 7±0.005 9 0.405 2±0.009 2
Human Gene 4.639 3±0.171 6 2.388 0±0.022 9 3.686 5±0.062 9 2.134 9±0.031 6 2.118 9±0.033 0 2.131 4±0.002 0
Average 5.833 3 3.250 0 4.911 6 3.833 3 1.911 6 1.250 0

Table 5

The results in Canberra (↓)

Datasets PT-Bayes AA-KNN AA-BP SA-IIS SA-BFGS AKELM-LDL
Yeast-alpha 4.193 7±0.149 2 0.753 2±0.014 0 2.420 8±0.095 6 0.861 8±0.023 2 0.682 3±0.014 7 0.680 4±0.024 1
Yeast-cdc 3.541 2±0.225 8 0.712 1±0.014 7 1.798 6±0.108 4 0.783 1±0.015 7 0.647 9±0.017 3 0.640 1±0.017 0
Yeast-elu 3.277 8±0.171 2 0.641 9±0.017 9 1.598 3±0.083 0 0.712 5±0.009 3 0.584 0±0.007 3 0.581 3±0.004 2
Yeast-diau 1.708 6±0.084 0 0.453 9±0.015 1 0.594 4±0.020 7 0.478 7±0.010 1 0.427 0±0.013 5 0.425 7±0.010 8
Yeast-heat 1.444 0±0.082 7 0.390 3±0.013 5 0.467 1±0.023 6 0.404 2±0.009 8 0.367 8±0.010 2 0.357 4±0.008 7
Yeast-spo 1.443 1±0.068 9 0.549 0±0.022 0 0.595 1±0.033 7 0.539 4±0.015 9 0.513 8±0.024 1 0.506 3±0.008 7
Yeast-cold 0.866 9±0.062 0 0.259 3±0.007 2 0.277 3±0.012 3 0.264 9±0.008 5 0.240 6±0.009 3 0.239 7±0.005 4
Yeast-dtt 0.871 9±0.064 9 0.184 7±0.008 1 0.204 1±0.007 6 0.202 7±0.007 2 0.167 9±0.006 6 0.167 8±0.005 3
Yeast-spo5 0.648 8±0.045 0 0.300 3±0.017 0 0.291 1±0.011 2 0.293 9±0.007 7 0.283 0±0.006 4 0.277 9±0.015 2
Movie 1.563 5±0.018 8 1.055 3±0.021 0 1.223 8±0.030 5 1.119 9±0.016 4 1.063 5±0.026 0 0.999 5±0.036 1
SDU_3DFE 0.902 5±0.016 0 0.831 5±0.019 3 0.982 3±0.038 5 0.896 9±0.018 0 0.728 3±0.013 8 0.889 8±0.003 9
Human Gene 33.915 0±1.480 0 16.277 4±0.174 0 25.297 2±0.487 0 14.633 4±0.249 0 14.511 6±0.252 0 14.592 7±0.290 9
Average 5.916 6 3.333 3 4.833 3 3.750 0 1.911 6 1.250 0

Table 6

The results in Kullback-Leibler (↓)

Datasets PT-Bayes AA-KNN AA-BP SA-IIS SA-BFGS AKELM-LDL
Yeast-alpha 0.277 6±0.023 8 0.006 5±0.000 2 0.087 2±0.947 5 0.008 5±0.000 4 0.005 6±0.000 3 0.005 5±0.000 1
Yeast-cdc 0.283 1±0.042 1 0.008 2±0.000 3 0.067 2±0.009 8 0.009 9±0.000 5 0.007 0±0.000 3 0.006 9±0.000 1
Yeast-elu 0.282 5±0.032 8 0.007 3±0.000 4 0.060 8±0.009 2 0.009 1±0.000 3 0.006 2±0.000 2 0.006 1±0.000 1
Yeast-diau 0.269 8±0.027 6 0.014 9±0.000 9 0.026 4±0.002 4 0.015 7±0.000 6 0.012 9±0.000 9 0.012 8±0.000 1
Yeast-heat 0.268 4±0.034 5 0.014 4±0.001 0 0.021 6±0.002 9 0.015 2±0.000 7 0.012 8±0.000 7 0.012 2±0.000 1
Yeast-spo 0.278 8±0.039 8 0.029 1±0.002 3 0.033 2±0.003 9 0.026 8±0.001 5 0.024 6±0.002 3 0.024 0±0.000 1
Yeast-cold 0.217 4±0.035 6 0.013 9±0.001 1 0.016 2±0.001 6 0.014 6±0.001 0 0.012 2±0.001 2 0.012 1±0.000 1
Yeast-dtt 0.226 4±0.039 3 0.007 4±0.000 8 0.008 9±0.000 6 0.008 0±0.000 6 0.006 2±0.000 6 0.006 1±0.000 1
Yeast-spo5 0.206 6±0.042 7 0.034 7±0.003 9 0.031 2±0.002 3 0.031 4±0.001 5 0.029 3±0.001 2 0.028 6±0.002 9
Movie 0.729 7±0.059 0 0.117 7±0.005 1 0.166 4±0.010 7 0.131 7±0.004 6 0.118 9±0.006 0 0.100 3±0.002 7
SDU_3DFE 0.084 9±0.002 9 0.081 8±0.003 9 0.102 3±0.009 6 0.081 9±0.004 0 0.054 1±0.002 2 0.072 6±0.003 3
Human Gene 1.803 5±0.148 7 0.301 9±0.007 2 0.598 4±0.022 1 0.240 6±0.012 1 0.238 6±0.011 2 0.238 7±0.009 8
Average 6.000 0 3.333 3 4.833 3 3.750 0 1.911 6 1.166 6

Table 7

The results in Cosine (↑)

Datasets PT-Bayes AA-KNN AA-BP SA-IIS SA-BFGS AKELM-LDL
Yeast-alpha 0.848 5±0.007 0 0.993 6±0.000 2 0.947 5±0.003 2 0.991 4±0.000 4 0.994 6±0.000 3 0.994 7±0.000 1
Yeast-cdc 0.850 8±0.013 0 0.992 1±0.000 3 0.956 4±0.004 5 0.990 2±0.000 4 0.993 3±0.000 3 0.993 4±0.000 1
Yeast-elu 0.853 1±0.009 2 0.992 9±0.000 4 0.959 7±0.003 9 0.991 0±0.000 3 0.994 0±0.002 0 0.994 1±0.000 1
Yeast-diau 0.863 8±0.007 0 0.986 3±0.000 9 0.977 3±0.001 7 0.985 3±0.000 5 0.988 1±0.007 0 0.988 2±0.000 1
Yeast-heat 0.866 8±0.010 3 0.986 3±0.000 9 0.980 3±0.002 0 0.985 4±0.000 6 0.987 8±0.000 6 0.988 4±0.000 1
Yeast-spo 0.861 1±0.009 3 0.972 7±0.002 2 0.969 5±0.003 2 0.974 7±0.001 3 0.977 0±0.002 0 0.977 5±0.000 1
Yeast-cold 0.893 6±0.009 5 0.986 8±0.000 8 0.984 8±0.001 4 0.986 1±0.000 8 0.988 6±0.001 0 0.988 7±0.000 1
Yeast-dtt 0.895 0±0.009 8 0.992 9±0.000 6 0.991 6±0.000 5 0.991 5±0.000 5 0.994 1±0.000 4 0.994 2±0.000 1
Yeast-spo5 0.898 0±0.010 6 0.969 4±0.003 3 0.972 3±0.001 8 0.972 1±0.001 2 0.974 1±0.000 9 0.974 8±0.002 3
Movie 0.849 5±0.002 2 0.922 4±0.003 2 0.902 8±0.004 5 0.908 1±0.003 0 0.923 5±0.003 7 0.934 2±0.001 5
SDU_3DFE 0.917 9±0.002 8 0.920 2±0.003 5 0.903 7±0.006 9 0.920 3±0.003 6 0.947 0±0.002 2 0.929 6±0.002 9
Human Gene 0.457 0±0.047 7 0.767 4±0.004 0 0.687 6±0.009 8 0.831 6±0.005 5 0.833 3±0.005 3 0.832 1±0.004 3
Average 5.911 6 3.500 0 4.833 3 3.750 0 1.833 3 1.166 6

Table 8

The results in Intersection (↑)

Datasets PT-Bayes AA-KNN AA-BP SA-IIS SA-BFGS AKELM-LDL
Yeast-alpha 0.772 5±0.007 8 0.958 4±0.000 8 0.874 0±0.004 3 0.951 8±0.001 3 0.962 3±0.000 9 0.962 4±0.001 4
Yeast-cdc 0.771 1±0.014 6 0.953 1±0.001 0 0.886 7±0.006 3 0.947 8±0.000 9 0.957 4±0.001 2 0.957 9±0.001 3
Yeast-elu 0.772 7±0.010 9 0.954 7±0.001 3 0.891 7±0.005 2 0.949 1±0.000 7 0.958 8±0.000 5 0.959 0±0.001 1
Yeast-diau 0.769 0±0.010 8 0.937 0±0.002 2 0.917 7±0.002 8 0.933 1±0.001 3 0.940 8±0.001 9 0.941 0±0.001 5
Yeast-heat 0.770 8±0.012 2 0.935 9±0.002 2 0.923 5±0.003 5 0.933 2±0.001 6 0.939 7±0.001 6 0.941 4±0.001 4
Yeast-spo 0.769 1±0.010 8 0.909 5±0.003 7 0.902 2±0.005 3 0.910 9±0.002 6 0.915 4±0.003 7 0.916 6±0.001 4
Yeast-cold 0.796 5±0.013 9 0.936 0±0.001 6 0.931 7±0.003 1 0.934 4±0.002 0 0.940 7±0.002 1 0.940 9±0.001 4
Yeast-dtt 0.796 4±0.014 7 0.954 4±0.001 8 0.949 6±0.001 9 0.949 6±0.001 8 0.958 6±0.001 6 0.958 6±0.001 2
Yeast-spo5 0.798 7±0.013 3 0.903 0±0.005 4 0.905 8±0.003 5 0.905 0±0.002 3 0.908 6±0.002 2 0.910 3±0.004 6
Movie 0.722 6±0.002 8 0.822 4±0.003 8 0.797 6±0.005 1 0.803 2±0.003 3 0.822 1±0.004 6 0.836 4±0.001 6
SDU_3DFE 0.838 8±0.003 2 0.847 9±0.003 8 0.823 6±0.006 7 0.839 4±0.003 7 0.870 9±0.002 7 0.848 8±0.003 9
Human Gene 0.477 9±0.024 3 0.741 7±0.002 7 0.636 2±0.007 8 0.781 4±0.004 0 0.783 4±0.003 9 0.781 6±0.004 1
Average 5.911 6 3.333 3 4.833 3 3.833 3 1.911 6 1.166 6

Fig.3

Radar graph of the results of six label distribution learning algorithms
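The "Average" rows in Tables 3-8 and the hypothesis test mentioned in the abstract can be approximated under a common convention: rank the six algorithms on every dataset, average the ranks, and compare the algorithms with a Friedman test. The sketch below follows that assumption; the paper's actual test procedure may differ.

```python
import numpy as np
from scipy.stats import friedmanchisquare, rankdata

def average_ranks_and_friedman(scores, lower_is_better=True):
    """scores[i, j] holds one measure (e.g. the Chebyshev means of Table 3)
    for dataset i and algorithm j; the measure's arrow decides the ordering."""
    s = scores if lower_is_better else -scores
    ranks = np.apply_along_axis(rankdata, 1, s)   # rank the algorithms per dataset
    avg_ranks = ranks.mean(axis=0)                # comparable to an "Average" row
    stat, p_value = friedmanchisquare(*scores.T)  # one sample per algorithm
    return avg_ranks, stat, p_value
```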
