
Least squares multi-label feature selection algorithm with label information

Journal of Nanjing University of Science and Technology (Natural Science Edition) [ISSN:1005-9830/CN:32-1397/N]

Issue:
2019, No. 4
Page:
423-431

Info

Title:
Least squares multi-label feature selection algorithm with label information
Author(s):
Chen Hong,Ma Yingcang,Yang Xiaofei,Xu Qiuxia
School of Sciences,Xi’an Polytechnic University,Xi’an 710600,China
Keywords:
least squares; sparse regularization; multi-label feature selection
PACS:
TP18
DOI:
10.14177/j.cnki.32-1397n.2019.43.04.007
Abstract:
In order to better reflect the importance of label information, a least squares regression model incorporating label information was constructed on the basis of the traditional least squares regression model to solve the multi-label feature selection problem. A slack variable ω was added to each label so that the regression targets of different classes are dragged in opposite directions, which enlarges the distances between classes. Combining this model with the L2,1 norm, a least squares multi-label feature selection model with label information and a corresponding algorithm were proposed. Finally, the convergence of the algorithm was proved, and its efficiency was demonstrated by experiments.
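The abstract describes the model only at a high level. As a rough illustration, the following NumPy sketch shows one plausible reading of it, assuming a DLSR-style objective min_W ||XW - (Y + B∘M)||_F^2 + λ||W||_{2,1} with slack M ≥ 0, where B holds +1 for positive labels and -1 otherwise, solved by alternating a reweighted ridge step for W with a closed-form update of M. The function name ls_mlfs, the parameter lam, and the update order are illustrative assumptions, not the authors' published code.

import numpy as np

def ls_mlfs(X, Y, lam=1.0, n_iter=50, eps=1e-8):
    # X: (n, d) feature matrix; Y: (n, c) binary label matrix in {0, 1}.
    n, d = X.shape
    B = np.where(Y > 0, 1.0, -1.0)          # per-entry dragging direction (assumed)
    M = np.zeros_like(Y, dtype=float)       # nonnegative slack (the omega in the abstract)
    XtX = X.T @ X
    # warm start with a plain ridge solution
    W = np.linalg.solve(XtX + lam * np.eye(d), X.T @ Y)
    for _ in range(n_iter):
        T = Y + B * M                       # relaxed regression targets
        # IRLS step for the L2,1 penalty: D_ii = 1 / (2 * ||w_i||_2)
        row_norms = np.sqrt((W ** 2).sum(axis=1)) + eps
        D = np.diag(1.0 / (2.0 * row_norms))
        W = np.linalg.solve(XtX + lam * D, X.T @ T)
        # closed-form slack update keeps M >= 0
        M = np.maximum(B * (X @ W - Y), 0.0)
    scores = np.sqrt((W ** 2).sum(axis=1))  # row norms of W rank the d features
    return W, np.argsort(-scores)

The L2,1 penalty drives W toward row sparsity, so features whose rows of W have small norm contribute little to every label at once and can be discarded first.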

References:

[1]Kong Xiangnan,Yu P S. gMLC:a multi-label feature selection framework for graph classification[J]. Knowledge & Information Systems,2012,31(2):281-305.
[2]Guyon I,Elisseeff A. An introduction to variable and feature selection[J]. Journal of Machine Learning Research,2003,3(6):1157-1182.
[3]Zhou Luping,Wang Lei,Shen Chunhua. Feature selection with redundancy-constrained class separability[J]. IEEE Transactions on Neural Networks,2010,21(5):853-858.
[4]Zhao Zheng,Liu Huan. Semi-supervised feature selection via spectral analysis[C]//Proceedings of the 2007 SIAM International Conference on Data Mining. Minneapolis,USA:SIAM,2007:641-646.
[5]Kallakech M,Biela P,Macaire L,et al. Constraint scores for semi-supervised feature selection:a comparative study[J]. Pattern Recognition Letters,2011,32(5):656-665.
[6]Liu Tao,Wu Gongyi,Chen Zheng. An effective unsupervised feature selection method for text clustering[J]. Journal of Computer Research and Development,2005,43(3):381-386.
[7]He Xiaofei,Cai Deng,Niyogi P. Laplacian score for feature selection[C]//Advances in Neural Information Processing Systems. Cambridge,USA:MIT Press,2006:507-514.
[8]Sun Liang,Ji Shuiwang,Ye Jieping. Hypergraph spectral learning for multi-label classification[C]//Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York,USA:ACM,2008:668-676.
[9]Strutz T. Data fitting and uncertainty:a practical introduction to weighted least squares and beyond[M]. Wiesbaden:Vieweg and Teubner,2010.
[10]Weinberger K Q,Saul L K. Distance metric learning for large margin nearest neighbor classification[C]//Advances in Neural Information Processing Systems. Cambridge,USA:MIT Press,2006:1473-1480.
[11]Schneider P,Bunte K,Stiekema H,et al. Regularization in matrix relevance learning[J]. IEEE Transactions on Neural Networks,2010,21(5):831-840.
[12]Leski J. Ho-kashyap classifier with generalization control[J]. Pattern Recognition Letters,2003,24(14):2281-2290.
[13]Xiang Shiming,Nie Feiping,Meng Gaofeng,et al. Discriminative least squares regression for multiclass classification and feature selection[J]. IEEE Transactions on Neural Networks & Learning Systems,2012,23(11):1738-1754.
[14]Zheng Weishi,Wang Liang,Tan Tieniu,et al. L2,1 regularized correntropy for robust feature selection[C]//IEEE Conference on Computer Vision and Pattern Recognition. Providence,USA:IEEE Press,2012:2504-2511.
[15]Yang Yi,Shen Hengtao,Ma Zhigang,et al. L2,1-norm regularized discriminative feature selection for unsupervised learning[C]//International Joint Conference on Artificial Intelligence. Menlo Park,USA:AAAI Press,2011:1589-1594.
[16]Nie Feiping,Huang Heng,Cai Xiao,et al. Efficient and robust feature selection via joint L2,1-norms minimization[C]//International Conference on Neural Information Processing Systems. Vancouver,Canada:Curran Associates Inc,2010:1813-1821.
[17]Shi Zhongrong,Wang Sheng,Liu Chuancai. Sparse representation via L2,p norm for image classification[J]. Journal of Nanjing University of Science and Technology,2017,41(1):80-89.
[18]Vapnik V. The nature of statistical learning theory[M]. New York:Springer-Verlag,2000.
[19]Chen Lin,Tsang I W,Xu Dong. Laplacian embedded regression for scalable manifold regularization[J]. IEEE Transactions on Neural Networks & Learning Systems,2012,23(6):902-915.
[20]Zhang Minling,Zhou Zhihua. ML-KNN:A lazy learning approach to multi-label learning[J]. Pattern Recognition,2007,40(7):2038-2048.
[21]Boutell M R,Luo Jiebo,Shen Xipeng,et al. Learning multi-label scene classification[J]. Pattern Recognition,2004,37(9):1757-1771.
[22]Elisseeff A,Weston J. A kernel method for multi-labelled classification[C]//International Conference on Neural Information Processing Systems:Natural and Synthetic. Cambridge,USA:MIT Press,2001:681-687.
[23]Dumais S,Platt J,Heckerman D,et al. Inductive learning algorithms and representations for text categorization[C]//Proceedings of the 7th International Conference on Information and Knowledge Management. New York,USA:ACM,1998:148-155.
[24]Lee J,Kim D W. Feature selection for multi-label classification using multivariate mutual information[J]. Pattern Recognition Letters,2013,34(3):349-357.
[25]Lin Yaojin,Hu Qinghua,Liu Jinghua,et al. Multi-label feature selection based on max-dependency and min-redundancy[J]. Neurocomputing,2015,168(C):92-103.
[26]Lee J,Kim D W. Fast multi-label feature selection based on information-theoretic feature ranking[J]. Pattern Recognition,2015,48(9):2761-2771.
[27]Dougherty J,Kohavi R,Sahami M. Supervised and unsupervised discretization of continuous features[C]//Proceedings of the 12th International Conference on Machine Learning. Tahoe City,USA:Morgan Kaufmann,1995:194-202.

Last Update: 2019-09-30