Chen Hong, Ma Yingcang, Yang Xiaofei, et al. Least squares multi-label feature selection algorithm with label information[J]. Journal of Nanjing University of Science and Technology (Natural Science Edition), 2019, 43(04): 423-431. [doi:10.14177/j.cnki.32-1397n.2019.43.04.007]

Least squares multi-label feature selection algorithm with label information

Journal of Nanjing University of Science and Technology (Natural Science Edition) [ISSN: 1005-9830 / CN: 32-1397/N]

Volume:
43
Issue:
2019, No. 04
Pages:
423-431
Publication date:
2019-08-24

Article Information

Title:
Least squares multi-label feature selection algorithm with label information
Article ID:
1005-9830(2019)04-0423-09
Author(s):
Chen Hong, Ma Yingcang, Yang Xiaofei, Xu Qiuxia
School of Sciences, Xi'an Polytechnic University, Xi'an 710600, China
Keywords:
least squares; sparse regularization; multi-label; feature selection
CLC number:
TP18
DOI:
10.14177/j.cnki.32-1397n.2019.43.04.007
Abstract:
To better reflect the importance of label information, a least squares regression model that incorporates label information is constructed on the basis of the traditional least squares regression model and applied to the multi-label feature selection problem. First, a slack variable ω is added to each label so that the regression targets of different classes move in opposite directions, enlarging the distances between classes. Then, combined with L2,1-norm regularization, a least squares multi-label feature selection with label information (LSMFSLI) model and algorithm are proposed. The convergence of the algorithm is proved, and experiments demonstrate the efficiency of the algorithm.
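The abstract combines two standard ingredients: a slack term that drags regression targets of different classes apart, and L2,1-norm regularization, whose row-wise sparsity lets features be ranked by the row norms of the projection matrix W. As a rough, hypothetical sketch (the paper's exact objective and updates are not reproduced here), the common iteratively reweighted solver for min_W ||XW − Y||_F² + λ||W||_{2,1} (in the style of Nie et al., reference-style joint L2,1 minimization) looks like this; the function name and the λ value are illustrative, not the paper's:

```python
import numpy as np

def l21_ls_feature_select(X, Y, lam=0.5, n_iter=50, eps=1e-8):
    """Iteratively reweighted solver for
        min_W ||X W - Y||_F^2 + lam * ||W||_{2,1}.
    Each step solves a weighted ridge system in closed form, then
    updates a diagonal matrix D from the current row norms of W.
    Returns W and a feature ranking by decreasing row norm."""
    n, d = X.shape
    D = np.eye(d)                       # diagonal reweighting matrix
    XtX, XtY = X.T @ X, X.T @ Y
    for _ in range(n_iter):
        # closed-form update: W = (X^T X + lam * D)^{-1} X^T Y
        W = np.linalg.solve(XtX + lam * D, XtY)
        row_norms = np.sqrt((W ** 2).sum(axis=1)) + eps
        D = np.diag(1.0 / (2.0 * row_norms))
    ranking = np.argsort(-np.sqrt((W ** 2).sum(axis=1)))
    return W, ranking
```

Features whose rows of W shrink toward zero are discarded; the top-ranked indices are kept as the selected feature subset.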


Memo:
Received: 2019-04-28; Revised: 2019-05-26
Funding: National Natural Science Foundation of China (11501435); Science and Technology Guidance Project of the China National Textile and Apparel Council (2016073); Scientific Research Program of the Shaanxi Provincial Education Department (18JK0360)
About the authors: Chen Hong (1992-), female, master's student; research interests: machine learning, multi-label learning; E-mail: 13572959949@163.com. Corresponding author: Ma Yingcang (1972-), male, Ph.D., professor; research interests: artificial intelligence, machine learning; E-mail: mayingcang@126.com.
Last update: 2019-09-30