Acknowledgements. This work was supported in part by the National Natural Science Foundation of China (Nos. 61305075, 11401185), the China Postdoctoral Science Foundation (No. 2012M520624), the Natural Science Foundation of Shandong Province (Nos. ZR2013FQ004, ZR2013DM015, ZR2015AL014), the Specialized Research Fund for the Doctoral Program of Higher Education of China (No. 20130133120014), the Fundamental Research Funds for the Central Universities (Nos. 14CX05042A, 15CX05053A, 15CX08011A, 15CX02064A) and the University-level Undergraduate Training Program for Innovation and Entrepreneurship (No. 20161349).
Institutional attribution:
This university listed as other (non-primary) institution
Department affiliation:
School of Mathematics and Statistics
Abstract:
This paper develops a novel feature selection method for neural network models based on L1/2 regularization. Because the L1/2 regularizer is non-convex, non-smooth, and non-Lipschitz, the gradient descent method cannot be applied directly to train multilayer perceptron neural networks. A smoothing technique is therefore used to approximate the original L1/2 regularizer. The proposed method is a two-stage updating approach. First, a multilayer network model with the smoothing L1/2 regularizer is trained...
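To make the idea concrete, below is a minimal sketch (not the paper's exact formulation) of gradient descent on a one-hidden-layer perceptron whose loss carries a smoothed L1/2 penalty. The piecewise-polynomial smoothing of |w| is one common choice in the smoothing-L1/2 literature; the smoothing threshold a, the penalty weight lam, the learning rate, the network sizes, and the synthetic data are all illustrative assumptions, not values from the paper.

import numpy as np

def smoothed_abs(w, a=0.1):
    # Smooth approximation of |w|: polynomial inside (-a, a), exact |w| outside.
    inner = -w**4 / (8 * a**3) + 3 * w**2 / (4 * a) + 3 * a / 8
    return np.where(np.abs(w) < a, inner, np.abs(w))

def smoothed_abs_grad(w, a=0.1):
    # Derivative of the smoothed |w|; continuous at 0, unlike sign(w).
    inner = -w**3 / (2 * a**3) + 3 * w / (2 * a)
    return np.where(np.abs(w) < a, inner, np.sign(w))

def l_half_penalty_grad(W, a=0.1):
    # Gradient of the smoothed L1/2 regularizer sum f(w)^(1/2), by the chain rule.
    # f(w) >= 3a/8 > 0, so the negative power never divides by zero.
    return 0.5 * smoothed_abs(W, a) ** (-0.5) * smoothed_abs_grad(W, a)

# Tiny one-hidden-layer MLP trained on synthetic data with the penalized squared error.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                  # 200 samples, 10 candidate features
y = X[:, :3] @ np.array([1.5, -2.0, 0.7])       # only the first 3 features are informative
W1 = rng.normal(scale=0.1, size=(10, 8))
W2 = rng.normal(scale=0.1, size=(8, 1))
lam, lr = 1e-2, 1e-2

for _ in range(2000):
    H = np.tanh(X @ W1)                         # hidden layer
    pred = H @ W2
    err = pred - y[:, None]
    # backpropagated squared-error gradient plus the smoothed-L1/2 gradient
    gW2 = H.T @ err / len(X) + lam * l_half_penalty_grad(W2)
    gH = err @ W2.T * (1 - H**2)
    gW1 = X.T @ gH / len(X) + lam * l_half_penalty_grad(W1)
    W1 -= lr * gW1
    W2 -= lr * gW2

# Per-input weight norms; penalized training tends to shrink those tied to
# uninformative inputs, which is what makes the network usable for feature selection.
print(np.round(np.linalg.norm(W1, axis=1), 3))

The sketch covers only the first, penalized-training stage mentioned in the abstract; the second stage of the paper's two-stage updating approach is not shown here.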