
A novel classifier ensemble method with sparsity and diversity

Document type: Foreign-language journal article

Authors: Yin, Xu-Cheng 1; Huang, Kaizhu 2; Hao, Hong-Wei 3; Iqbal, Khalid 1; Wang, Zhi-Bin 4

Author affiliations: 1.Univ Sci & Technol Beijing, Sch Comp & Commun Engn, Dept Comp Sci & Technol, Beijing 100083, Peoples R China

2.Xian Jiaotong Liverpool Univ, Dept Elect & Elect Engn, Suzhou 215123, Peoples R China

3.Chinese Acad Sci, Inst Automat, Beijing 100190, Peoples R China

4.China Natl Engn Res Ctr Informat Technol Agr, Beijing 100097, Peoples R China

Keywords: Classifier ensemble; Sparsity learning; Diversity learning; Neural network ensembles; Genetic algorithm

Journal: NEUROCOMPUTING (Impact factor: 5.719; Five-year impact factor: 4.986)

ISSN: 0925-2312

Year/Volume: 2014, Vol. 134

Pages:

Indexed in: SCI

Abstract: We consider the classifier ensemble problem in this paper. Owing to its superior performance over individual classifiers, the classifier ensemble has been intensively studied in the literature. Generally speaking, there are two prevalent research directions: diversely generating classifier components, and sparsely combining multiple classifiers. While most current approaches emphasize either sparsity or diversity only, we investigate the classifier ensemble by learning both sparsity and diversity simultaneously. We formulate the classifier ensemble problem with sparsity and/or diversity learning in a general framework. In particular, the classifier ensemble with sparsity and diversity can be represented as a mathematical optimization problem. We then propose a heuristic algorithm capable of obtaining ensemble classifiers with consideration of both sparsity and diversity. We exploit the genetic algorithm, and optimize sparsity and diversity for classifier selection and combination heuristically and iteratively. As one major contribution, we introduce the concept of the diversity contribution ability so as to select proper classifier components and eventually evolve classifier weights. Finally, we extensively compare the proposed method with conventional classifier ensemble methods, such as Bagging, least-squares combination, sparsity learning, and AdaBoost, on UCI benchmark data sets and the Pascal Large Scale Learning Challenge 2008 webspam data. The experimental results confirm that our approach leads to better performance in many respects. (C) 2014 Elsevier B.V. All rights reserved.
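The abstract describes a genetic algorithm that jointly optimizes sparsity (few active base classifiers) and diversity (disagreement among the selected classifiers) when learning combination weights. The paper's exact formulation and its "diversity contribution ability" measure are not reproduced here; the following is a minimal illustrative sketch only, assuming a pool of bagged decision trees, a fitness equal to validation accuracy minus an L0-style sparsity penalty plus a pairwise-disagreement diversity reward, and hypothetical trade-off parameters lam and gamma.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Toy data and a pool of bagged base classifiers (an assumption;
# the paper's experiments use UCI and webspam data).
X, y = make_classification(n_samples=600, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.5, random_state=0)
pool = []
for _ in range(20):
    idx = rng.integers(0, len(X_tr), len(X_tr))  # bootstrap sample
    pool.append(DecisionTreeClassifier(max_depth=3).fit(X_tr[idx], y_tr[idx]))
P = np.array([clf.predict(X_val) for clf in pool])  # (n_classifiers, n_val)

def fitness(w, lam=0.05, gamma=0.1):
    """Validation accuracy of the weighted ensemble, minus an L0-style
    sparsity penalty (fraction of active members), plus a pairwise
    disagreement reward among the selected classifiers. lam and gamma
    are illustrative trade-off parameters, not values from the paper."""
    active = w > 1e-3
    if not active.any():
        return -np.inf
    # Weighted binary vote: class 1 wins when its weight mass is larger.
    votes = (w[:, None] * (P == 1)).sum(0) > (w[:, None] * (P == 0)).sum(0)
    acc = (votes.astype(int) == y_val).mean()
    sel = P[active]
    diversity = 0.0
    if len(sel) > 1:
        diversity = np.mean([(sel[i] != sel[j]).mean()
                             for i in range(len(sel))
                             for j in range(i + 1, len(sel))])
    return acc - lam * active.sum() / len(w) + gamma * diversity

# Plain genetic loop: truncation selection, uniform crossover,
# Gaussian mutation, and random zeroing to encourage sparse weights.
popsize, n = 40, len(pool)
population = rng.random((popsize, n)) * (rng.random((popsize, n)) < 0.5)
for gen in range(60):
    scores = np.array([fitness(w) for w in population])
    elite = population[np.argsort(scores)[-popsize // 2:]]
    parents = elite[rng.integers(0, len(elite), (popsize, 2))]
    mask = rng.random((popsize, n)) < 0.5
    children = np.where(mask, parents[:, 0], parents[:, 1])
    children += 0.1 * rng.standard_normal(children.shape)
    children = np.clip(children, 0, None)
    children[rng.random(children.shape) < 0.05] = 0.0  # drive weights to zero
    population = children

best = population[np.argmax([fitness(w) for w in population])]
print("active classifiers:", (best > 1e-3).sum(), "of", n)
```

In this sketch the random-zeroing mutation step is what produces exact zeros, so sparsity emerges during evolution rather than being imposed by post-hoc thresholding; the diversity reward meanwhile discourages the selection from collapsing onto near-identical trees.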
