Fast Cross-Validation for Kernel-Based Algorithms

Document type: Journal article (foreign)

First author: Liu, Yong

Authors: Liu, Yong; Lin, Hailun; Liao, Shizhong; Jiang, Shali; Ding, Lizhong; Wang, Weiping

Author affiliations:

Keywords: Approximation algorithms; Kernel; Training; Taylor series; Support vector machines; Upper bound; Computational modeling; Cross-validation; approximation; Bouligand influence function; model selection; kernel methods

Journal: IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE (Impact factor: 16.389; 5-year impact factor: 18.46)

ISSN: 0162-8828

Year/Volume/Issue: 2020, Vol. 42, Issue 5

Pages:

Indexed in: SCI

Abstract: Cross-validation (CV) is a widely adopted approach for selecting the optimal model. However, computing the empirical cross-validation error (CVE) is expensive because the learner must be trained multiple times. In this paper, we develop a novel approximation theory of CVE and present an approximate approach to CV based on the Bouligand influence function (BIF) for kernel-based algorithms. We first express the BIF and higher-order BIFs as Taylor expansions and approximate CV via these expansions. We then derive an upper bound on the discrepancy between the original and approximate CV. Furthermore, we provide a novel method for computing the BIF under a general distribution, and evaluate the BIF criterion on the sample distribution to approximate CV. The proposed approximate CV requires training on the full data set only once and is suitable for a wide variety of kernel-based algorithms. Experimental results demonstrate that the proposed approximate CV is sound and effective.
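To make the "train once, then approximate CV" idea concrete, the following is a minimal Python sketch for kernel ridge regression, assuming a Gaussian kernel and a first-order influence-style correction of the leave-one-out residuals. It is not the paper's BIF construction (which covers general kernel-based algorithms, higher-order terms, and K-fold CV); all function names, parameters, and the toy data are illustrative assumptions.

# Hypothetical sketch: influence-style "train once" approximation of
# leave-one-out CV for kernel ridge regression (KRR). This is NOT the
# paper's exact BIF method; it only illustrates replacing repeated
# retraining with an expansion around the full-sample solution.
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between rows of X and rows of Y."""
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def loo_exact(K, y, lam):
    """Exact LOO residuals for KRR by retraining n times (expensive)."""
    n = len(y)
    res = np.empty(n)
    for i in range(n):
        idx = np.delete(np.arange(n), i)
        alpha = np.linalg.solve(K[np.ix_(idx, idx)] + lam * np.eye(n - 1), y[idx])
        res[i] = y[i] - K[i, idx] @ alpha
    return res

def loo_first_order(K, y, lam):
    """Approximate LOO residuals from a single full-data fit.

    With hat matrix H = K (K + lam I)^{-1}, the exact KRR LOO residual
    is r_i / (1 - H_ii); the first-order expansion used here is
    r_i * (1 + H_ii), which needs only one training run on the full sample.
    """
    n = len(y)
    inv = np.linalg.inv(K + lam * np.eye(n))
    H = K @ inv
    r = y - H @ y                      # full-sample residuals
    return r * (1.0 + np.diag(H))      # first-order correction, no retraining

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
    K = rbf_kernel(X, X, gamma=0.5)
    lam = 0.1
    print("exact LOO MSE  :", np.mean(loo_exact(K, y, lam) ** 2))
    print("approx LOO MSE :", np.mean(loo_first_order(K, y, lam) ** 2))

The approximation needs a single fit on the full sample, whereas the exact loop retrains n times; the paper's BIF Taylor expansion plays an analogous role for general kernel-based learners, with an explicit bound on the gap to the original CV.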

Classification number:
