Theoretical and Experimental Evaluation of the Subspace Information Criterion
Author: Sugiyama M.
Publisher: Springer Publishing Company
ISSN: 0885-6125
Source: Machine Learning, Vol. 48, Iss. 1-3, 2002-07, pp. 25-50
Abstract
Recently, a new model selection criterion called the subspace information criterion (SIC) was proposed. SIC works well with small samples since it gives an unbiased estimate of the generalization error with finite samples. In this paper, we theoretically and experimentally evaluate the effectiveness of SIC in comparison with existing model selection techniques, including the traditional leave-one-out cross-validation (CV), Mallows's C_P, Akaike's information criterion (AIC), Sugiura's corrected AIC (cAIC), Schwarz's Bayesian information criterion (BIC), Rissanen's minimum description length criterion (MDL), and Vapnik's measure (VM). The theoretical evaluation covers the choice of generalization measure, the approximation method, and the restrictions on model candidates and learning methods. Experimentally, the performance of SIC is investigated in various situations. The simulations show that SIC outperforms the existing techniques, especially when the number of training examples is small and the noise variance is large.
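To make the comparison concrete, the following minimal sketch (not taken from the paper) illustrates two of the baseline criteria SIC is evaluated against, leave-one-out cross-validation and AIC, applied to polynomial regression model selection on synthetic data. The data, model family, and all function and variable names are illustrative assumptions; SIC itself, whose derivation is the subject of the paper, is not implemented here.

```python
# Illustrative sketch of two baseline model selection criteria (LOO-CV and AIC)
# for choosing the degree of a polynomial regression model. Assumptions:
# synthetic sine data with Gaussian noise, ordinary least squares fitting.
import numpy as np

rng = np.random.default_rng(0)
n = 25                                                  # small sample size
x = np.linspace(-1, 1, n)
y = np.sin(np.pi * x) + 0.3 * rng.standard_normal(n)    # noisy target values

def design(x, degree):
    # Polynomial design matrix with columns [1, x, x^2, ..., x^degree].
    return np.vander(x, degree + 1, increasing=True)

def loo_cv_error(X, y):
    # Closed-form leave-one-out squared error for least squares,
    # using the hat matrix H = X (X^T X)^{-1} X^T.
    H = X @ np.linalg.pinv(X)
    resid = y - H @ y
    return np.mean((resid / (1.0 - np.diag(H))) ** 2)

def aic(X, y):
    # AIC under a Gaussian noise model: n*log(RSS/n) + 2k (up to constants),
    # where k is the number of fitted coefficients.
    k = X.shape[1]
    m = len(y)
    resid = y - X @ np.linalg.pinv(X) @ y
    rss = resid @ resid
    return m * np.log(rss / m) + 2 * k

for degree in range(1, 9):
    X = design(x, degree)
    print(f"degree {degree}: LOO-CV = {loo_cv_error(X, y):.4f}, AIC = {aic(X, y):.2f}")
```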