On Prior Selection and Covariate Shift of β-Bayesian Prediction Under α-Divergence Risk

Author: Suzuki Taiji  

Publisher: Taylor & Francis Ltd

ISSN: 0361-0926

Source: Communications in Statistics - Theory and Methods, Vol. 39, Iss. 8-9, 2010, pp. 1655-1673



Abstract

We investigate the prior selection problem for predicting an input-output relation by a generalized Bayesian method, β-Bayes prediction. The β-Bayes predictive distribution is given by minimizing the Bayes risk corresponding to the α-divergence with parameter β, where the α-divergence is a generalization of the Kullback-Leibler divergence. It is known that the effect of the prior on the performance of the usual Bayesian predictive distribution, measured by the Kullback-Leibler divergence from the true distribution, is asymptotically characterized by the Laplacian. We show that the α-divergence between the β-Bayes predictive distribution for future outputs and the true output distribution has a similar characterization, even when α differs from β. We also investigate how the performance of the generalized Bayesian prediction behaves when the test and training input distributions differ.
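The α-divergence underlying the risk above admits several equivalent parameterizations; as a minimal numeric sketch, assuming Amari's convention for discrete distributions (the function name and test values below are illustrative, not from the paper), the family recovers the Kullback-Leibler divergence at the boundary values of α:

```python
import numpy as np

def alpha_divergence(p, q, alpha):
    """Amari alpha-divergence between discrete distributions p and q.

    D_alpha(p || q) = 4 / (1 - alpha^2) * (1 - sum_i p_i^{(1-alpha)/2} q_i^{(1+alpha)/2}),
    which recovers KL(p || q) as alpha -> -1 and KL(q || p) as alpha -> +1.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    a = float(alpha)
    if a == -1.0:  # KL(p || q) limit, handled in closed form
        return float(np.sum(p * np.log(p / q)))
    if a == 1.0:   # KL(q || p) limit
        return float(np.sum(q * np.log(q / p)))
    integral = np.sum(p ** ((1 - a) / 2) * q ** ((1 + a) / 2))
    return float(4.0 / (1.0 - a * a) * (1.0 - integral))

p = np.array([0.2, 0.5, 0.3])
q = np.array([0.3, 0.4, 0.3])
kl = float(np.sum(p * np.log(p / q)))
# As alpha approaches -1, the alpha-divergence approaches KL(p || q):
print(alpha_divergence(p, q, -0.999), kl)
```

Setting β in the definition of the β-Bayes predictive distribution and α in the evaluation risk to different members of this family is exactly the mismatch the abstract refers to.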