In this paper we bound the risk of the Gibbs and Bayes classifiers (GC and BC) when the prior is defined in terms of the data-generating distribution and the posterior is defined in terms of the observed one, as proposed by Catoni (2007). We approach this problem from two perspectives. On the one hand, we briefly review and further develop the classical PAC-Bayes analysis by refining the current state-of-the-art risk bounds. On the other hand, we propose a novel approach, based on the concept of Algorithmic Stability, which we call Distribution Stability (DS), and derive new risk bounds for the GC and BC based on the DS. Finally, we show that the data-dependent posterior distribution associated with the data-generating prior also has attractive and previously unknown properties.
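For readers unfamiliar with the two classifiers named in the abstract, the following is a minimal illustrative sketch (not the paper's method): the Gibbs classifier predicts by drawing a single hypothesis at random from the posterior, while the Bayes classifier takes a posterior-weighted majority vote. The toy posterior over threshold functions and the helper names (`gibbs_predict`, `bayes_predict`) are hypothetical, introduced only for this example.

```python
import random

def gibbs_predict(posterior, x, rng):
    """Gibbs classifier: draw one hypothesis h ~ posterior, predict h(x)."""
    hypotheses, weights = zip(*posterior)
    h = rng.choices(hypotheses, weights=weights, k=1)[0]
    return h(x)

def bayes_predict(posterior, x):
    """Bayes classifier: posterior-weighted majority vote (labels in {-1, +1})."""
    score = sum(w * h(x) for h, w in posterior)
    return 1 if score >= 0 else -1

# Toy posterior: three threshold classifiers on the real line, with weights.
posterior = [
    (lambda x: 1 if x > 0.0 else -1, 0.5),
    (lambda x: 1 if x > 1.0 else -1, 0.3),
    (lambda x: 1 if x > 2.0 else -1, 0.2),
]

rng = random.Random(0)
print(bayes_predict(posterior, 1.5))        # deterministic weighted vote
print(gibbs_predict(posterior, 1.5, rng))   # randomized: depends on the draw
```

The Gibbs classifier's risk is the posterior-average risk, which is what PAC-Bayes bounds control directly; the Bayes classifier's risk is then at most twice the Gibbs risk by the standard voting argument.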
|Title:||PAC-Bayesian analysis of distribution dependent priors: Tighter risk bounds and stability analysis|
|Publication date:||2016|
|Appears in types:||01.01 - Journal article|