
Paper No. 08-28

J. Q. Smith and A. Daneshkhah

On the Robustness of Bayesian Networks to Learning from Non-conjugate Sampling

Abstract: Under local DeRobertis separation measures, the distance between two posterior densities is the same as the distance between the corresponding prior densities. Like Kullback-Leibler separation, these measures are also additive under factorisation. These two properties allow us to prove that the precise specification of the prior will not be critical, with respect to the total variation distance between the posteriors, under the following conditions: (i) the genuine and approximating priors must be similarly rough; (ii) the approximating prior must have concentrated on a small ball on the margin of interest, not on the boundary of the probability space; and (iii) the approximating prior must have tails similar to, or fatter than, those of the genuine prior. Robustness then follows for all likelihoods, even misspecified ones. Furthermore, the total variation distances can be bounded explicitly by an easily calculated function of the prior local DeRobertis separation measures and simple summary statistics of the functioning posterior. In this paper we apply these results to study the robustness of prior specification when learning Bayesian networks.
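As a point of reference, the identity asserted in the abstract's first sentence can be written out explicitly. The following is a minimal sketch assuming the standard definition of the local DeRobertis separation over a set A; the symbols d^L_A, f, g, and the likelihood L(x | .) are illustrative notation, not necessarily the paper's own.

% Local DeRobertis separation of two densities f and g over a set A
% (assumed standard definition; notation illustrative).
\[
  d^{L}_{A}(f,g) \;=\; \sup_{\lambda,\mu\in A}
  \log\frac{f(\lambda)\,g(\mu)}{f(\mu)\,g(\lambda)}
\]
% Isoseparation: updating both densities with the same likelihood
% L(x | .) leaves the separation unchanged, because the likelihood
% terms and the normalising constants cancel in the ratio above:
\[
  d^{L}_{A}\bigl(f(\cdot\mid x),\,g(\cdot\mid x)\bigr)
  \;=\; d^{L}_{A}(f,g),
  \qquad\text{where } f(\theta\mid x)\propto L(x\mid\theta)\,f(\theta).
\]

This invariance is what lets a bound on the prior-to-prior separation control the posterior-to-posterior distance uniformly over likelihoods, including misspecified ones.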

Keywords: Bayesian networks, Bayesian robustness, isoseparation property, local DeRobertis separation measures, total variation distance.