
Paper No. 08-12


JQ Smith and A Daneshkhah

Large incomplete sample robustness in Bayesian networks

Date: May 2008

Abstract: Under local DeRobertis (LDR) separation measures, the posterior distance between two densities is the same as the distance between the corresponding prior densities. Like the Kullback-Leibler separation, LDR measures are also additive under factorization. These two properties allow us to prove that the precise specification of the prior is not critical, with respect to the variation distance on the posteriors, under the following conditions: the genuine and approximating priors are similarly rough; the approximating prior concentrates on a small ball on the margin of interest, not on the boundary of the probability space; and the approximating prior has tails similar to or fatter than those of the genuine prior. Robustness then follows for all likelihoods, even misspecified ones. Furthermore, the variation distances can be bounded explicitly by an easy-to-calculate function of the prior LDR separation and simple summary statistics of the functioning posterior. In this paper we apply these results to study the robustness of prior specification in learning Bayesian networks.
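A minimal sketch of the invariance property claimed in the first sentence, assuming the standard DeRobertis-Hartigan form of the separation (the abstract itself does not spell out the definition): for densities $f$ and $g$ and a subset $A$ of the parameter space, the local DeRobertis separation can be written as

\[ d_A(f, g) \;=\; \sup_{\theta, \phi \in A} \, \log \frac{f(\theta)\, g(\phi)}{f(\phi)\, g(\theta)} . \]

Updating both densities with the same likelihood $\ell$ gives posteriors $f^*(\theta) \propto f(\theta)\,\ell(\theta)$ and $g^*(\theta) \propto g(\theta)\,\ell(\theta)$. The likelihood terms and the normalising constants cancel in the ratio, so

\[ d_A(f^*, g^*) \;=\; \sup_{\theta, \phi \in A} \, \log \frac{f(\theta)\ell(\theta)\, g(\phi)\ell(\phi)}{f(\phi)\ell(\phi)\, g(\theta)\ell(\theta)} \;=\; d_A(f, g) , \]

for every likelihood, which is consistent with the abstract's claim that robustness holds even when the likelihood is misspecified.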