


CRiSM Seminar

Location: A1.01

Alastair Young, Imperial College London
Objective Bayes and Conditional Inference
In Bayesian parametric inference, in the absence of subjective prior information about the parameter of interest, it is natural to use an objective prior that leads to posterior probability quantiles which have, at least to some higher-order approximation in terms of the sample size, the correct frequentist interpretation. Such priors are termed probability matching priors. In many circumstances, however, the appropriate frequentist inference is a conditional one. The key contexts involve inference in multi-parameter exponential families, where conditioning eliminates the nuisance parameter, and models which admit ancillary statistics, where conditioning on the ancillary is indicated by the conditionality principle of inference.

In this talk, we consider conditions on the prior under which posterior quantiles have, to high order, the correct conditional frequentist interpretation. The key motivation for the work is that the conceptually simple objective Bayes route may provide accurate approximation to more complicated frequentist procedures. We focus on the exponential family context, where it turns out that the condition for higher-order conditional frequentist accuracy reduces to a condition on the model, not the prior: when the condition is satisfied, as it is in many key situations, any first-order probability matching prior (in the unconditional sense) automatically yields higher-order conditional probability matching.

We provide numerical illustrations, discuss the relationship between the objective Bayes inference and the parametric bootstrap, and give a brief account of the ancillary statistic context, where conditional frequentist probability matching is more difficult. [This is joint work with Tom DiCiccio, Cornell.]
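As a toy illustration of the probability matching idea (my sketch, not part of the talk): for the normal location model with known unit variance and a flat prior, the posterior is N(x̄, 1/n), and its upper α-quantile covers the true parameter with exact frequentist probability α, so the flat prior is probability matching in this simplest case. A minimal Monte Carlo check, using only the Python standard library:

```python
# Toy sketch (illustration only, not from the talk): exact probability matching
# in the simplest case.  For X_1..X_n ~ N(theta, 1) with a flat prior, the
# posterior is N(xbar, 1/n); its alpha-quantile covers the true theta with
# frequentist probability exactly alpha.
import math
import random
from statistics import NormalDist, fmean

def coverage_of_posterior_quantile(theta=2.0, n=10, alpha=0.95,
                                   reps=20000, seed=1):
    """Monte Carlo estimate of the frequentist coverage of the posterior
    alpha-quantile under a flat prior."""
    rng = random.Random(seed)
    z = NormalDist().inv_cdf(alpha)          # standard normal quantile
    hits = 0
    for _ in range(reps):
        xbar = fmean(rng.gauss(theta, 1.0) for _ in range(n))
        q = xbar + z / math.sqrt(n)          # posterior alpha-quantile, flat prior
        hits += theta <= q                   # quantile sits above the truth?
    return hits / reps

print(coverage_of_posterior_quantile())      # close to alpha = 0.95
```

In richer models (nuisance parameters, ancillaries) the matching is only approximate and, as the abstract notes, the relevant coverage is conditional, which is where the talk's conditions on the model come in.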
