The subjective prior distribution is often regarded as the primary source of uncertainty for the Bayesian. In this paper, we present a different perspective on Bayesian uncertainty. Given a finite sample $Y_{1:n}$ of size $n$ from an infinite population, the missing $Y_{n+1:\infty}$ is the source of statistical uncertainty, with the parameter of interest being known precisely given $Y_{1:\infty}$. We argue that the foundation of Bayesian inference is to assign a predictive distribution on $Y_{n+1:\infty}$ conditional on $Y_{1:n}$, which then induces a distribution on the parameter of interest. Demonstrating an application of martingales, Doob shows that choosing the Bayesian predictive distribution returns the conventional posterior as the distribution of the parameter. Taking this as our cue, we relax the predictive machine, avoiding the need for the predictive distribution to be derived solely from the usual prior-to-posterior-to-predictive density formula. We introduce the martingale posterior distribution, which returns Bayesian uncertainty directly on any statistic of interest without the need for a likelihood and prior; this distribution can be sampled through a computational scheme we name predictive resampling. To that end, we introduce new predictive methodologies for multivariate density estimation, regression and classification that build on recent work on bivariate copulas.
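
To make predictive resampling concrete, the sketch below is a minimal illustration, not the paper's copula-based methodology: it completes the observed sample by repeatedly drawing future observations from a simple Pólya-urn/empirical predictive (under which the running mean is a martingale), recomputes the statistic of interest on each completed sequence, and treats the spread of those draws as a martingale posterior for that statistic. All function and parameter names here are hypothetical.

```python
import numpy as np

def predictive_resample_mean(y, n_forward=5000, n_draws=1000, seed=0):
    """Martingale posterior for the population mean via predictive resampling.

    Assumed predictive rule (illustrative stand-in for the paper's copula
    updates): each future Y_{m+1} is drawn uniformly from all values seen so
    far, i.e. the empirical/Polya-urn predictive. The running mean is then a
    martingale, and its value on the completed sequence approximates the
    parameter known given Y_{1:infinity}.
    """
    rng = np.random.default_rng(seed)
    draws = np.empty(n_draws)
    for b in range(n_draws):
        sample = list(y)  # start from the observed Y_{1:n}
        for _ in range(n_forward):
            # draw Y_{m+1} from the current predictive P_m (uniform on the urn)
            sample.append(sample[rng.integers(len(sample))])
        # statistic of interest computed on the (approximately) completed population
        draws[b] = np.mean(sample)
    return draws

if __name__ == "__main__":
    y = np.random.default_rng(1).normal(loc=2.0, size=50)
    post = predictive_resample_mean(y)
    # spread of the draws is the martingale-posterior uncertainty for the mean
    print(post.mean(), np.quantile(post, [0.025, 0.975]))
```

With this particular empirical predictive the scheme reproduces Bayesian-bootstrap-style uncertainty; the point of the martingale posterior framework is that any predictive update satisfying the martingale condition, such as the copula-based updates developed in the paper, can be substituted for it.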