We introduce and study a novel model-selection strategy for Bayesian learning, based
on optimal transport, along with its associated predictive posterior law: the Wasserstein population
barycenter of the posterior law over models. We first show how this estimator, termed the Bayesian
Wasserstein barycenter (BWB), arises naturally in a general, parameter-free Bayesian model-selection
framework when the Bayesian risk under consideration is the Wasserstein distance. Examples are given, illustrating
how the BWB extends some classic parametric and non-parametric selection strategies. We then
provide explicit conditions guaranteeing the existence and statistical consistency of the BWB, and
discuss some of its general and specific properties, offering insights into its advantages over
usual choices such as the model-average estimator. Finally, we illustrate how this estimator can be
computed using the stochastic gradient descent (SGD) algorithm in Wasserstein space introduced in
a companion paper [7], and provide a numerical example validating the proposed method
experimentally.
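To fix ideas, the barycentric object at the heart of the abstract can be illustrated in the simplest one-dimensional Gaussian setting, where the 2-Wasserstein barycenter admits a closed form: for Gaussians $N(m_i, s_i^2)$ with weights $w_i$, the barycenter is again Gaussian, with mean $\sum_i w_i m_i$ and standard deviation $\sum_i w_i s_i$ (obtained by averaging quantile functions). The sketch below is a hypothetical toy computation of this closed form, not the SGD scheme of [7]; the function name `gaussian_w2_barycenter` is illustrative only.

```python
import numpy as np

def gaussian_w2_barycenter(means, stds, weights):
    """Closed-form 2-Wasserstein barycenter of 1-D Gaussians N(m_i, s_i^2).

    In one dimension the W2 barycenter is the distribution whose quantile
    function is the weighted average of the input quantile functions; for
    Gaussians this reduces to averaging means and standard deviations.
    """
    means = np.asarray(means, dtype=float)
    stds = np.asarray(stds, dtype=float)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()  # normalize weights to sum to 1
    bar_mean = float(np.dot(weights, means))
    bar_std = float(np.dot(weights, stds))
    return bar_mean, bar_std

# Example: two equally weighted Gaussian models N(0, 1) and N(2, 9);
# their W2 barycenter is N(1, 4).
m, s = gaussian_w2_barycenter([0.0, 2.0], [1.0, 3.0], [0.5, 0.5])
```

In the general (non-Gaussian, multi-dimensional, population) case no such closed form exists, which is precisely why the abstract appeals to a stochastic gradient descent algorithm in Wasserstein space.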