Optimization Monte Carlo: Efficient and Embarrassingly Parallel Likelihood-Free Inference

Neural Information Processing Systems (NIPS)

We describe an embarrassingly parallel, anytime Monte Carlo method for
likelihood-free models. The algorithm starts with the view that the stochasticity
of the pseudo-samples generated by the simulator can be controlled externally
by a vector of random numbers u, in such a way that the outcome, knowing u,
is deterministic. For each instantiation of u we run an optimization procedure to
minimize the distance between summary statistics of the simulator and the data.
After reweighting these samples using the prior and the Jacobian (accounting for
the change of volume in transforming from the space of summary statistics to the
space of parameters), we show that this weighted ensemble represents a Monte
Carlo estimate of the posterior distribution. The procedure can be run in an
embarrassingly parallel fashion (each node handling one sample) and as an anytime
algorithm (by allocating resources to the worst-performing sample). The procedure
is validated on six experiments.
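The procedure described above can be illustrated with a minimal sketch. This is not the paper's implementation: the toy simulator (`theta + u`), the standard-normal prior, and all function names are assumptions chosen so that the true posterior is known in closed form. The sketch shows the three steps from the abstract: freeze the random vector u, optimize the parameter to match the observed summary statistic, then weight by the prior and the inverse Jacobian.

```python
import numpy as np
from scipy.optimize import minimize

def simulator(theta, u):
    # Toy deterministic simulator: with the randomness u fixed externally,
    # the "summary statistic" of the pseudo-sample is theta + u.
    return theta + u

def omc_posterior_samples(y_obs, prior_logpdf, n_samples=200, seed=0):
    """Sketch of Optimization Monte Carlo for a 1-D parameter (assumed setup).

    For each fixed random draw u, minimize the squared distance between the
    simulator's summary statistic and the observed statistic y_obs, then
    weight the optimum by the prior and the inverse Jacobian |df/dtheta|^{-1},
    which accounts for the change of volume from statistic space to
    parameter space.
    """
    rng = np.random.default_rng(seed)
    thetas, weights = [], []
    for _ in range(n_samples):
        u = rng.normal()  # freeze the stochasticity for this sample
        obj = lambda th: (simulator(th[0], u) - y_obs) ** 2
        res = minimize(obj, x0=np.array([0.0]))
        theta_star = res.x[0]
        # Finite-difference Jacobian of the statistic w.r.t. theta
        # (here df/dtheta = 1 exactly, but we estimate it generically).
        eps = 1e-5
        jac = (simulator(theta_star + eps, u)
               - simulator(theta_star - eps, u)) / (2 * eps)
        w = np.exp(prior_logpdf(theta_star)) / max(abs(jac), 1e-12)
        thetas.append(theta_star)
        weights.append(w)
    weights = np.array(weights)
    return np.array(thetas), weights / weights.sum()

# Standard-normal prior; for y = theta + u with u ~ N(0, 1), the true
# posterior is N(y/2, 1/2), so the weighted mean should approach y/2.
prior_logpdf = lambda th: -0.5 * th ** 2 - 0.5 * np.log(2 * np.pi)
thetas, w = omc_posterior_samples(y_obs=1.0, prior_logpdf=prior_logpdf)
post_mean = float(np.sum(w * thetas))
```

Each loop iteration is independent given its u, which is what makes the method embarrassingly parallel: in a distributed setting each node would own one (or a batch of) u vectors and run its optimization without communication, and the anytime property comes from spending further optimization effort on the sample with the largest remaining distance.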