by H.E. Ogden

Biometrika (2017) 104 (1): 153-164.

Summary

Many statistical models have likelihoods which are intractable: it is impossible or too expensive to compute the likelihood exactly. In such settings, a common approach is to replace the likelihood with an approximation and proceed with inference as if the approximate likelihood were the true likelihood. In this paper, we describe conditions which guarantee that such naive inference with an approximate likelihood has the same first-order asymptotic properties as inference with the true likelihood. We investigate the implications of these results for inference using a Laplace approximation to the likelihood in a simple two-level latent variable model, determining the number of repeated observations on each item required for asymptotically correct inference, and for inference using reduced dependence approximations to the likelihood in an Ising model.