Bayesian Quasi-Likelihood

Estimation of nonlinear GMM models typically requires numerical optimization solvers. An alternative approach that circumvents such solvers treats the criterion function as a quasi-likelihood and traces out quasi-posterior distributions of the parameters with a Markov chain Monte Carlo (MCMC) method (Chernozhukov and Hong, 2003). Specifically, multiplying the criterion function shown in Generalized Method of Moments by $-\frac{1}{2}$ yields the quasi-likelihood; combined with a prior, it defines a quasi-posterior that can be sampled like an ordinary posterior.

MethodOfMoments.jl supports incorporating the quasi-likelihood evaluation in an MCMC sampler by implementing the LogDensityProblems.jl interface. This allows users to leverage readily available MCMC samplers from the Julia ecosystem without defining the quasi-likelihood functions from scratch. For example, packages such as AdvancedMH.jl, which implements Metropolis-Hastings algorithms, recognize the capability of MethodOfMoments.jl to evaluate the log-density.
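The LogDensityProblems.jl interface consists of three methods. As an illustration only (the actual implementation inside MethodOfMoments.jl may differ, and `ToyQuasiLikelihood` is a hypothetical stand-in for `BayesianGMM`), a custom quasi-likelihood could be exposed to samplers like this:

```julia
using LogDensityProblems, Distributions

struct ToyQuasiLikelihood  # hypothetical stand-in for BayesianGMM
    priors::Vector{Distribution}  # one prior per parameter
    Q::Function                   # GMM criterion θ -> Q(θ)
end

# Number of parameters the sampler should draw
LogDensityProblems.dimension(m::ToyQuasiLikelihood) = length(m.priors)

# Declare that only the log-density (order 0) is available, no gradient
LogDensityProblems.capabilities(::Type{<:ToyQuasiLikelihood}) =
    LogDensityProblems.LogDensityOrder{0}()

# Log-posterior = log-prior + quasi-likelihood (-Q(θ)/2)
function LogDensityProblems.logdensity(m::ToyQuasiLikelihood, θ)
    lp = sum(logpdf(d, x) for (d, x) in zip(m.priors, θ))
    return lp - m.Q(θ) / 2
end
```

Any object implementing these three methods can be handed directly to a LogDensityProblems-aware sampler such as those in AdvancedMH.jl.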

Example: Defining Log-Density for MCMC

We reuse the data and specifications from Example: Exponential Regression with Instruments.

# Assume objects from previous example are already defined
using Distributions
params = (:private=>Uniform(-1,2), :chronic=>Uniform(-1,2),
    :female=>Uniform(-1,2), :income=>Uniform(-1,2), :cons=>Normal())
m = BayesianGMM(vce, g, dg, params, 7, length(data))
BayesianGMM with 7 moments and 5 parameters:
  private = 6.92189e-310  chronic = 6.92189e-310  female = 6.92189e-310  income = 6.92189e-310  cons = 6.92189e-310
  log(posterior) = NaN

Above, we provide the name of each parameter and its prior distribution, using distributions defined in Distributions.jl. A BayesianGMM contains all the ingredients required for computing the log-posterior:

θ = [0.5, 1, 1, 0.1, 0.1]
logposterior!(m, θ)
-5.321068026621552
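The value returned is the sum of the log-priors evaluated at θ and the quasi-likelihood (the criterion multiplied by $-\frac{1}{2}$). A minimal sketch of this decomposition, assuming the priors defined above (the variable names here are illustrative, not part of the package API):

```julia
using Distributions

# Same priors and parameter values as above
priors = [Uniform(-1, 2), Uniform(-1, 2), Uniform(-1, 2), Uniform(-1, 2), Normal()]
θ = [0.5, 1, 1, 0.1, 0.1]

# Log-prior: four uniform densities of -log(3) plus the standard normal at 0.1
logprior = sum(logpdf(d, x) for (d, x) in zip(priors, θ))  # ≈ -5.3184

# logposterior!(m, θ) ≈ logprior - Q(θ)/2, where Q(θ) is the GMM criterion;
# the small gap between -5.3184 and -5.3211 is the quasi-likelihood term
```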

To run a Metropolis-Hastings sampler, we may proceed as follows:

using AdvancedMH, MCMCChains, LinearAlgebra

spl = MetropolisHastings(RandomWalkProposal{true}(MvNormal(zeros(5), 0.5*I(5))))
N = 10_000
chain = sample(m, spl, N, init_params=θ, param_names=m.params, chain_type=Chains)
Chains MCMC chain (10000×6×1 Array{Float64, 3}):

Iterations        = 1:1:10000
Number of chains  = 1
Samples per chain = 10000
parameters        = private, chronic, female, income, cons
internals         = lp

Summary Statistics
  parameters      mean       std      mcse   ess_bulk   ess_tail      rhat   e ⋯
      Symbol   Float64   Float64   Float64    Float64    Float64   Float64     ⋯

     private    0.5323    0.8566    0.0531   265.1415   491.3533    1.0019     ⋯
     chronic    0.5141    0.8472    0.0565   238.3058   426.5459    1.0047     ⋯
      female    0.3885    0.8575    0.0603   205.3472   372.6243    1.0132     ⋯
      income    0.7730    0.9024    0.0737   198.0124   518.6954    1.0008     ⋯
        cons   -0.2151    0.9573    0.0706   184.5890   260.8145    1.0122     ⋯
                                                                1 column omitted

Quantiles
  parameters      2.5%     25.0%     50.0%     75.0%     97.5%
      Symbol   Float64   Float64   Float64   Float64   Float64

     private   -0.9485   -0.2283    0.5559    1.2898    1.9213
     chronic   -0.9207   -0.1520    0.5104    1.2146    1.9139
      female   -0.9650   -0.3514    0.3672    1.1316    1.8668
      income   -0.4480   -0.1052    1.2672    1.6369    1.9587
        cons   -1.9530   -0.9298   -0.2247    0.4537    1.6360
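In the quasi-Bayesian framework, point estimates are typically taken as the quasi-posterior means or medians (Chernozhukov and Hong, 2003). These can be computed from the sampled chain; a sketch, assuming the `chain` object produced above:

```julia
using Statistics

# Extract the posterior draws as a 10000×5 matrix (internals such as lp excluded)
draws = Array(chain)

θ̂_mean = vec(mean(draws; dims=1))    # quasi-posterior means
θ̂_med  = vec(median(draws; dims=1))  # quasi-posterior medians
```

The means match the `mean` column of the summary statistics printed above.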