Bayesian Quasi-Likelihood
Estimating nonlinear GMM models requires numerical optimization solvers. An alternative approach that circumvents such solvers treats the GMM criterion function as a quasi-likelihood and traces out quasi-posterior distributions of the parameters with a Markov chain Monte Carlo (MCMC) method (Chernozhukov and Hong, 2003). Specifically, we multiply the criterion function shown in Generalized Method of Moments by $-\frac{1}{2}$ to obtain the quasi-log-likelihood.
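Concretely, writing the GMM criterion as $Q_n(\theta)$ (its exact form is given in Generalized Method of Moments; the version below is a generic sketch with sample moments $\bar{g}_n(\theta)$ and a weighting matrix $W$), the quasi-log-likelihood and the resulting quasi-posterior are:

```latex
% GMM criterion built from sample moments \bar{g}_n(\theta) and weighting matrix W
Q_n(\theta) = n\,\bar{g}_n(\theta)'\,W\,\bar{g}_n(\theta)
% quasi-log-likelihood: scale the criterion by -1/2
L_n(\theta) = -\tfrac{1}{2}\,Q_n(\theta)
% quasi-posterior: combine exp(L_n) with the prior \pi(\theta)
p_n(\theta) \propto \exp\{L_n(\theta)\}\,\pi(\theta)
```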
MethodOfMoments.jl supports incorporating the quasi-likelihood evaluation in an MCMC sampler by implementing the LogDensityProblems.jl interface. This allows users to leverage readily available MCMC samplers from the Julia ecosystem without needing to define the quasi-likelihood functions from scratch. For example, packages such as AdvancedMH.jl, which implements Metropolis-Hastings algorithms, recognize that MethodOfMoments.jl is capable of evaluating the log-density.
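The LogDensityProblems.jl interface is small; the sketch below shows what implementing it looks like. The three `LogDensityProblems` methods are the real interface, but `QuasiPosterior` and its fields are illustrative stand-ins, not the actual MethodOfMoments.jl internals:

```julia
using LogDensityProblems

# Illustrative wrapper around a log quasi-posterior evaluator
struct QuasiPosterior{F}
    logposterior::F  # θ -> value of the log quasi-posterior at θ
    dim::Int         # number of parameters
end

# Evaluate the log-density at a parameter vector θ
LogDensityProblems.logdensity(m::QuasiPosterior, θ) = m.logposterior(θ)

# Report the dimension of the parameter space
LogDensityProblems.dimension(m::QuasiPosterior) = m.dim

# Declare that only the log-density itself (order 0), not its gradient, is available
LogDensityProblems.capabilities(::Type{<:QuasiPosterior}) =
    LogDensityProblems.LogDensityOrder{0}()
```

Any sampler that accepts a LogDensityProblems-compatible model (such as those in AdvancedMH.jl) can then draw from this density without knowing how it is computed internally.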
Example: Defining Log-Density for MCMC
We reuse the data and specifications from Example: Exponential Regression with Instruments.
# Assume objects from previous example are already defined
using Distributions
params = (:private=>Uniform(-1,2), :chronic=>Uniform(-1,2),
    :female=>Uniform(-1,2), :income=>Uniform(-1,2), :cons=>Normal())
m = BayesianGMM(vce, g, dg, params, 7, length(data))
BayesianGMM with 7 moments and 5 parameters:
private = 6.89930e-310 chronic = 6.89930e-310 female = 6.89932e-310 income = 6.89934e-310 cons = 2.03883e-312
log(posterior) = NaN
Above, we provide the name of each parameter along with its prior distribution, specified using distributions from Distributions.jl. A BayesianGMM
contains the ingredients required for computing the log-posterior:
θ = [0.5, 1, 1, 0.1, 0.1]
logposterior!(m, θ)
-5.321068026621552
To run a Metropolis-Hastings sampler, we may proceed as follows:
using AdvancedMH, MCMCChains, LinearAlgebra
spl = MetropolisHastings(RandomWalkProposal{true}(MvNormal(zeros(5), 0.5*I(5))))
N = 10_000
chain = sample(m, spl, N, init_params=θ, param_names=m.params, chain_type=Chains)
Chains MCMC chain (10000×6×1 Array{Float64, 3}):
Iterations = 1:1:10000
Number of chains = 1
Samples per chain = 10000
parameters = private, chronic, female, income, cons
internals = lp
Summary Statistics
  parameters      mean       std      mcse   ess_bulk   ess_tail      rhat   e ⋯
      Symbol   Float64   Float64   Float64    Float64    Float64   Float64     ⋯

     private    0.6135    0.8495    0.0663   167.1937   403.9743    1.0069     ⋯
     chronic    0.4956    0.8684    0.0628   190.7524   397.5548    1.0064     ⋯
      female    0.3998    0.8350    0.0520   267.8485   455.8833    1.0171     ⋯
      income    0.9528    0.8707    0.0739   177.5587   550.4496    1.0038     ⋯
        cons   -0.0210    0.9606    0.0740   169.8596   240.1752    1.0068     ⋯
                                                                 1 column omitted
Quantiles
  parameters      2.5%     25.0%     50.0%     75.0%     97.5%
      Symbol   Float64   Float64   Float64   Float64   Float64

     private   -0.9017   -0.0677    0.6303    1.3315    1.9358
     chronic   -0.9257   -0.2675    0.4764    1.2676    1.9092
      female   -0.9229   -0.3235    0.3384    1.1113    1.9074
      income   -0.3735   -0.0080    1.3822    1.6965    1.9820
        cons   -1.9312   -0.6767   -0.0535    0.6626    1.8920