I am trying to implement a Bayesian version of melogit. The [ME] manual uses towerlondon.dta as a running example.
The results are problematic: convergence is not achieved, the dreaded "high autocorrelation at 500 lags" warning appears, and the estimates are nowhere near those from melogit. Note that I used blocking in the priors, but I could only use as much Gibbs sampling as is shown in Example 21 of [BAYES] bayesmh: my attempts to add `, gibbs` to the blocks for the outcome-variable coefficients failed with an error message.
Is there anything I can do on my side to increase the efficiency of the sampler? I don't think this is an issue of the number of parameters in the block (there are 200-something unique subjects), as it fails even with the two levels of i.group.
Code:
use http://www.stata-press.com/data/r14/towerlondon.dta, clear
melogit dtlm i.group || subject:
fvset base none subject
bayesmh dtlm i.group i.subject, likelihood(logit) noconstant     ///
    prior({dtlm:i.subject}, normal({dtlm:_cons},{var_subj}))     ///
    prior({dtlm:_cons},   normal(0, 100))                        ///
    prior({dtlm:i.group}, normal(0, 100))                        ///
    prior({var_subj}, igamma(0.001, 0.001))                      ///
    mcmcsize(5000)                                               ///
    block({var_subj}, gibbs)                                     ///
    block({dtlm:i.subj})                                         ///
    block({dtlm:i.group})                                        ///
    block({dtlm:_cons}, gibbs)                                   ///
    dots
Code:
Gibbs sampling not supported;
Option gibbs not allowed in block(dtlm:1bn.subj ...) for the specified prior
distribution of the parameters.
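For context, my understanding is that bayesmh only allows `, gibbs` on a block when the prior-likelihood pair is (semi)conjugate; with a logit likelihood, normal priors on the coefficients admit no full conditional in closed form, which is presumably why only the {var_subj} block (inverse-gamma prior over normally distributed effects) accepts Gibbs. One thing I have not yet tried is the `split` suboption of block(), which samples each subject-level parameter in its own Metropolis-Hastings step and is sometimes suggested for mixing problems. A sketch of that variant (same model and priors as above, only the blocking changed):

```stata
bayesmh dtlm i.group i.subject, likelihood(logit) noconstant     ///
    prior({dtlm:i.subject}, normal({dtlm:_cons},{var_subj}))     ///
    prior({dtlm:_cons},   normal(0, 100))                        ///
    prior({dtlm:i.group}, normal(0, 100))                        ///
    prior({var_subj}, igamma(0.001, 0.001))                      ///
    mcmcsize(5000)                                               ///
    block({var_subj}, gibbs)                                     ///
    block({dtlm:i.subject}, split)                               ///
    block({dtlm:i.group})                                        ///
    block({dtlm:_cons})                                          ///
    dots
```

I am not sure whether `split` over ~200 subject parameters is computationally feasible here, so treat this as an untested variation rather than a fix.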