Hey all,

I am working on implementing a random-effects zero-inflated Poisson (ZIP) model in Mata. The model includes two random effects: one in the logit part and another in the Poisson part. Could anyone give me an example of how to do a double integration in Mata?

For example, the log of my cluster-level likelihood integrand is computed as follows.

real colvector hiu1u2(real colvector u1, real colvector u2,
                      real colvector Ai, real colvector Bi,
                      real colvector zi, real colvector yi,
                      real scalar s1, real scalar s2, real scalar t)
{
    real scalar    n1, n2, i, j
    real colvector r, eta, zeta, lambda, p0

    n1 = rows(u1)
    n2 = rows(u2)
    r  = J(n1*n2, 1, 0)

    for (i=1; i<=n1; i++) {
        eta = Ai :+ s1*u1[i]                     // logit-part linear predictor
        for (j=1; j<=n2; j++) {
            zeta   = Bi :+ s2*u2[j]              // Poisson-part linear predictor
            lambda = exp(zeta)
            // observation-level log-likelihood contributions:
            // zeros mix the logit and Poisson parts; positives are Poisson,
            // and both are normalized by -log(1+exp(eta))
            p0 = zi:*log(exp(eta) + exp(-lambda)) +
                 (1:-zi):*(-lambda + yi:*zeta - lngamma(yi:+1)) -
                 log(1 :+ exp(eta))
            // log integrand at node pair (u1[i], u2[j]); note (i-1)*n2+j
            r[(i-1)*n2+j] = sum(p0) - .5*u1[i]^2 - .5*u2[j]^2
        }
    }
    return(r)
}
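If I am reading the code right, r holds the log of the integrand, so the quantity to approximate for each cluster is

\[
L_i \;=\; \frac{1}{2\pi}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}
\exp\!\Bigl\{\ell_i(u_1,u_2) \;-\; \tfrac12 u_1^2 \;-\; \tfrac12 u_2^2\Bigr\}\, du_1\, du_2 ,
\]

where \(\ell_i(u_1,u_2)\) is the sum of the observation-level terms in p0. Written this way it is a two-dimensional integral against independent standard-normal densities, which is exactly the form Gauss-Hermite quadrature handles.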

I just don't know how to carry out the double integration over u1 and u2. I tried Quadrature(), but had trouble specifying upper and lower bounds for u1 and u2 simultaneously.
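One way to sidestep the bounds problem entirely is a product Gauss-Hermite rule: the nodes and weights already account for the infinite limits, so no bounds are ever specified. Here is a minimal sketch in Python (using NumPy's hermgauss for the nodes and weights; the function name gh_expect_2d and the test function are my own illustrations, not anything from your code):

```python
# Minimal sketch: two-dimensional Gauss-Hermite quadrature by a product
# rule, i.e. nesting two one-dimensional rules over u1 and u2.
import math
import numpy as np

def gh_expect_2d(h, n=20):
    """Approximate E[h(u1, u2)] for independent u1, u2 ~ N(0, 1).

    Gauss-Hermite nodes/weights target integrals of the form
    int exp(-x^2) f(x) dx, so substitute u = sqrt(2)*x and divide by
    sqrt(pi) once per dimension.
    """
    x, w = np.polynomial.hermite.hermgauss(n)   # 1-D nodes and weights
    total = 0.0
    for i in range(n):                          # outer rule over u1
        for j in range(n):                      # inner rule over u2
            total += w[i] * w[j] * h(math.sqrt(2)*x[i], math.sqrt(2)*x[j])
    return total / math.pi                      # (1/sqrt(pi)) per dimension

# Sanity check against a closed form:
# E[exp(a*u1 + b*u2)] = exp((a^2 + b^2)/2) for standard-normal u1, u2.
a, b = 0.5, 0.3
approx = gh_expect_2d(lambda u1, u2: math.exp(a*u1 + b*u2))
exact = math.exp((a**2 + b**2) / 2)
```

The same structure carries over to Mata: precompute (or hard-code) the node and weight vectors once, pass the nodes in as u1 and u2, and accumulate the weighted exponentiated log-integrand over the two nested loops. Because the weights absorb the Gaussian tails, there are no upper or lower bounds to supply.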

Any suggestions would be highly appreciated!

Thank you very much!


## Comment