  • Stata and LIMDEP yield very different results

    I estimate a trivariate probit model in Stata using the user-written mvprobit command. The results (both coefficients and standard errors) are very different from those obtained using LIMDEP. The difference is shocking. Does anyone know why this might be the case?

  • #2
    It is unlikely that anyone will be able to answer this question without your showing us the actual commands you used and the results you got from both programs. Please put those in a code block (see FAQ#12, 7th paragraph for instructions on how to create a code block) so they will be easily readable. Also, make sure you show exactly the commands and outputs by copying from the Results window or your log and pasting into that code block--small details are often crucial, and retyping can hide the source of the problem or create red herrings.



    • #3
      Welcome to Statalist. Please READ THE FAQ BEFORE POSTING AND ACT ON IT! (Hit the black bar at the top of the page.) Yes, I know that using upper case letters means "shouting"; I am! One of the main lessons of the FAQ for new users is that you will maximize the chances of getting helpful answers if you provide full and detailed information, and it tells you not to make assumptions about your audience. Etc. Etc.

      You have not told us the provenance of mvprobit. You have not told us precisely what the estimators are that LIMDEP and mvprobit are using. You have not shown us precisely what commands you gave to LIMDEP and Stata and the output you got back in each case. (Read the FAQ on how to use CODE delimiters to do this.)

      I think both implementations are using Maximum Simulated Likelihood. But is each using pseudo-normal or Halton draws for the simulation? How many draws are being used, and so on? The default number of draws in mvprobit is deliberately set small (too small for real-world applications), but the associated help file and Stata Journal article (for which you are going to provide the full citation) remind users to use the relevant option to set a number of draws appropriate to the context (typically at least as large as the square root of the sample size).
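
      For concreteness, a sketch of making those simulation settings explicit in Stata (the equation lists below are placeholders, not a recommendation for this model; draw counts follow the square-root rule of thumb):

      Code:
      * Sketch: fix the random-number seed and set the number of draws
      * explicitly, rather than relying on defaults that may differ
      * across programs. Rule of thumb: draws at least sqrt(N),
      * e.g. sqrt(650) is about 26.
      set seed 1003
      mvprobit (y1 x1 x2 x3) (y2 x1 x4) (y3 x1 x5), dr(200) nolog

      Unless the seed, the number of draws, and the type of draws match across programs (and even then, since the pseudo-random sequences themselves differ), simulated-likelihood estimates will not agree exactly.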

      In short, I suspect that you are not comparing like with like. What I do know is that you have not provided sufficient information for me or anyone else to provide a more informed response.



      • #4
        Advice well taken; I will read the FAQ in detail and respond appropriately. Thanks.



        • #5
          I provide the commands used for estimating the models, as well as the results obtained, below. The sample size is 650 and I used 200 draws in both programs (far larger than the rule-of-thumb number of 26, the square root of the sample size). LIMDEP has an option for specifying either a pseudo-random number generator or other sequences (e.g. Halton), but here I use the default, which is pseudo-normal. The basic specification of the estimated model is:

          y1 = f1(y2, y3, x1, x2, x3)
          y2 = f2(x3, x5, x6)
          y3 = f3(x4, x7, x8)
          ------------------ Stata Command ----------------------------
          Code:
          set seed 1003
          mvprobit (y1 y2 y3 x1 x2 x3) (y2 x2 x3 x4 x5) (y3 x2 x3 x4 x6), cl(vid) dr(200) nolog



          ------------------ Stata Results ----------------------------

          Code:
          Multivariate probit (MSL, # draws = 200)          Number of obs   =        650
                                                            Wald chi2(13)   =      28.52
          Log pseudolikelihood = -1244.4935                 Prob > chi2     =     0.0076
          
                                             (Std. Err. adjusted for 66 clusters in vid)
          ------------------------------------------------------------------------------
                       |               Robust
                       |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
          -------------+----------------------------------------------------------------
          y1           |
                    y2 |   .0255929   3.244752     0.01   0.994    -6.334004     6.38519
                    y3 |   .0856454   1.222501     0.07   0.944    -2.310413    2.481704
                    x1 |  -.0070944   .0039819    -1.78   0.075    -.0148989      .00071
                    x2 |   .4653447   .1979839     2.35   0.019     .0773033    .8533861
                    x3 |   .0155855   .0106737     1.46   0.144    -.0053345    .0365056
                 _cons |    .013977   .9984963     0.01   0.989     -1.94304    1.970994
          -------------+----------------------------------------------------------------
          y2           |
                    x2 |    .180146   .0825457     2.18   0.029     .0183593    .3419326
                    x3 |  -.0028735   .0070531    -0.41   0.684    -.0166972    .0109503
                    x4 |   .0407072   .3036945     0.13   0.893     -.554523    .6359375
                    x5 |   .0512205   .2306407     0.22   0.824    -.4008268    .5032679
                 _cons |  -.3542921   .1160709    -3.05   0.002    -.5817869   -.1267972
          -------------+----------------------------------------------------------------
          y3           |
                    x2 |   .1742454   .0837303     2.08   0.037     .0101369    .3383538
                    x3 |   .0030403    .005963     0.51   0.610    -.0086469    .0147274
                    x4 |   .1718975   .1390814     1.24   0.216    -.1006969    .4444919
                    x6 |   .4446297   .2315665     1.92   0.055    -.0092323    .8984917
                 _cons |  -.5539481   .1171162    -4.73   0.000    -.7834916   -.3244046
          -------------+----------------------------------------------------------------
              /atrho21 |  -.0125052   1.836049    -0.01   0.995    -3.611096    3.586086
          -------------+----------------------------------------------------------------
              /atrho31 |  -.1052502   .4538494    -0.23   0.817    -.9947786    .7842782
          -------------+----------------------------------------------------------------
              /atrho32 |   .4215393   .0826255     5.10   0.000     .2595963    .5834823
          -------------+----------------------------------------------------------------
                 rho21 |  -.0125045   1.835762    -0.01   0.995    -.9985407    .9984659
          -------------+----------------------------------------------------------------
                 rho31 |  -.1048633   .4488587    -0.23   0.815    -.7593926    .6551554
          -------------+----------------------------------------------------------------
                 rho32 |   .3982264   .0695224     5.73   0.000     .2539179    .5251918
          ------------------------------------------------------------------------------
          Likelihood ratio test of  rho21 = rho31 = rho32 = 0:  
                       chi2(3) =  39.4145   Prob > chi2 = 0.0000

          ------------------ LIMDEP Command ----------------------------
          Code:
          Sample    ;all $
          Skip $
          Mprobit    ;Lhs=y1,y2,y3
                  ;eq1=y2,y3,x1,x2,x3,one
                  ;eq2=x1,x2,x3,x4,x5,one
                  ;eq3=x2,x3,x4,x6,one                
                  ;pts=200;cluster=vid $


          ------------------ LIMDEP Results ----------------------------
          Code:
          Normal exit from iterations. Exit status=0.
          
          +---------------------------------------------+
          | Multivariate Probit Model:  3 equations.    |
          | Maximum Likelihood Estimates                |
          | Model estimated: Mar 17, 2016 at 00:29:05AM.|
          | Dependent variable             MVProbit     |
          | Weighting variable                 None     |
          | Number of observations              650     |
          | Iterations completed                 33     |
          | Log likelihood function       -1242.182     |
          | Number of parameters                 20     |
          | Info. Criterion: AIC =          3.88364     |
          |   Finite Sample: AIC =          3.88569     |
          | Info. Criterion: BIC =          4.02139     |
          | Info. Criterion:HQIC =          3.93707     |
          | Replications for simulated probs. = 200     |
          +---------------------------------------------+
          +---------------------------------------------------------------------+
          | Covariance matrix for the model is adjusted for data clustering.    |
          | Sample of    650 observations contained     66 clusters defined by  |
          | variable VID      which identifies by a value a cluster ID.         |
          | Sample of    650 observations contained      1 strata defined by    |
          |    650 observations (fixed number) in each stratum.                 |
          +---------------------------------------------------------------------+
          +--------+--------------+----------------+--------+--------+----------+
          |Variable| Coefficient  | Standard Error |b/St.Er.|P[|Z|>z]| Mean of X|
          +--------+--------------+----------------+--------+--------+----------+
          ---------+Index function for Y1
           Y2      |     .84344025      5.74939391      .147   .8834    .38000000
           Y3      |     .42013852      3.25958960      .129   .8974    .35538462
           X1      |    -.00806300       .00860756     -.937   .3489   46.6107692
           X2      |     .31793308       .71794537      .443   .6579    .34307692
           X3      |     .01468727       .00840065     1.748   .0804   11.2224103
           Constant|    -.31411643       .77762634     -.404   .6863
          ---------+Index function for Y2
           X1      |     .00779023       .00104284     7.470   .0000   46.6107692
           X2      |     .16831606       .00775319    21.709   .0000    .34307692
           X3      |    -.00663025       .00072659    -9.125   .0000   11.2224103
           X4      |     .14847431       .14394530     1.031   .3023    .28923077
           X5      |     .04544672       .05689766      .799   .4244    .11538462
           Constant|    -.70426701       .08723116    -8.074   .0000
          ---------+Index function for Y3
           X2      |     .17303656       .01161279    14.901   .0000    .34307692
           X3      |     .00272569       .00050864     5.359   .0000   11.2224103
           X4      |     .19051601       .05574615     3.418   .0006    .28923077
           X6      |     .42854019       .04664493     9.187   .0000    .07692308
           Constant|    -.55390573       .03198934   -17.315   .0000
          ---------+Correlation coefficients
           R(01,02)|    -.58312487      3.20681595     -.182   .8557
           R(01,03)|    -.44115615      1.23356392     -.358   .7206
           R(02,03)|     .40497242       .00399374   101.402   .0000



          • #6
            Here is the data:

            Code:
            * Example generated by -dataex-. To install: ssc install dataex
            clear
            input long id int vid byte(y1 y2 y3 x1 x2) float x3 byte(x4 x5 x6)
            1010101 10101 1 1 1  28 0     5 1 0 0
            1010102 10101 1 0 0  33 0     6 0 1 0
            1010103 10101 1 1 1  61 2    25 0 0 0
            1010104 10101 1 0 0  44 1    15 0 0 0
            1010105 10101 1 1 1  44 2  24.5 0 0 0
            1010106 10101 0 0 1  50 0    12 0 0 0
            1010107 10101 1 1 1  69 3    20 0 0 0
            1010108 10101 1 0 0  59 2  16.5 0 0 0
            1010109 10101 1 1 1  44 2    10 0 0 1
            1010110 10101 1 0 0  59 3    25 0 0 1
            1010201 10102 1 0 0  39 0     6 0 0 0
            1010202 10102 1 0 1  36 0 11.33 1 0 0
            1010203 10102 1 0 0  44 0 10.67 0 0 0
            1010204 10102 1 1 1  23 2  7.67 1 0 0
            1010205 10102 1 0 1  74 2    15 0 0 0
            1010206 10102 1 0 0  33 2  7.67 1 0 0
            1010207 10102 1 0 1  84 0    10 1 0 0
            1010208 10102 1 0 0  40 1     5 0 1 0
            1010209 10102 1 0 0  74 0   6.5 0 0 1
            1010210 10102 1 1 1  29 0    10 0 1 0
            1010301 10103 1 1 0  59 1    24 0 1 0
            1010302 10103 1 1 0  32 0    10 1 0 0
            1010303 10103 1 1 0  38 0    18 0 0 0
            1010304 10103 1 0 0  34 1  12.5 0 0 0
            1010305 10103 1 0 0  54 1    10 1 0 0
            1010306 10103 1 0 0  25 1     9 0 0 0
            1010307 10103 1 0 0  65 0    11 1 0 0
            1010308 10103 1 0 0  31 1  12.5 1 0 0
            1010309 10103 1 0 0  47 0    13 0 0 0
            1010310 10103 1 1 0  29 0  17.5 0 0 0
            1010401 10104 1 1 1  54 1    21 0 1 0
            1010402 10104 1 1 1  80 3    15 1 0 0
            1010403 10104 1 0 0  70 1    15 0 0 0
            1010404 10104 1 0 0  85 1    19 0 0 1
            1010405 10104 1 1 1  50 1     9 0 0 0
            1010406 10104 1 1 1  57 0    18 0 0 0
            1010407 10104 1 1 1  55 1     6 0 0 0
            1010408 10104 0 1 1  69 0   5.5 0 0 1
            1010409 10104 1 1 1  56 1    10 0 0 1
            1010410 10104 0 1 1  69 0     5 0 0 1
            1020101 10201 1 0 0  25 0     3 0 0 0
            1020102 10201 0 0 1  46 0     6 0 0 0
            1020103 10201 0 1 0  37 0   7.5 0 0 0
            1020104 10201 1 0 0  62 0     6 0 0 0
            1020105 10201 0 1 0  73 0     3 0 0 0
            1020106 10201 1 0 0  24 0     3 1 0 0
            1020107 10201 0 0 0  36 0     3 1 0 0
            1020108 10201 0 1 0  62 0     5 0 0 0
            1020109 10201 0 1 0  48 0     5 0 0 0
            1020110 10201 0 0 0  38 0     4 1 0 0
            1020201 10202 1 0 1  46 0    20 1 0 0
            1020202 10202 0 0 0  48 0   7.8 0 0 0
            1020203 10202 1 1 0  50 0     5 1 0 0
            1020204 10202 1 0 0  42 0    10 0 0 0
            1020205 10202 1 1 0  32 0     5 0 0 0
            1020206 10202 1 0 0  76 0    25 1 0 0
            1020207 10202 1 0 0  56 0    20 0 0 0
            1020208 10202 0 0 0  35 0     5 0 0 0
            1020209 10202 1 1 0  32 0  11.5 0 0 0
            1020210 10202 0 0 0  65 0     4 0 0 0
            1020301 10203 0 0 0  80 0    10 0 0 0
            1020302 10203 0 0 0  75 0   6.5 0 0 0
            1020303 10203 0 0 1  82 0    10 0 0 0
            1020304 10203 0 0 0  50 0     8 0 0 0
            1020305 10203 0 0 0  57 0     4 0 0 0
            1020306 10203 0 0 0  60 0    10 0 0 0
            1020307 10203 0 0 0  90 0     4 0 0 0
            1020308 10203 0 0 0  50 0     1 0 0 0
            1020309 10203 1 1 0  50 0     4 0 0 0
            1020310 10203 0 1 0  54 0     4 0 0 0
            1020401 10204 0 0 1  40 0    11 0 0 0
            1020402 10204 0 1 0  50 0     5 1 0 0
            1020403 10204 1 1 0  30 0  11.5 1 0 0
            1020404 10204 0 0 0  39 0     6 0 0 0
            1020405 10204 1 0 0  84 0    15 0 0 0
            1020406 10204 1 0 0  72 0     4 1 0 0
            1020407 10204 0 0 0 100 0    15 0 0 0
            1020408 10204 0 0 0  42 0     5 0 0 0
            1020409 10204 0 0 1  32 0     5 1 0 0
            1020410 10204 0 0 0  33 0    10 1 0 0
            1030101 10301 1 0 0  37 0    15 0 0 0
            1030102 10301 0 0 0  34 0     5 1 0 0
            1030103 10301 1 0 0  34 0    10 0 0 0
            1030104 10301 1 0 0  34 0     7 1 0 0
            1030105 10301 1 0 0  29 0     9 1 0 0
            1030106 10301 1 0 0  31 0  14.5 1 1 0
            1030107 10301 1 0 0  40 0    26 1 0 0
            1030108 10301 1 0 1  47 0    25 0 0 0
            1030109 10301 1 0 0  50 0    22 0 0 0
            1030110 10301 1 0 1  33 0   7.5 0 0 0
            1030301 10303 0 0 0  65 0 12.67 0 0 0
            1030302 10303 1 0 0  30 0    10 1 0 0
            1030303 10303 1 0 0  23 0     5 1 0 0
            1030304 10303 1 0 0  20 0   4.5 0 0 0
            1030305 10303 1 0 0  29 0 12.67 0 1 0
            1030306 10303 1 0 0  25 0     4 1 0 0
            1030307 10303 1 0 0  54 0     3 0 0 0
            1030308 10303 1 0 0  57 0  7.33 0 0 0
            1030309 10303 0 0 0  42 0     3 0 0 0
            1030310 10303 1 0 0  39 0     3 0 0 0
            1030401 10304 1 1 1  37 0    15 0 0 0
            1030402 10304 0 0 0  29 0    10 0 0 0
            1030403 10304 1 0 0  36 0 10.67 0 0 0
            1030404 10304 0 0 0  49 0     5 0 0 0
            1030405 10304 1 0 0  24 0     7 0 0 0
            1030406 10304 1 0 0  28 0    10 0 0 0
            1030407 10304 1 0 1  23 0     4 0 1 0
            1030408 10304 1 0 0  24 0  3.67 1 0 0
            1030409 10304 1 0 1  29 0    12 0 0 0
            1030410 10304 1 0 0  45 0    10 0 0 0
            1030501 10305 0 0 0  35 3     5 0 1 0
            1030502 10305 1 0 1  26 1  3.33 1 0 0
            1030503 10305 0 1 1  74 3    35 0 0 0
            1030504 10305 1 0 1  36 1     7 1 0 0
            1030505 10305 1 0 0  39 3     3 0 0 0
            1030506 10305 1 0 0  40 0     9 1 0 0
            1030507 10305 0 0 0  32 3     7 1 0 0
            1030508 10305 1 0 1  31 0     7 1 0 0
            1030509 10305 0 0 0  34 3    15 0 0 0
            1030510 10305 1 0 1  26 0     5 1 0 0
            1030601 10306 1 0 1  24 0     6 1 0 0
            1030602 10306 1 0 0  41 1     9 1 0 0
            1030603 10306 0 0 0  62 0  29.5 0 0 0
            1030604 10306 1 0 0  65 0  12.5 0 0 0
            1030605 10306 1 0 1  27 0   5.5 1 0 0
            1030606 10306 1 0 0  34 0     7 1 0 0
            1030607 10306 0 0 1  42 0    23 1 1 0
            1030608 10306 1 0 0  84 0     5 0 0 1
            1030609 10306 1 0 0  44 1    20 0 0 0
            1030610 10306 1 0 0  37 0   8.5 0 0 0
            1030701 10307 0 0 0  41 0    10 0 0 0
            1030702 10307 0 0 1  51 0    18 0 1 0
            1030703 10307 0 0 0  74 0   2.5 0 0 0
            1030704 10307 0 1 1  31 0   9.5 0 0 0
            1030705 10307 0 0 0  49 0     4 0 0 0
            1030706 10307 0 0 1  41 0     7 1 0 0
            1030707 10307 0 0 0  27 0     6 0 0 0
            1030708 10307 0 0 1  40 0    19 0 0 0
            1030709 10307 0 0 0  74 0    15 0 0 0
            1030710 10307 0 0 0  50 0    17 0 0 0
            1030801 10308 0 1 0  65 0    20 0 0 0
            1030802 10308 0 0 0  36 0     6 1 0 0
            1030803 10308 1 0 1  35 1  8.33 1 0 0
            1030804 10308 0 1 1  32 0    10 0 1 1
            1030805 10308 1 0 0  40 0     6 1 0 0
            1030806 10308 1 0 1  34 1  9.67 1 1 1
            1030807 10308 1 0 0  65 0    25 0 1 0
            1030808 10308 1 1 0  36 0    12 1 0 0
            1030809 10308 0 0 0  35 0    10 0 0 0
            1030810 10308 1 0 0  38 3     7 0 0 0
            1030901 10309 0 0 1  32 0    10 0 1 0
            1030902 10309 0 0 0  36 0   5.5 1 0 0
            1030903 10309 1 0 0  53 0  16.5 0 0 0
            1030904 10309 0 0 0  64 0    15 0 0 0
            1030905 10309 0 1 0  45 0  13.5 1 0 0
            1030906 10309 0 0 0  51 0    14 0 0 0
            1030907 10309 1 0 0  72 0  22.5 0 0 0
            1030908 10309 1 0 0  29 0   2.5 1 0 0
            1030909 10309 0 1 0  74 0    20 0 0 0
            1030910 10309 1 0 0  54 0    32 0 0 0
            1031001 10310 1 1 1  45 0    25 1 0 0
            1031002 10310 1 1 1  35 0     7 1 0 0
            1031003 10310 1 0 0  27 0    10 1 0 0
            1031004 10310 1 0 0  35 0    24 0 0 0
            1031005 10310 1 0 0  39 0     5 0 0 0
            1031006 10310 1 0 0  49 0     4 1 0 0
            1031007 10310 0 0 0  20 0     3 0 0 0
            1031008 10310 1 0 0  54 0    15 0 0 0
            1031009 10310 0 0 0  54 0     4 0 0 0
            1031010 10310 0 0 0  35 0     5 0 0 0
            1031101 10311 1 0 0  36 0    10 0 0 0
            1031102 10311 1 0 1  34 0    15 0 0 0
            1031103 10311 1 0 0  46 0     6 0 0 0
            1031104 10311 1 0 1  34 0    15 0 0 0
            1031105 10311 0 0 0  24 0     3 0 1 0
            1031106 10311 0 1 1  45 0   5.5 1 0 0
            1031107 10311 1 0 0  64 0    30 0 0 0
            1031108 10311 0 1 1  64 0     8 1 0 0
            1031109 10311 0 1 1  50 0     8 0 0 0
            1031110 10311 1 0 1  35 0   6.5 1 0 0
            1040101 10401 1 1 1  70 0     9 0 0 0
            1040102 10401 1 1 1  50 0     5 0 1 0
            1040103 10401 1 0 0  60 0  8.67 0 0 0
            1040104 10401 1 1 0  64 0     4 0 0 1
            1040105 10401 0 1 0  42 0    11 0 0 0
            1040106 10401 0 0 1  42 0    20 0 0 0
            1040107 10401 1 0 0  41 0   2.5 0 0 0
            1040108 10401 0 1 0  55 0    20 0 0 0
            1040109 10401 0 1 0  39 0     4 0 0 0
            1040110 10401 1 1 1  42 0    15 0 0 0
            1040201 10402 0 0 0  37 0    20 0 0 0
            1040202 10402 1 1 1  41 0  12.5 1 0 0
            1040203 10402 0 0 1  36 0     8 0 0 0
            1040204 10402 1 0 0  42 0   6.5 1 0 0
            1040205 10402 1 1 1  39 0    15 0 0 0
            1040206 10402 1 0 0  35 0    15 0 0 1
            1040207 10402 0 1 0  64 0     5 0 0 0
            1040208 10402 1 0 1  51 0    15 0 0 0
            1040209 10402 1 1 0  61 0    30 0 0 0
            1040210 10402 1 1 0  77 0    13 0 0 0
            1040301 10403 1 0 0  31 0  8.33 1 0 0
            1040302 10403 1 0 0  42 0 19.67 1 1 0
            1040303 10403 0 0 0  28 0     3 0 0 0
            1040304 10403 0 1 0  34 0    20 1 1 0
            1040305 10403 1 0 0  50 0     5 0 0 0
            1040306 10403 0 1 0  47 0     4 0 0 0
            1040307 10403 0 1 1  29 1  2.33 1 1 0
            1040308 10403 0 0 0  44 0    30 0 0 0
            1040309 10403 0 0 1  48 0    15 0 0 0
            1040310 10403 0 1 1  51 0    15 0 0 1
            1040401 10404 1 0 1  39 0  10.5 1 0 0
            1040402 10404 1 1 1  27 0     3 0 0 0
            1040403 10404 1 1 1  45 0     5 0 0 0
            1040404 10404 1 0 0  40 0   4.5 0 0 0
            1040405 10404 0 1 1  40 0     7 0 0 0
            1040406 10404 1 0 0  70 0  16.5 0 0 0
            1040407 10404 1 1 1  28 0     3 0 0 0
            1040408 10404 0 0 1  29 0   6.5 1 0 0
            1040409 10404 1 0 0  35 0    12 1 0 0
            1040410 10404 0 0 0  56 0   6.5 0 0 0
            1040601 10406 0 0 0  69 0    16 1 0 0
            1040602 10406 0 0 0  64 0    10 1 0 0
            1040603 10406 0 1 0  47 0    18 0 0 0
            1040604 10406 0 0 0  50 0    10 0 0 0
            1040605 10406 0 1 0  34 0    25 0 0 0
            1040606 10406 0 0 0  44 0    10 0 0 0
            1040607 10406 0 0 1  34 0     2 0 0 0
            1040608 10406 0 0 0  36 0    20 0 0 0
            1040609 10406 1 1 1  41 0    12 0 1 0
            1040610 10406 0 1 0  45 0    10 0 0 0
            1040701 10407 1 0 1  40 0 16.33 0 0 0
            1040702 10407 0 1 0  57 0    19 1 0 0
            1040703 10407 1 0 0  50 0    23 1 0 0
            1040704 10407 1 0 0  34 1    11 1 0 0
            1040705 10407 1 0 0  30 0    10 1 0 0
            1040706 10407 1 0 0  50 0    11 0 0 0
            1040707 10407 0 1 1  44 0    21 0 0 0
            1040708 10407 0 1 1  64 0    25 0 0 0
            1040709 10407 0 1 1  31 0   4.5 0 1 0
            1040710 10407 1 0 1  42 0    15 1 0 0
            1040801 10408 0 1 0  47 0     3 1 0 1
            1040802 10408 0 0 1  54 1     4 0 0 0
            1040803 10408 0 0 0  70 0 28.33 0 1 0
            1040804 10408 0 0 1  44 0 16.25 0 0 0
            1040805 10408 0 1 0  48 0     4 0 1 0
            1040806 10408 0 0 0  65 0     3 0 0 0
            1040807 10408 1 0 0  68 0   2.2 0 0 0
            1040808 10408 0 0 1  27 0     4 0 1 0
            1040809 10408 0 0 0  42 0   4.5 0 1 0
            1040810 10408 1 0 0  56 0  6.33 0 0 0
            1040901 10409 0 0 0  38 0   2.5 0 1 0
            1040902 10409 1 1 0  56 0     5 0 0 0
            1040903 10409 1 1 0  51 0   2.5 0 0 0
            1040904 10409 1 1 0  60 0     3 0 0 0
            1040905 10409 1 0 1  53 0   1.5 0 0 0
            1040906 10409 0 0 0  43 0     6 0 1 0
            1040907 10409 0 1 0  50 0     2 0 0 0
            1040908 10409 0 0 0  30 0     4 0 0 0
            1040909 10409 0 0 0  60 0     3 0 0 0
            1040910 10409 0 0 0  35 0     1 0 0 0
            1041001 10410 0 1 0  24 0     2 0 0 0
            1041002 10410 1 0 1  28 0     6 0 1 0
            1041003 10410 1 0 0  57 0    25 0 1 0
            1041004 10410 1 0 0  74 0    30 0 0 0
            1041005 10410 0 0 0  40 0    20 0 0 0
            1041006 10410 1 0 0  39 0    15 1 1 0
            1041007 10410 0 0 0  50 0     5 0 0 0
            1041008 10410 0 0 1  33 0    12 1 0 0
            1041009 10410 1 0 0  30 0  1.33 0 0 0
            1041010 10410 1 0 0  67 0    30 0 0 0
            1050101 10501 1 0 1  68 2     6 0 0 0
            1050102 10501 1 1 1  50 2     7 0 0 0
            1050103 10501 1 1 1  75 1   7.5 1 0 0
            1050104 10501 0 0 1  35 2     3 0 0 0
            1050105 10501 1 0 0  45 2   7.5 1 0 0
            1050106 10501 0 1 1  72 0  23.5 1 1 0
            1050107 10501 0 0 0  62 0  17.5 0 0 0
            1050108 10501 1 1 1  64 2   2.5 0 0 0
            1050109 10501 1 1 1  72 2    25 0 0 0
            1050110 10501 1 0 1  65 2    25 0 0 1
            1050201 10502 1 0 0  42 0     8 0 0 0
            1050202 10502 1 0 0  56 0    20 0 0 0
            1050203 10502 1 1 1  52 2     5 0 0 0
            1050204 10502 1 0 1  54 1    29 0 0 0
            1050205 10502 1 1 0  75 1  5.25 0 0 0
            1050206 10502 1 0 0  57 0    30 0 0 0
            1050207 10502 0 0 0  64 0    25 1 0 1
            1050208 10502 0 1 0  60 1   4.5 0 0 0
            1050209 10502 1 1 0  34 2  17.5 0 0 0
            1050301 10503 1 1 1  67 1    30 0 0 1
            1050302 10503 0 1 0  61 0     6 0 0 0
            1050303 10503 1 0 0  60 0   9.5 1 1 0
            1050304 10503 1 1 1  33 0   5.5 1 0 0
            1050305 10503 1 1 1  57 0    27 0 0 1
            1050306 10503 1 1 0  41 0     3 0 0 0
            1050307 10503 1 0 0  67 0     6 0 0 0
            1050308 10503 1 0 0  70 0    25 0 0 1
            1050309 10503 1 1 1  39 1   5.5 1 0 0
            1050310 10503 1 1 0  42 0     5 1 0 0
            1050401 10504 1 1 1  34 0    14 1 1 0
            1050402 10504 0 1 1  67 1 21.33 1 0 1
            1050403 10504 1 0 0  38 0    20 0 0 0
            1050404 10504 1 1 1  46 2     6 0 0 0
            1050405 10504 1 1 1  82 2     3 0 0 1
            1050406 10504 1 1 1  69 0    20 0 1 1
            1050407 10504 0 1 1  42 2    12 0 0 1
            1050408 10504 1 0 0  57 2    14 0 0 0
            1050409 10504 0 0 0  64 0    10 0 1 0
            1050410 10504 1 0 0  60 1 21.33 0 0 0
            1050501 10505 0 0 1  42 0  17.5 1 0 0
            1050502 10505 1 1 1  34 2     7 0 0 0
            1050503 10505 0 1 1  56 0  12.5 0 0 0
            1050504 10505 0 1 1  35 2   5.5 0 0 0
            1050505 10505 0 0 0  58 0    20 1 0 0
            1050506 10505 1 1 1  37 3   4.5 1 0 0
            1050507 10505 1 1 1  94 3  27.5 0 0 1
            1050508 10505 1 0 0  52 0  7.67 0 0 0
            1050509 10505 1 1 1  35 2     3 1 0 0
            1050510 10505 1 0 0  43 2  13.5 0 0 0
            1050601 10506 1 0 0  42 0     6 0 0 1
            1050602 10506 1 1 0  69 1    23 0 0 0
            1050603 10506 1 0 0  48 2     5 0 0 0
            1050604 10506 1 0 0  61 0    25 0 0 0
            1050605 10506 1 1 0  60 1     5 1 0 0
            1050606 10506 1 0 0  40 0    20 0 0 1
            1050607 10506 1 1 0  43 0     9 0 0 1
            1050608 10506 1 1 1  45 0     6 0 0 1
            1050609 10506 0 1 1  62 0    28 0 0 0
            1050610 10506 0 0 0  70 0    10 0 0 0
            1050701 10507 1 0 0  45 2    18 0 0 0
            1050702 10507 0 1 1  62 0    14 0 0 0
            1050703 10507 0 1 1  44 0    16 1 0 0
            1050704 10507 0 1 1  40 0    15 0 0 0
            1050705 10507 0 0 1  68 0     3 0 0 0
            1050706 10507 0 1 1  23 0     4 0 0 0
            1050707 10507 0 1 1  42 0     9 0 0 0
            1050708 10507 0 1 1  68 0     9 0 0 0
            1050709 10507 1 0 0  48 2    28 0 0 0
            1050710 10507 0 0 1  27 0     5 1 0 0
            1050801 10508 1 1 1  63 3  18.5 0 0 0
            1050802 10508 1 1 0  48 2     6 1 0 0
            1050803 10508 1 0 1  50 1    11 0 0 0
            1050804 10508 1 0 0  44 1    10 0 0 0
            1050805 10508 1 1 1  38 1    15 0 0 0
            1050806 10508 1 0 0  44 3  12.5 1 0 0
            1050807 10508 0 0 1  64 0    10 0 0 0
            1050808 10508 1 1 1  45 2  6.33 0 0 0
            1050809 10508 1 1 1  38 2     5 1 0 0
            1050810 10508 1 1 1  37 2   7.5 1 0 0
            1050901 10509 1 1 1  48 0    26 1 0 1
            1050902 10509 0 1 0  32 0     5 0 0 0
            1050903 10509 0 1 1  27 0    15 1 0 0
            1050904 10509 0 1 1  25 0    12 1 0 1
            1050905 10509 0 0 1  25 0     2 0 0 1
            1050906 10509 0 0 1  45 0   7.5 1 0 0
            1050907 10509 0 1 1  53 0    20 1 0 1
            1050908 10509 0 1 0  45 0 19.33 1 0 0
            1050909 10509 0 0 1  20 0     7 0 0 0
            1050910 10509 0 0 0  57 0    23 1 0 0
            1051001 10510 1 1 1  55 0    14 0 0 0
            1051002 10510 0 0 1  73 0    38 0 0 0
            1051003 10510 1 1 0  35 1    15 1 0 0
            1051004 10510 1 0 0  48 0    17 1 0 0
            1051005 10510 1 1 1  36 1    20 1 0 1
            1051006 10510 0 1 1  47 0    20 0 0 0
            1051007 10510 0 0 1  40 1    25 0 1 1
            1051008 10510 1 1 0  32 1    14 1 0 0
            1051009 10510 0 0 0  26 0     2 0 1 0
            1051010 10510 0 0 0  55 0    20 0 0 0
            1051101 10511 1 0 0  64 0     2 0 0 0
            1051102 10511 1 0 1  52 2   6.5 0 0 0
            1051103 10511 1 1 1  47 1     5 1 0 0
            1051104 10511 1 1 1  65 2   5.4 0 1 1
            1051105 10511 1 0 0  50 3     2 0 0 0
            1051106 10511 1 0 0  50 2  3.33 0 0 0
            1051107 10511 1 1 0  48 2  4.25 0 0 0
            1051108 10511 0 1 1  39 0   2.5 0 0 0
            1051109 10511 0 0 0  48 0     2 0 0 0
            1051110 10511 1 1 1  49 0  3.67 0 0 0
            1051301 10513 0 1 1  31 0     4 0 0 0
            1051302 10513 1 1 1  29 1  4.67 0 1 0
            1051303 10513 0 0 1  65 1    12 0 0 1
            1051304 10513 0 1 1  44 0   3.5 0 0 0
            1051305 10513 0 1 1  49 0     4 1 0 0
            1051306 10513 0 1 1  50 0    15 1 0 0
            1051307 10513 0 0 0  48 0     3 1 0 0
            1051308 10513 1 0 0  81 2  6.67 0 0 0
            1051309 10513 0 0 0  70 0     6 0 0 0
            1051310 10513 0 0 0  38 0     3 0 0 0
            1051401 10514 0 0 0  42 1    10 0 0 0
            1051402 10514 1 1 1  36 0    10 1 1 0
            1051403 10514 1 0 1  28 1     5 0 1 0
            1051404 10514 0 1 1  70 0     7 0 0 0
            1051405 10514 1 0 0  26 1     5 0 0 0
            1051406 10514 1 1 1  40 0   3.5 0 0 0
            1051407 10514 0 1 1  66 0     3 1 1 0
            1051408 10514 0 0 0  23 1     3 0 1 0
            1051409 10514 1 0 1  34 2     7 1 0 0
            1051501 10515 0 1 0  36 0  16.5 0 0 1
            1051502 10515 0 1 0  70 0    30 0 0 1
            1051503 10515 1 1 0  57 0    20 0 0 0
            1051504 10515 0 1 1  22 0     5 0 1 0
            1051505 10515 0 1 1  26 0     2 1 0 0
            1051506 10515 1 1 0  58 0    10 0 1 0
            1051507 10515 0 0 0  55 0     4 0 0 0
            1051508 10515 0 0 1  57 1    25 0 0 0
            1051509 10515 0 0 0  65 0    38 0 0 0
            1051510 10515 1 0 0  58 1     5 0 0 0
            1051601 10516 0 0 0  55 0  6.75 0 0 0
            1051602 10516 0 0 0  37 0   1.5 1 0 0
            1051603 10516 1 1 0  55 0     4 1 1 0
            1051604 10516 0 0 0  46 0     7 1 0 0
            1051605 10516 1 1 0  58 0   3.5 1 0 0
            1051606 10516 0 0 0  50 0     4 0 0 0
            1051607 10516 0 0 0  72 0     8 0 0 0
            1051608 10516 1 1 0  41 0    11 1 0 0
            1051609 10516 1 0 0  38 0     5 1 0 0
            1051610 10516 0 0 0  41 0     5 1 0 0
            1051701 10517 1 0 0  46 0    12 0 0 0
            1051702 10517 1 0 0  61 1  6.67 0 0 0
            1051703 10517 1 1 0  36 0    20 0 0 0
            1051704 10517 1 1 0  28 0    10 0 0 0
            1051705 10517 1 0 0  53 0  5.33 0 0 0
            1051706 10517 1 1 1  60 2    30 0 0 0
            1051707 10517 1 1 0  42 1  3.67 0 0 0
            1051708 10517 0 1 0  41 0     6 0 1 0
            1051709 10517 1 0 0  44 0    10 0 0 0
            1051710 10517 0 1 0  49 0    30 1 0 0
            1051901 10519 0 0 0  45 0   5.5 0 0 0
            1051902 10519 0 0 0  40 0     8 0 0 0
            1051903 10519 1 1 1  49 0    15 1 0 0
            1051904 10519 0 0 0  45 0     3 0 0 0
            1051905 10519 0 1 0  75 0   3.5 0 0 0
            1051906 10519 0 0 0  74 0  3.75 0 0 0
            1051907 10519 0 0 1  33 0     3 0 0 0
            1051908 10519 0 1 1  70 0     3 0 0 0
            1051909 10519 0 0 0  82 0    13 0 0 0
            1051910 10519 0 0 0  49 0     4 0 0 0
            1060101 10601 1 1 0  30 0    10 1 0 0
            1060102 10601 0 1 0  72 0     1 0 1 0
            1060103 10601 1 1 0  41 0     3 1 0 0
            1060104 10601 0 0 1  37 0  4.67 0 0 0
            1060105 10601 0 0 0  40 0     5 0 1 0
            1060106 10601 0 0 0  30 0    14 0 0 0
            1060107 10601 0 0 0  25 0     7 1 0 0
            1060108 10601 0 0 1  40 0   6.5 0 0 0
            1060109 10601 0 0 0  45 0    20 0 0 0
            1060110 10601 1 1 0  32 0     7 1 0 0
            1060201 10602 1 1 1  40 0    15 0 0 0
            1060202 10602 1 0 0  48 1  1.67 0 0 0
            1060203 10602 1 1 0  45 1   3.5 0 0 0
            1060204 10602 1 0 0  36 0    20 1 0 0
            1060205 10602 0 0 0  65 0   3.5 0 0 0
            1060206 10602 1 0 1  85 0    10 0 0 0
            1060207 10602 1 1 0  44 0 11.33 1 0 0
            1060208 10602 1 0 0  60 0    19 1 0 0
            1060209 10602 1 0 0  51 0 21.67 0 0 0
            1060210 10602 0 0 0  67 0    21 0 0 0
            1060401 10604 1 0 0  45 0 10.67 1 0 0
            1060402 10604 0 0 1  60 0   2.5 0 0 0
            1060403 10604 0 1 0  63 0     6 0 0 0
            1060404 10604 1 0 1  55 0  14.5 0 0 0
            1060405 10604 0 1 0  82 0    10 0 0 0
            1060406 10604 1 1 0  50 0    10 0 0 0
            1060407 10604 1 1 0  55 1  2.33 0 0 0
            1060408 10604 0 0 0  60 0  14.5 0 0 0
            1060409 10604 1 1 0  42 0  5.33 0 1 0
            1060410 10604 0 0 0  41 0   3.5 0 0 0
            1060501 10605 1 1 0  37 0    22 0 0 0
            1060502 10605 1 0 0  30 0     6 0 0 0
            1060503 10605 1 1 0  71 0     3 0 0 0
            1060504 10605 1 1 0  47 0    10 0 0 0
            1060505 10605 0 0 0  52 0     8 0 0 0
            1060506 10605 1 0 0  50 0   2.5 0 0 0
            1060507 10605 1 0 0  64 0  2.25 0 0 0
            1060508 10605 1 0 0  61 0    20 0 0 0
            1060509 10605 1 0 0  33 0    20 1 0 0
            1060510 10605 1 0 1  44 0    25 0 0 0
            1060601 10606 0 1 1  32 0     4 0 0 0
            1060602 10606 0 1 0  50 0     5 0 0 0
            1060603 10606 0 0 0  35 0   1.5 0 0 0
            1060604 10606 1 1 0  54 0    25 0 0 0
            1060605 10606 0 1 0  58 0    10 0 0 0
            1060606 10606 0 1 0  70 0     3 0 0 0
            1060607 10606 0 0 0  65 0     2 0 0 0
            1060608 10606 0 0 0  50 1     2 0 0 0
            1060609 10606 0 0 0  34 0     4 1 0 0
            1060610 10606 0 1 0  46 0     3 0 0 0
            1061301 10613 0 0 0  40 0  9.33 1 0 0
            1061302 10613 0 0 0  28 0     7 0 0 0
            1061303 10613 0 0 0  27 0    15 0 0 0
            1061304 10613 1 0 0  36 0    25 1 0 0
            1061305 10613 0 0 0  47 0    12 0 0 0
            1061306 10613 1 0 1  39 0    15 1 0 0
            1061307 10613 1 0 0  54 0  17.5 0 0 0
            1061308 10613 0 1 0  46 0     7 0 0 0
            1061309 10613 1 0 0  34 0     2 1 0 0
            1061310 10613 1 0 0  64 0  22.5 0 0 0
            1070101 10701 0 0 0  26 0     5 1 1 0
            1070102 10701 0 0 1  27 0     4 0 0 0
            1070103 10701 0 1 1  69 0    15 0 0 1
            1070104 10701 1 1 1  50 1 16.67 0 0 1
            1070105 10701 0 1 1  45 0    15 0 0 1
            1070106 10701 0 0 1  62 0    35 0 0 0
            1070107 10701 0 0 1  31 0    13 1 0 0
            1070108 10701 0 1 1  32 0    10 1 0 0
            1070109 10701 0 0 1  30 0    10 0 0 0
            1070110 10701 0 1 0  68 0     8 0 0 0
            1070201 10702 0 0 1  80 2     4 0 0 0
            1070202 10702 0 1 0  53 2     8 0 0 0
            1070203 10702 0 0 0  52 0     7 0 0 0
            1070204 10702 1 0 0  45 3     9 0 0 0
            1070205 10702 0 1 0  79 1  32.5 0 0 0
            1070206 10702 1 1 0  37 1  9.33 1 0 0
            1070207 10702 0 1 1  62 0 12.33 0 0 0
            1070208 10702 0 0 0  42 0   8.5 0 0 0
            1070209 10702 1 1 0  52 2  6.67 0 0 0
            1070210 10702 0 0 0  52 0     6 0 0 0
            1070301 10703 0 1 0  32 1     4 1 0 0
            1070302 10703 0 1 0  50 1    20 0 0 0
            1070303 10703 1 0 0  38 0   9.5 1 0 0
            1070304 10703 0 1 0  66 0    20 1 0 1
            1070305 10703 0 0 0  41 0    12 1 0 0
            1070306 10703 1 0 0  30 2     7 1 0 0
            1070307 10703 0 1 1  25 0     3 1 0 0
            1070308 10703 0 0 0  44 2     6 0 0 0
            1070309 10703 0 0 0  41 0    10 1 0 0
            1070310 10703 1 1 1  54 1  6.75 0 0 0
            1070401 10704 1 1 0  41 1    10 1 0 0
            1070402 10704 0 0 0  42 0     8 1 0 0
            1070403 10704 0 1 1  45 0    20 1 0 0
            1070404 10704 0 1 1  27 0    10 1 0 1
            1070405 10704 0 1 0  35 1     8 1 0 0
            1070406 10704 1 0 0  29 2     4 0 0 0
            1070407 10704 0 0 1  47 0    20 0 0 0
            1070408 10704 1 1 0  61 2 16.25 0 0 0
            1070409 10704 0 0 0  42 0     9 1 0 0
            1070410 10704 0 0 0  30 0     6 0 0 0
            1070501 10705 0 1 0  62 1    10 0 0 0
            1070502 10705 0 0 1  35 0    10 1 0 0
            1070503 10705 1 1 1  58 1  1.67 0 1 0
            1070504 10705 0 0 1  46 0    25 1 0 0
            1070505 10705 0 1 0  61 0     8 0 0 0
            1070506 10705 1 0 0  54 2    30 0 0 0
            1070507 10705 1 0 1  53 0   6.5 0 0 0
            1070508 10705 0 1 0  60 0   4.5 0 0 1
            1070509 10705 1 1 1  34 1    10 0 0 0
            1070510 10705 0 1 0  40 2     5 0 0 0
            1070701 10707 0 0 1  26 0   3.5 1 0 0
            1070702 10707 0 0 0  40 0   3.5 0 0 0
            1070703 10707 0 0 1  41 0   6.5 1 0 0
            1070704 10707 0 0 0  32 0     7 1 0 0
            1070705 10707 1 0 0  34 0   2.5 0 0 0
            1070706 10707 1 1 0  50 0   3.5 0 0 0
            1070707 10707 0 0 1  40 0   7.5 0 0 0
            1070708 10707 0 0 1  70 0  8.67 0 0 0
            1070709 10707 0 1 0  32 0     6 1 0 0
            1070710 10707 0 0 0  54 0     3 0 1 0
            1070801 10708 1 1 1  48 1    15 0 0 0
            1070802 10708 0 1 0  38 0  12.5 1 0 1
            1070803 10708 1 0 1  40 2     7 0 0 0
            1070804 10708 1 0 0  31 1   8.5 0 0 0
            1070805 10708 0 1 1  27 0  7.33 1 0 0
            1070806 10708 1 1 0  27 1   5.5 0 1 1
            1070807 10708 0 0 0  52 0  21.5 0 0 0
            1070808 10708 1 1 1  31 0     8 0 0 0
            1070809 10708 0 1 1  35 1    13 0 0 0
            1070810 10708 1 1 1  33 1    15 1 0 1
            1080101 10801 0 0 0  47 0    18 0 0 0
            1080102 10801 0 0 1  65 0    15 0 0 1
            1080103 10801 0 0 1  58 0    25 0 1 0
            1080104 10801 1 0 0  36 0    24 1 0 0
            1080105 10801 0 0 0  40 3    19 0 1 0
            1080106 10801 1 0 0  43 0    25 0 1 0
            1080107 10801 0 0 0  49 0    13 0 1 0
            1080108 10801 0 0 0  32 0    25 0 0 0
            1080109 10801 0 0 0  26 0     1 0 1 0
            1080110 10801 0 0 0  47 0     6 1 0 1
            1080201 10802 0 0 0  39 3    20 1 0 0
            1080202 10802 0 0 0  34 0    10 0 1 0
            1080203 10802 1 0 0  29 3    11 1 1 0
            1080204 10802 1 0 0  54 0    35 0 0 0
            1080205 10802 1 0 1  33 0     7 0 1 0
            1080206 10802 0 1 1  45 0    20 0 1 0
            1080207 10802 0 0 0  38 0    14 0 0 0
            1080208 10802 1 1 0  57 0    10 0 0 0
            1080209 10802 0 1 1  35 0    11 0 0 0
            1080210 10802 0 1 1  33 0    20 0 1 0
            1080301 10803 1 1 1  35 0     7 0 0 0
            1080302 10803 1 1 0  35 0    12 0 1 0
            1080303 10803 1 0 1  44 0     8 0 0 0
            1080304 10803 1 0 0  55 0     8 0 0 0
            1080305 10803 0 0 0  48 0     7 0 0 0
            1080306 10803 0 0 0  69 0     6 0 0 0
            1080307 10803 0 0 0  32 0   9.5 1 0 0
            1080308 10803 1 1 0  36 0     6 0 0 0
            1080309 10803 0 0 0  50 0    35 0 0 0
            1080310 10803 1 1 1  34 0     5 1 0 0
            1080401 10804 1 1 0  29 0    12 1 0 0
            1080402 10804 1 0 0  56 0    15 1 0 0
            1080403 10804 0 0 0  24 0    10 1 0 0
            1080404 10804 0 0 0  54 0    22 0 0 0
            1080405 10804 0 0 0  49 0    10 0 0 0
            1080406 10804 0 0 1  49 0    13 0 0 0
            1080407 10804 0 0 1  46 0    15 1 1 0
            1080408 10804 1 0 1  27 0    15 0 1 0
            1080409 10804 0 0 0  51 0  31.5 0 0 0
            1080410 10804 0 0 0  61 0  13.5 0 0 0
            1080501 10805 1 1 1  38 1    20 0 0 0
            1080502 10805 1 1 0  45 0    37 1 0 0
            1080503 10805 0 1 0  35 0    17 0 0 0
            1080504 10805 0 1 0  31 0    10 1 0 0
            1080505 10805 1 0 0  70 0    30 1 0 0
            1080506 10805 1 0 1  33 0    17 0 0 0
            1080507 10805 1 0 0  44 1    15 1 0 0
            1080508 10805 0 0 0  25 0    12 0 1 0
            1080509 10805 0 0 1  30 0    12 1 0 0
            1080510 10805 1 0 0  42 0    20 1 0 0
            1080601 10806 1 0 0  47 0    31 0 0 0
            1080602 10806 0 0 0  44 0     7 0 1 0
            1080603 10806 0 1 0  41 0     3 0 0 0
            1080604 10806 0 1 1  43 0     3 0 1 0
            1080605 10806 0 1 0  41 0    10 0 0 0
            1080606 10806 0 0 0  27 0   5.5 0 0 0
            1080607 10806 1 1 1  38 0 14.33 1 1 0
            1080608 10806 0 1 0  47 0    21 0 0 0
            1080609 10806 1 1 1  39 0    14 0 1 0
            1080610 10806 0 0 1  39 0  24.5 1 1 0
            1080701 10807 1 0 0  42 0     7 0 0 0
            1080702 10807 1 0 1  42 0    20 0 0 0
            1080703 10807 1 0 1  28 0     5 0 0 0
            1080704 10807 0 1 0  44 0    25 0 0 0
            1080705 10807 0 0 0  39 0    30 0 0 0
            1080706 10807 1 1 0  43 0    15 0 0 0
            1080707 10807 1 0 0  65 0    30 0 0 1
            1080708 10807 1 0 1  45 0    10 0 0 0
            1080709 10807 1 0 0  47 0    13 0 0 0
            1080710 10807 1 0 1  29 0    10 1 0 0
            1090101 10901 1 1 1  54 0   6.5 0 0 0
            1090102 10901 1 0 0  50 0     2 1 0 0
            1090103 10901 1 1 1  41 0     8 0 0 0
            1090104 10901 0 0 0  51 0     7 0 0 0
            1090105 10901 0 0 0  50 0     4 1 0 0
            1090106 10901 1 0 0  50 1    20 1 0 0
            1090107 10901 0 0 1  65 0     4 0 0 0
            1090108 10901 0 0 1  34 0  9.67 1 0 0
            1090109 10901 0 0 0  50 0     5 0 0 0
            1090110 10901 1 0 1  54 0  20.5 0 0 0
            1090201 10902 0 1 1  34 0    10 0 1 0
            1090202 10902 0 0 0  34 0    20 0 0 0
            end



            • #7
              First, thank you for an excellent presentation of your prior work! It is greatly appreciated.

              Let me make what may be a foolish observation.

              You describe your models as
              Code:
              y1 = f1(y2, y3, x1, x2, x3)
              y2 = f2(x3, x5, x6)
              y3 = f3(x4, x7, x8)
              Your models in Stata appear to be
              Code:
              y1 = f1(y2, y3, x1, x2, x3)
              y2 = f2(x2, x3, x4, x5)
              y3 = f3(x2, x3, x4, x6)
              Your models in LIMDEP appear to be
              Code:
              y1 = f1(y2, y3, x1, x2, x3)
              y2 = f2(x1, x2, x3, x4, x5)
              y3 = f3(x2, x3, x4, x6)
              The models for y1 are identical and match your description. The models for y3 are identical but do not match your description. The models for y2 are not identical - they have different numbers of independent variables - and neither matches your description.

              I'm not familiar enough with using either Stata or LIMDEP for these purposes to know if these inconsistencies indicate a problem.



              • #8
                William, thanks for pointing out the error and apologies for the mix-up. The right specification is:

                y1 = f1(y2, y3, x1, x2, x3)
                y2 = f2(x1, x2, x3, x4, x5)
                y3 = f3(x2, x3, x4, x6)

                However, the substantive differences remain. Below are the Stata results for the identical model that was estimated in Limdep:

                Code:
                Multivariate probit (MSL, # draws = 200)          Number of obs   =        650
                                                                  Wald chi2(14)   =     248.37
                Log pseudolikelihood = -1241.6481                 Prob > chi2     =     0.0000
                
                                                   (Std. Err. adjusted for 66 clusters in vid)
                ------------------------------------------------------------------------------
                             |               Robust
                             |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                -------------+----------------------------------------------------------------
                y1           |
                          y2 |   1.343666    .234929     5.72   0.000     .8832137    1.804118
                          y3 |   -.465239   .7326024    -0.64   0.525    -1.901113    .9706353
                          x1 |    -.00879   .0037455    -2.35   0.019    -.0161311    -.001449
                          x2 |   .2877144   .1807015     1.59   0.111    -.0664541    .6418829
                          x3 |   .0162344   .0078329     2.07   0.038     .0008822    .0315867
                       _cons |  -.1612781   .3041286    -0.53   0.596    -.7573591    .4348029
                -------------+----------------------------------------------------------------
                y2           |
                          x1 |   .0075814   .0042735     1.77   0.076    -.0007944    .0159573
                          x2 |   .1548884   .0846257     1.83   0.067    -.0109749    .3207516
                          x3 |  -.0071789   .0064509    -1.11   0.266    -.0198224    .0054645
                          x4 |   .1732857   .0972146     1.78   0.075    -.0172514    .3638228
                          x5 |   .0102722   .1315351     0.08   0.938    -.2475319    .2680762
                       _cons |  -.6934089   .2054128    -3.38   0.001    -1.096011   -.2908072
                -------------+----------------------------------------------------------------
                y3           |
                          x2 |   .1702344   .0881082     1.93   0.053    -.0024546    .3429234
                          x3 |   .0032607   .0058996     0.55   0.580    -.0083022    .0148237
                          x4 |   .1644255   .1287772     1.28   0.202    -.0879732    .4168241
                          x6 |   .4566752   .2159169     2.12   0.034     .0334859    .8798645
                       _cons |   -.554203   .1180065    -4.70   0.000    -.7854915   -.3229145
                -------------+----------------------------------------------------------------
                    /atrho21 |  -1.052254   .6801192    -1.55   0.122    -2.385263    .2807557
                -------------+----------------------------------------------------------------
                    /atrho31 |   .0235009   .4724996     0.05   0.960    -.9025812    .9495831
                -------------+----------------------------------------------------------------
                    /atrho32 |   .4372824   .0815021     5.37   0.000     .2775413    .5970236
                -------------+----------------------------------------------------------------
                       rho21 |  -.7826809   .2634854    -2.97   0.003    -.9831906    .2736043
                -------------+----------------------------------------------------------------
                       rho31 |   .0234966   .4722387     0.05   0.960    -.7175524    .7395942
                -------------+----------------------------------------------------------------
                       rho32 |   .4113893   .0677086     6.08   0.000      .270628    .5349282
                ------------------------------------------------------------------------------
                Likelihood ratio test of  rho21 = rho31 = rho32 = 0:  
                             chi2(3) =  42.7101   Prob > chi2 = 0.0000
                The Limdep command and results are:

                Code:
                --> Sample    ;all $
                --> Skip $
                --> Mprobit    ;Lhs=y1,y2,y3
                            ;eq1=y2,y3,x1,x2,x3,one
                            ;eq2=x1,x2,x3,x4,x5,one
                            ;eq3=x2,x3,x4,x6,one
                            ;pts=200;cluster=vid $
                Normal exit from iterations. Exit status=0.
                
                +---------------------------------------------+
                | Multivariate Probit Model:  3 equations.    |
                | Maximum Likelihood Estimates                |
                | Model estimated: Mar 17, 2016 at 00:29:05AM.|
                | Dependent variable             MVProbit     |
                | Weighting variable                 None     |
                | Number of observations              650     |
                | Iterations completed                 33     |
                | Log likelihood function       -1242.182     |
                | Number of parameters                 20     |
                | Info. Criterion: AIC =          3.88364     |
                |   Finite Sample: AIC =          3.88569     |
                | Info. Criterion: BIC =          4.02139     |
                | Info. Criterion:HQIC =          3.93707     |
                | Replications for simulated probs. = 200     |
                +---------------------------------------------+
                +---------------------------------------------------------------------+
                | Covariance matrix for the model is adjusted for data clustering.    |
                | Sample of    650 observations contained     66 clusters defined by  |
                | variable VID      which identifies by a value a cluster ID.         |
                | Sample of    650 observations contained      1 strata defined by    |
                |    650 observations (fixed number) in each stratum.                 |
                +---------------------------------------------------------------------+
                +--------+--------------+----------------+--------+--------+----------+
                |Variable| Coefficient  | Standard Error |b/St.Er.|P[|Z|>z]| Mean of X|
                +--------+--------------+----------------+--------+--------+----------+
                ---------+Index function for Y1
                 Y2      |     .84344025      5.74939391      .147   .8834    .38000000
                 Y3      |     .42013852      3.25958960      .129   .8974    .35538462
                 X1      |    -.00806300       .00860756     -.937   .3489   46.6107692
                 X2      |     .31793308       .71794537      .443   .6579    .34307692
                 X3      |     .01468727       .00840065     1.748   .0804   11.2224103
                 Constant|    -.31411643       .77762634     -.404   .6863
                ---------+Index function for Y2
                 X1      |     .00779023       .00104284     7.470   .0000   46.6107692
                 X2      |     .16831606       .00775319    21.709   .0000    .34307692
                 X3      |    -.00663025       .00072659    -9.125   .0000   11.2224103
                 X4      |     .14847431       .14394530     1.031   .3023    .28923077
                 X5      |     .04544672       .05689766      .799   .4244    .11538462
                 Constant|    -.70426701       .08723116    -8.074   .0000
                ---------+Index function for Y3
                 X2      |     .17303656       .01161279    14.901   .0000    .34307692
                 X3      |     .00272569       .00050864     5.359   .0000   11.2224103
                 X4      |     .19051601       .05574615     3.418   .0006    .28923077
                 X6      |     .42854019       .04664493     9.187   .0000    .07692308
                 Constant|    -.55390573       .03198934   -17.315   .0000
                ---------+Correlation coefficients
                 R(01,02)|    -.58312487      3.20681595     -.182   .8557
                 R(01,03)|    -.44115615      1.23356392     -.358   .7206
                 R(02,03)|     .40497242       .00399374   101.402   .0000



                • #9
                  The Stata code that yielded the above result is:
                  Code:
                  set seed 1003
                  mvprobit (y1 y2 y3 x1 x2 x3) (y2 x1 x2 x3 x4 x5) (y3 x2 x3 x4 x6), cl(vid) dr(200) nolog



                  • #10
                    I think William has identified at least part of the problem. Further, I am not convinced that Stata and Limdep really are giving very different results. Here is my reasoning:

                    To match up with Limdep, it seems like the Stata command should be

                    Code:
                    mvprobit (y1 y2 y3 x1 x2 x3) (y2 x1 x2 x3 x4 x5) (y3 x2 x3 x4 x6), cl(vid) dr(200) nolog
                    Beyond that, I'll note
                    • The log likelihoods are very similar (-1241.6481 Stata, -1242.182 Limdep)
                    • The coefficients (from the command I give above) don't look that radically different to me from the Limdep coefficients
                    • rho21 and rho32 aren't that radically different either.
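                    As an aside on reading the output: mvprobit estimates the correlations on the atanh scale (the /atrho rows) and reports rho = tanh(atrho), which is why each /atrho row is paired with a rho row. A quick check against the Stata output in #8:

```stata
* The rho rows are the hyperbolic tangent of the /atrho rows;
* e.g. for /atrho21 = -1.052254 in the output above:
display tanh(-1.052254)
```

                    This returns about -.7827, matching the reported rho21. The transform is also why the rho confidence intervals are not symmetric around the point estimates.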
                    Also, if you want to set the seed, I think you are supposed to use the seed option on mvprobit, e.g.

                    Code:
                    mvprobit (y1 y2 y3 x1 x2 x3) (y2 x1 x2 x3 x4 x5) (y3 x2 x3 x4 x6), cl(vid) dr(200) nolog seed(1200)
                    If you change the seed, the mvprobit results also change a bit. For example with the last command I got Log pseudolikelihood = -1242.3931. Hence, even if you were only using Stata, you wouldn't get the exact same results unless you used the exact same seed each time.
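                    A minimal sketch of that sensitivity check, looping over a few seeds with the same specification and options as above (the exact log likelihoods will differ run to run by design):

```stata
* Gauge simulation noise: re-fit the identical model under several seeds.
* The mvprobit help recommends at least sqrt(N) draws (sqrt(650) is about
* 26), so dr(200) here is already comfortably above that.
forvalues s = 1001/1003 {
    quietly mvprobit (y1 y2 y3 x1 x2 x3) (y2 x1 x2 x3 x4 x5) ///
        (y3 x2 x3 x4 x6), cl(vid) dr(200) seed(`s')
    display "seed `s': log pseudolikelihood = " %12.4f e(ll)
}
```

                    If the spread across seeds is of the same order as the Stata-Limdep gap, simulation noise is the likely explanation.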

                    All of this leads me to suspect that, when you really do specify the exact same model in Stata and Limdep, the differences are not that great. And to the extent they do exist, it may be because of differences in the way that random numbers are being generated. If you could get both programs to generate the same random numbers, maybe the results would be identical. But I don't know how you could do that.

                    I also wonder if this particular model is a bit unstable; when I used seed(1003) I got

                    Code:
                    could not calculate numerical derivatives
                    flat or discontinuous region encountered
                    r(430);

                    In conclusion, I think Fred should make sure the same model really is being estimated by both commands. Beyond that I don't think he should expect a perfect match. If it seems like the differences are just too great to be plausible, try different seeds and see if they match up better. Or see if there is any way in Limdep to generate random numbers the same way Stata does.
                    Last edited by Richard Williams; 16 Mar 2016, 20:56. Reason: EDIT: Fred posted while I was still writing, but I believe his results confirm what I just said.
                    -------------------------------------------
                    Richard Williams, Notre Dame Dept of Sociology
                    Stata Version: 17.0 MP (2 processor)

                    EMAIL: [email protected]
                    WWW: https://www3.nd.edu/~rwilliam



                    • #11
                      Dear Richard,
                      Thanks a lot for taking the time to look at this and respond. I have examined the situation more closely, carefully re-estimated identical models, and still think there are substantial differences to worry about. I find that the models are more stable when y2 and y3 are not included in the first equation. Thus, for comparison I estimated the following system:

                      y1 = f1(x1, x2, x3)
                      y2 = f2(x1, x2, x3, x4, x5)
                      y3 = f3(x2, x3, x4, x6)

                      The code for Stata and Limdep, respectively, is:

                      ------------------ Stata ---------------------

                      Code:
                      mvprobit (y1 x1 x2 x3) (y2 x1 x2 x3 x4 x5) (y3 x2 x3 x4 x6), cl(vid) dr(200) nolog seed (1003)
                      ------------------ Stata Results ---------------------

                      Code:
                      . mvprobit (y1 x1 x2 x3) (y2 x1 x2 x3 x4 x5) (y3 x2 x3 x4 x6), cl(vid) dr(200) nolog seed (1003)
                      
                      Multivariate probit (MSL, # draws = 200)          Number of obs   =        650
                                                                        Wald chi2(12)   =      35.40
                      Log pseudolikelihood = -1242.8793                 Prob > chi2     =     0.0004
                      
                                                         (Std. Err. adjusted for 66 clusters in vid)
                      ------------------------------------------------------------------------------
                                   |               Robust
                                   |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                      -------------+----------------------------------------------------------------
                      y1           |
                                x1 |   -.006999   .0039644    -1.77   0.077     -.014769     .000771
                                x2 |   .4740413   .1350116     3.51   0.000     .2094234    .7386592
                                x3 |   .0156802     .00763     2.06   0.040     .0007258    .0306347
                             _cons |   .0458113   .1980885     0.23   0.817     -.342435    .4340576
                      -------------+----------------------------------------------------------------
                      y2           |
                                x1 |   .0076917   .0043844     1.75   0.079    -.0009016     .016285
                                x2 |   .1741295   .0836713     2.08   0.037     .0101367    .3381223
                                x3 |  -.0060159   .0065581    -0.92   0.359    -.0188696    .0068377
                                x4 |   .1035748   .1188717     0.87   0.384    -.1294095    .3365592
                                x5 |   .0982916   .1298504     0.76   0.449    -.1562105    .3527936
                             _cons |  -.6988311   .2147174    -3.25   0.001    -1.119669   -.2779928
                      -------------+----------------------------------------------------------------
                      y3           |
                                x2 |    .175141   .0839626     2.09   0.037     .0105774    .3397046
                                x3 |   .0032237   .0059273     0.54   0.587    -.0083936     .014841
                                x4 |   .1668598   .1146241     1.46   0.145    -.0577992    .3915188
                                x6 |   .4567178    .210278     2.17   0.030     .0445805    .8688551
                             _cons |  -.5554719   .1151227    -4.83   0.000    -.7811083   -.3298355
                      -------------+----------------------------------------------------------------
                          /atrho21 |   .0115043   .0772112     0.15   0.882    -.1398269    .1628355
                      -------------+----------------------------------------------------------------
                          /atrho31 |  -.0480755   .0712076    -0.68   0.500    -.1876398    .0914888
                      -------------+----------------------------------------------------------------
                          /atrho32 |   .4273434    .079645     5.37   0.000      .271242    .5834448
                      -------------+----------------------------------------------------------------
                             rho21 |   .0115038    .077201     0.15   0.882    -.1389227    .1614114
                      -------------+----------------------------------------------------------------
                             rho31 |  -.0480385   .0710433    -0.68   0.499    -.1854682    .0912344
                      -------------+----------------------------------------------------------------
                             rho32 |   .4030988   .0667036     6.04   0.000     .2647801    .5251647
                      ------------------------------------------------------------------------------
                      Likelihood ratio test of  rho21 = rho31 = rho32 = 0:  
                                   chi2(3) =  40.9908   Prob > chi2 = 0.0000
                      -------------- Limdep Code --------------------

                      Code:
                      Sample    ;all $
                      Skip $
                      Mprobit    ;Lhs=y1,y2,y3
                              ;eq1=x1,x2,x3,one
                              ;eq2=x1,x2,x3,x4,x5,one
                              ;eq3=x2,x3,x4,x6,one                
                              ;pts=200;cluster=vid $

                      ----------------- Limdep Results -----------------

                      Code:
                      Normal exit from iterations. Exit status=0.
                      
                      +---------------------------------------------+
                      | Multivariate Probit Model:  3 equations.    |
                      | Maximum Likelihood Estimates                |
                      | Model estimated: Mar 17, 2016 at 03:53:29AM.|
                      | Dependent variable             MVProbit     |
                      | Weighting variable                 None     |
                      | Number of observations              650     |
                      | Iterations completed                 25     |
                      | Log likelihood function       -1243.514     |
                      | Number of parameters                 18     |
                      | Info. Criterion: AIC =          3.88158     |
                      |   Finite Sample: AIC =          3.88325     |
                      | Info. Criterion: BIC =          4.00556     |
                      | Info. Criterion:HQIC =          3.92967     |
                      | Replications for simulated probs. = 200     |
                      +---------------------------------------------+
                      +---------------------------------------------------------------------+
                      | Covariance matrix for the model is adjusted for data clustering.    |
                      | Sample of    650 observations contained     66 clusters defined by  |
                      | variable VID      which identifies by a value a cluster ID.         |
                      | Sample of    650 observations contained      1 strata defined by    |
                      |    650 observations (fixed number) in each stratum.                 |
                      +---------------------------------------------------------------------+
                      +--------+--------------+----------------+--------+--------+----------+
                      |Variable| Coefficient  | Standard Error |b/St.Er.|P[|Z|>z]| Mean of X|
                      +--------+--------------+----------------+--------+--------+----------+
                      ---------+Index function for Y1
                       X1      |    -.00701141       .00160895    -4.358   .0000   46.6107692
                       X2      |     .47399357       .01551811    30.545   .0000    .34307692
                       X3      |     .01568666       .00054639    28.709   .0000   11.2224103
                       Constant|     .04628356       .08830084      .524   .6002
                      ---------+Index function for Y2
                       X1      |     .00768607       .00013325    57.682   .0000   46.6107692
                       X2      |     .17364558       .01301582    13.341   .0000    .34307692
                       X3      |    -.00601253       .00014366   -41.852   .0000   11.2224103
                       X4      |     .10537902       .00099697   105.699   .0000    .28923077
                       X5      |     .09936561       .00193122    51.452   .0000    .11538462
                       Constant|    -.69937761       .00729828   -95.828   .0000
                      ---------+Index function for Y3
                       X2      |     .17618484       .00372186    47.338   .0000    .34307692
                       X3      |     .00313226       .00014590    21.468   .0000   11.2224103
                       X4      |     .16844893       .00273264    61.643   .0000    .28923077
                       X6      |     .45816316       .00205320   223.146   .0000    .07692308
                       Constant|    -.55608418       .00080160  -693.715   .0000
                      ---------+Correlation coefficients
                       R(01,02)|     .00980709       .00137373     7.139   .0000
                       R(01,03)|    -.04914228       .00252476   -19.464   .0000
                       R(02,03)|     .39879742       .00144968   275.094   .0000
                      While the coefficients in this case are largely identical, the standard errors differ by a wide margin, so that all of the LIMDEP estimates are statistically different from zero, which is not the case for the Stata results. This means one would reach vastly different conclusions using the same dataset but different statistical packages, and that worries me. I am not suggesting the results should be exactly the same, but such wide differences must be worrying unless I am still doing something very wrong.
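                      For what it is worth, a cluster option in either package should be computing a sandwich estimator in which scores are summed within clusters before taking outer products; if one package were instead using the conventional (non-robust) variance, differences of this magnitude would be unsurprising. Below is a minimal sketch of that estimator in Python, using OLS for simplicity; the `cluster_robust_cov` helper and the toy data are illustrative assumptions, not either package's actual code.

```python
import numpy as np

def cluster_robust_cov(X, resid, cluster_ids):
    """Cluster-robust (sandwich) covariance, illustrating the kind of
    estimator behind Stata's cl()/vce(cluster) option (hypothetical helper)."""
    bread = np.linalg.inv(X.T @ X)                 # "bread" of the sandwich
    meat = np.zeros((X.shape[1], X.shape[1]))
    for g in np.unique(cluster_ids):
        Xg = X[cluster_ids == g]
        ug = resid[cluster_ids == g]
        score = Xg.T @ ug                          # scores summed within cluster
        meat += np.outer(score, score)             # "meat": outer product per cluster
    return bread @ meat @ bread

# toy data with a shared within-cluster shock, so clustering matters
rng = np.random.default_rng(0)
n_clusters, per = 30, 10
cid = np.repeat(np.arange(n_clusters), per)
x = rng.normal(size=n_clusters * per)
u = rng.normal(size=n_clusters)[cid] + rng.normal(size=n_clusters * per)
y = 1 + 0.5 * x + u
X = np.column_stack([np.ones_like(x), x])
b = np.linalg.lstsq(X, y, rcond=None)[0]
V = cluster_robust_cov(X, y - X @ b, cid)
print(np.sqrt(np.diag(V)))                         # cluster-robust standard errors
```

If one program reports this sandwich variance and the other reports the inverse-Hessian variance, the standard errors can differ substantially even with identical coefficients.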

                      Comment


                      • #12
                        Does Limdep have an option for robust standard errors?
                        Last edited by Richard Williams; 16 Mar 2016, 22:33.
                        -------------------------------------------
                        Richard Williams, Notre Dame Dept of Sociology
                        Stata Version: 17.0 MP (2 processor)

                        EMAIL: [email protected]
                        WWW: https://www3.nd.edu/~rwilliam

                        Comment


                        • #13
                           Yes, the specifications are identical in this case. I specify
                           cl(vid) in Stata, which is equivalent to
                           cluster=vid in LIMDEP.

                          Comment


                          • #14
                            Well, fitting the model with gsem seems to corroborate mvprobit over LIMDEP.

                             . version 14.1

                             . 
                             . clear *

                             . set more off

                             . 
                             . quietly input long id int vid byte(y1 y2 y3 x1 x2) float x3 byte(x4 x5 x6)

                             . 
                             . gsem (y1 <- x1 x2 x3 F1@1, probit) ///
                             >     (y2 <- x1 x2 x3 x4 x5 F2@1, probit) ///
                             >         (y3 <- x2 x3 x4 x6 F3@1, probit), ///
                             >         variance(F1@1 F2@1 F3@1) ///
                             >         covariance(F1*F2 F1*F3 F2*F3) ///
                             >         vce(cluster vid) nocnsreport nodvheader nolog

                             Generalized structural equation model           Number of obs     =        650
                             Log pseudolikelihood = -1242.8218

                                                                (Std. Err. adjusted for 66 clusters in vid)
                             ------------------------------------------------------------------------------
                                          |               Robust
                                          |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                             -------------+----------------------------------------------------------------
                             y1 <-        |
                                       x1 |  -.0099387   .0056126    -1.77   0.077    -.0209391    .0010618
                                       x2 |   .6703581   .1909002     3.51   0.000     .2962006    1.044515
                                       x3 |   .0221996   .0107933     2.06   0.040     .0010451    .0433542
                                       F1 |          1  (constrained)
                                    _cons |    .066338   .2802607     0.24   0.813    -.4829628    .6156388
                             -------------+----------------------------------------------------------------
                             y2 <-        |
                                       x1 |    .010885   .0062008     1.76   0.079    -.0012683    .0230382
                                       x2 |   .2451702     .11817     2.07   0.038     .0135613    .4767791
                                       x3 |  -.0085094   .0092751    -0.92   0.359    -.0266883    .0096694
                                       x4 |   .1483031   .1682906     0.88   0.378    -.1815403    .4781466
                                       x5 |   .1401936   .1834441     0.76   0.445    -.2193502    .4997375
                                       F2 |          1  (constrained)
                                    _cons |  -.9892853   .3036812    -3.26   0.001     -1.58449   -.3940811
                             -------------+----------------------------------------------------------------
                             y3 <-        |
                                       x2 |    .248106   .1186782     2.09   0.037      .015501    .4807111
                                       x3 |   .0044109   .0083846     0.53   0.599    -.0120226    .0208443
                                       x4 |   .2358778   .1620883     1.46   0.146    -.0818095    .5535651
                                       x6 |    .644986   .2969305     2.17   0.030     .0630129    1.226959
                                       F3 |          1  (constrained)
                                    _cons |  -.7837501   .1628827    -4.81   0.000    -1.102994   -.4645058
                             -------------+----------------------------------------------------------------
                                   var(F1)|          1  (constrained)
                                   var(F2)|          1  (constrained)
                                   var(F3)|          1  (constrained)
                             -------------+----------------------------------------------------------------
                                cov(F2,F1)|    .024902   .1552591     0.16   0.873    -.2794003    .3292042
                                cov(F3,F1)|  -.1052024   .1426721    -0.74   0.461    -.3848345    .1744297
                                cov(F3,F2)|    .807617   .1343455     6.01   0.000     .5443046    1.070929
                             ------------------------------------------------------------------------------

                             . 
                             . tempname root2

                             . scalar define `root2' = sqrt(2)

                             . foreach coeff in x1 x2 x3 _cons {
                               2.     lincom _b[y1:`coeff'] / `root2'
                               3. }

                              ( 1)  .7071068*[y1]x1 = 0

                             ------------------------------------------------------------------------------
                                          |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                             -------------+----------------------------------------------------------------
                                      (1) |  -.0070277   .0039687    -1.77   0.077    -.0148062    .0007508
                             ------------------------------------------------------------------------------

                              ( 1)  .7071068*[y1]x2 = 0

                             ------------------------------------------------------------------------------
                                          |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                             -------------+----------------------------------------------------------------
                                      (1) |   .4740147   .1349868     3.51   0.000     .2094455     .738584
                             ------------------------------------------------------------------------------

                              ( 1)  .7071068*[y1]x3 = 0

                             ------------------------------------------------------------------------------
                                          |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                             -------------+----------------------------------------------------------------
                                      (1) |   .0156975    .007632     2.06   0.040      .000739     .030656
                             ------------------------------------------------------------------------------

                              ( 1)  .7071068*[y1]_cons = 0

                             ------------------------------------------------------------------------------
                                          |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                             -------------+----------------------------------------------------------------
                                      (1) |    .046908   .1981742     0.24   0.813    -.3415063    .4353223
                             ------------------------------------------------------------------------------

                             . 
                             . foreach coeff in x1 x2 x3 x4 x5 _cons {
                               2.     lincom _b[y2:`coeff'] / `root2'
                               3. }

                              ( 1)  .7071068*[y2]x1 = 0

                             ------------------------------------------------------------------------------
                                          |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                             -------------+----------------------------------------------------------------
                                      (1) |   .0076968   .0043846     1.76   0.079    -.0008968    .0162905
                             ------------------------------------------------------------------------------

                              ( 1)  .7071068*[y2]x2 = 0

                             ------------------------------------------------------------------------------
                                          |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                             -------------+----------------------------------------------------------------
                                      (1) |   .1733615   .0835588     2.07   0.038     .0095893    .3371337
                             ------------------------------------------------------------------------------

                              ( 1)  .7071068*[y2]x3 = 0

                             ------------------------------------------------------------------------------
                                          |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                             -------------+----------------------------------------------------------------
                                      (1) |  -.0060171   .0065585    -0.92   0.359    -.0188715    .0068373
                             ------------------------------------------------------------------------------

                              ( 1)  .7071068*[y2]x4 = 0

                             ------------------------------------------------------------------------------
                                          |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                             -------------+----------------------------------------------------------------
                                      (1) |   .1048662   .1189994     0.88   0.378    -.1283684    .3381007
                             ------------------------------------------------------------------------------

                              ( 1)  .7071068*[y2]x5 = 0

                             ------------------------------------------------------------------------------
                                          |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                             -------------+----------------------------------------------------------------
                                      (1) |   .0991319   .1297146     0.76   0.445     -.155104    .3533678
                             ------------------------------------------------------------------------------

                              ( 1)  .7071068*[y2]_cons = 0

                             ------------------------------------------------------------------------------
                                          |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                             -------------+----------------------------------------------------------------
                                      (1) |  -.6995304   .2147351    -3.26   0.001    -1.120403   -.2786574
                             ------------------------------------------------------------------------------

                             . 
                             . foreach coeff in x2 x3 x4 x6 _cons {
                               2.     lincom _b[y3:`coeff'] / `root2'
                               3. }

                              ( 1)  .7071068*[y3]x2 = 0

                             ------------------------------------------------------------------------------
                                          |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                             -------------+----------------------------------------------------------------
                                      (1) |   .1754375   .0839182     2.09   0.037     .0109609    .3399141
                             ------------------------------------------------------------------------------

                              ( 1)  .7071068*[y3]x3 = 0

                             ------------------------------------------------------------------------------
                                          |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                             -------------+----------------------------------------------------------------
                                      (1) |    .003119   .0059288     0.53   0.599    -.0085013    .0147392
                             ------------------------------------------------------------------------------

                              ( 1)  .7071068*[y3]x4 = 0

                             ------------------------------------------------------------------------------
                                          |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                             -------------+----------------------------------------------------------------
                                      (1) |   .1667908   .1146138     1.46   0.146    -.0578481    .3914296
                             ------------------------------------------------------------------------------

                              ( 1)  .7071068*[y3]x6 = 0

                             ------------------------------------------------------------------------------
                                          |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                             -------------+----------------------------------------------------------------
                                      (1) |    .456074   .2099616     2.17   0.030     .0445569    .8675911
                             ------------------------------------------------------------------------------

                              ( 1)  .7071068*[y3]_cons = 0

                             ------------------------------------------------------------------------------
                                          |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                             -------------+----------------------------------------------------------------
                                      (1) |   -.554195   .1151755    -4.81   0.000    -.7799348   -.3284552
                             ------------------------------------------------------------------------------

                             . 
                             . foreach pair in "F2, F1" "F3, F1" "F3, F2" {
                               2.     lincom _b[cov(`pair'):_cons] / 2
                               3. }

                              ( 1)  .5*[cov(F2,F1)]_cons = 0

                             ------------------------------------------------------------------------------
                                          |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                             -------------+----------------------------------------------------------------
                                      (1) |    .012451   .0776296     0.16   0.873    -.1397001    .1646021
                             ------------------------------------------------------------------------------

                              ( 1)  .5*[cov(F3,F1)]_cons = 0

                             ------------------------------------------------------------------------------
                                          |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                             -------------+----------------------------------------------------------------
                                      (1) |  -.0526012    .071336    -0.74   0.461    -.1924172    .0872148
                             ------------------------------------------------------------------------------

                              ( 1)  .5*[cov(F3,F2)]_cons = 0

                             ------------------------------------------------------------------------------
                                          |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                             -------------+----------------------------------------------------------------
                                      (1) |   .4038085   .0671728     6.01   0.000     .2721523    .5354647
                             ------------------------------------------------------------------------------

                             . 
                             . exit

                             end of do-file


                            .
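                             A note on the rescaling in the lincom steps above, for anyone reproducing this: each gsem probit equation carries a unit-variance latent factor in addition to the standard-normal probit error, so the composite error has variance 1 + 1 = 2. Dividing the coefficients by sqrt(2) puts them on the usual probit scale, and dividing the latent covariances by sqrt(2)*sqrt(2) = 2 turns them into the correlations that mvprobit reports. A quick arithmetic check in Python, with the values copied from the listing above:

```python
import math

# gsem composite error: probit error (var 1) + latent factor (var 1) => var 2
scale = math.sqrt(2)

# [y1]x2 coefficient from the gsem table, rescaled to the probit metric
gsem_x2_y1 = 0.6703581
print(gsem_x2_y1 / scale)   # ≈ 0.4740147, matching the lincom result above

# latent covariance rescaled to a correlation by dividing by sd*sd = 2
cov_F3_F2 = 0.807617
print(cov_F3_F2 / 2)        # ≈ 0.4038085, the rho32 estimate
```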

                            Comment


                            • #15
                               Fred: I don't have much extra to add at this stage; I was unable to contribute further yesterday because of other commitments. I just want to acknowledge and thank you for providing the additional information, and in the form requested. I would also like to acknowledge and salute the substantial contributions from Rich Williams and Joseph Coveney.

                               Good point by Rich about needing to specify the seed (I don't know what choices LIMDEP allows there). See the Cappellari and Jenkins article on mvprobit in the Stata Journal, 2003, for more about the sensitivity of results to the choice of seed. Interesting stuff from Joseph about the gsem estimates! Is gsem using Halton draws?

                               A couple of further options for Fred to look at in order to compare his results with LIMDEP:

                               (1) The Cappellari and Jenkins article in Stata Journal 6(2), 2006 (a free download), with associated code that can be used to fit MV probit and other models requiring evaluation of multivariate normal integrals. We provide a plugin too (this helps speed). It allows Halton draws as well.

                               (2) cmp by David Roodman (on SSC). It can also fit MV probit models, using MSL and Halton draws.
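                               For readers unfamiliar with the distinction raised here: Halton sequences are deterministic low-discrepancy sequences that cover the unit interval far more evenly than pseudo-random uniforms, which typically reduces simulation noise in maximum simulated likelihood for a given number of draws. A minimal sketch of the construction (illustrative only, and not the code used by cmp or the SJ plugin):

```python
def halton(n, base):
    """First n terms of the Halton sequence for a given prime base,
    the low-discrepancy draws used by some MSL implementations."""
    seq = []
    for i in range(1, n + 1):
        f, r = 1.0, 0.0
        # reflect the base-b digits of i about the radix point
        while i > 0:
            f /= base
            r += f * (i % base)
            i //= base
        seq.append(r)
    return seq

# base-2 Halton fills (0,1) evenly: each new point bisects the largest gap
print(halton(7, 2))  # [0.5, 0.25, 0.75, 0.125, 0.625, 0.375, 0.875]
```

Two simulators using the same number of draws but different draw types (or different seeds for pseudo-random draws) will generally not reproduce each other's estimates exactly, though the discrepancy should shrink as the number of draws grows.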

                              Comment
