
  • Mean

    Hi all,

    I am running a simulation in Stata and would like to obtain a single mean from the code below. In particular, the code displays 100 values given by
    Code:
    (y_test`i'-(bx_1_`i'*x_1_test`i'+ bx_2_`i'*x_2_test`i'+bx_3_`i'*x_3_test`i'+bx_4_`i'*x_4_test`i'))^2
    What I would like to do is take the mean of these values (their sum divided by 100) and display it at the end of the loop... can you please help me?

    Code:
    forvalues i = 1/100 {
        gen bx_2_`i' = 0
        gen bx_3_`i' = 0
        gen bx_4_`i' = 0
        quietly reg y_tilde_train_eq`i' x_1_train x_2_train x_3_train x_4_train
        matrix coefficienti_`i' = e(b)
        gen bx_1_`i' = coefficienti_`i'[1,1]
        replace bx_2_`i' = coefficienti_`i'[1,2]
        replace bx_3_`i' = coefficienti_`i'[1,3]
        replace bx_4_`i' = coefficienti_`i'[1,4]
        display (y_test`i'-(bx_1_`i'*x_1_test`i'+ bx_2_`i'*x_2_test`i'+bx_3_`i'*x_3_test`i'+bx_4_`i'*x_4_test`i'))^2
    }
    Many thanks!

  • #2
    Can we clarify what you want here? Your code carries out 100 "training" regressions and then applies the resulting regression model to a set of _test predictors and a _test outcome. You then calculate the difference between the value of y_test and the "predicted" value (using the saved coefficients applied to the new x_*_test variables) in the first observation of the data set only, and display that. Why do you want to emphasize the fit of the model only in the first observation of the data set? Is there something special about that one that warrants special treatment? (Maybe there is, and that's fine, but I'm asking because this would be unusual, and it is more likely you are under a mistaken impression of what your -display- command is going to show you.)

    Is your question that you actually don't want the results from the first observation, but rather the average result over all the observations to be -display-ed? (That is not going to be the average of 100 observations, but the average of however many observations are actually in your data set, and it has nothing to do with i going from 1 to 100, unless by coincidence you have exactly 100 observations in your data.) Also, is there any reason you are saving the coefficients from 100 different regressions both in 100 matrices and in 100 different sets of 4 variables? It seems to me that at most you need to save either the matrix or the variables, but not both. Maybe you don't even need either once you have calculated the expression you want to -display-?
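    To make the distinction concrete, here is a minimal sketch for a single set, reusing the variable names from #1 (it assumes those variables already exist in memory): -display- with an expression involving variables evaluates it in the first observation only, whereas generating the squared residual as a variable and -summarize-ing it averages over all observations.
    Code:
    * sketch only, for a single set i, with the variable names from #1
    local i 1
    * shows the squared residual in the first observation only
    display (y_test`i'-(bx_1_`i'*x_1_test`i'+bx_2_`i'*x_2_test`i'+bx_3_`i'*x_3_test`i'+bx_4_`i'*x_4_test`i'))^2
    * averages the squared residual over all observations
    gen double sq_err_`i' = (y_test`i'-(bx_1_`i'*x_1_test`i'+bx_2_`i'*x_2_test`i'+bx_3_`i'*x_3_test`i'+bx_4_`i'*x_4_test`i'))^2
    quietly summarize sq_err_`i'
    display r(mean)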



    • #3
      Sorry for not having been clear enough.

      Basically I have 10 training sets and 10 test sets. What I am doing is just regressing the training equations and saving their coefficients in
      bx_1_`i', bx_2_`i', bx_3_`i', and bx_4_`i'. Finally I use these coefficients to compute the squared error:
      Code:
         
       (y_test`i'-(bx_1_`i'*x_1_test`i'+ bx_2_`i'*x_2_test`i'+bx_3_`i'*x_3_test`i'+bx_4_`i'*x_4_test`i'))^2
      Of course I have generated, outside the loop, 100 y_test expressions. My question is the following: the above code produces this:
      Code:
      .0675106 .0348595 .00395895 .0483873 .00201037 .00181968 .00648204 .00145363 .00020433 .07100827 .01489778 .00016131 .01093546 .13251386 .06262481 .12954758 .01696025 .00006467 .01733305 ...
      100 such values, which are the result of:
      Code:
         
       display (y_test`i'-(bx_1_`i'*x_1_test`i'+ bx_2_`i'*x_2_test`i'+bx_3_`i'*x_3_test`i'+bx_4_`i'*x_4_test`i'))^2
      What I would like to do is take the average of these values produced by the display command... is there a way I can do it? Thank you very much!
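      If it really is the average of those 100 first-observation values that is wanted, one possibility is to accumulate them in a scalar inside the loop instead of -display-ing each one. This is a sketch only, under the assumption that the first-observation values are the ones needed; the elided steps are those of the loop in #1.
      Code:
      * sketch only: the explicit [1] subscripts make the first-observation
      * evaluation visible instead of implicit
      scalar total = 0
      forvalues i = 1/100 {
          * ... training regression and coefficient steps as in #1 ...
          scalar total = total + (y_test`i'[1]-(bx_1_`i'[1]*x_1_test`i'[1] ///
              +bx_2_`i'[1]*x_2_test`i'[1]+bx_3_`i'[1]*x_3_test`i'[1] ///
              +bx_4_`i'[1]*x_4_test`i'[1]))^2
      }
      display total/100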



      • #4
        Sorry, I'm even more confused now.

        Please show:

        1. An example of your starting data using the -dataex- command. Be sure to include test and training data that are related to each other. (You do not have to show every training and test set, just a couple will be enough. But make sure that the test data you show are intended to be used with the training data you show.) Also please indicate the exact number of observations that each test and training data set has in your complete data.

        2. If the code shown in #1 is not exactly the code you are running, please now post the exact code you are running.

        3. Please show the exact and complete output you are getting from the exact code you are running.

        4. Explain what you expect the numbers you are getting from the -display- command to represent. As best I can tell, they are each calculated from only a single observation in the data, so I question whether they are really useful, and, therefore, I also question how the mean of these 100 numbers would be meaningful.

        I am unable to see how the code you show in #1 can produce the results shown in #3 if there are 10 test and 10 training data sets--so I am missing something. With example data and exact code, I can try it myself and see what's going on, and then I can try to help you. I could easily show you code that will just get you the mean of the 100 numbers you are getting--but as I am unable to make sense of those 100 numbers, I prefer not to do that. I want to understand what is going on, and then help you get the mean of whatever the numbers you really need are (which may turn out to be the same 100 you have now.)





        • #5
          OK, I am sorry for that; I should have been clearer. The training and the test sets are actually 100, not 10 (I was wrong before).

          1) The training set here is just supposed to provide the coefficients b, which I will use to compute the squared error. For the sake of clarity I will show just 3 y_tests, with the related training sets used to compute the coefficients b:

          Code:
          * Example generated by -dataex-. To install: ssc install dataex
          clear
          input float(y_tilde_test_1 y_tilde_test_2 y_tilde_test_3 y_tilde_train_eq1 y_tilde_train_eq2 y_tilde_train_eq3)
           14.316642  17.311266  30.35335  -5.678215  -9.667219  -7.507145
            5.598597  14.668008 23.375605   4.293957  4.4811826  2.3048542
            -9.63813   23.89645  23.66499 -2.4407635  1.1786336   .5785112
           17.427608   24.89078 33.969574   8.144305   2.480196   3.431017
             8.23625  12.654618  24.25597    3.51475   2.638379  1.1128054
           1.4499726   25.44029 23.838974  -5.233748  -14.30823  -7.273131
           -4.805116   .6380095 10.707998  15.801363  16.188463  13.750708
            8.726369  18.947676 16.280273   3.094641   6.364878   1.208039
           9.2333555   21.46032 33.081985  14.344335   4.943889   9.007656
           3.5836806  17.208447 23.288963 -1.0660337 -1.5517355   .9560136
           4.6732235  32.408894 13.686934  1.1027112 -3.6934524   .9682107
            3.007774    24.8413 31.485973  -7.492452  -8.702329  -5.253339
           -1.842861   21.01994 15.208338  -5.773968   -.629715 -1.6544924
           -.7880877    11.3722 15.207412  1.6765296   3.190321   2.760694
            8.826948  2.6163034  16.63315   3.187879   4.423594   6.210862
            6.143428   12.83803   23.4867  -9.381782  -10.09383 -10.015352
            2.910725  18.781612 16.369644  11.440895  13.473182   9.933634
          -.01910355   6.190617 17.099686  11.351033  12.988806  10.388487
            5.869545  20.391165 18.962946   .3842752   .5830765 -2.1133516
             8.42517 -1.9617815 32.297283   4.000456   9.030897   8.532361
           1.0256265    14.3256  25.49604  -4.411503  -7.468643  -5.015834
            7.220366   7.819638   23.9109   1.555078  1.9309056   3.902295
           .05553389  37.629707  24.09683  2.0271823   5.637525   8.452129
            6.899018   7.036071 22.266565  1.9859043   4.046828   .9068498
            5.162345   22.00786  12.35621    10.6897   5.484067    7.59752
            5.767989  21.819736 21.400476 -4.1847405  -2.972531 -1.1257273
           13.469367   12.43255  29.83628 -3.0618155  -2.948586  -6.785152
           4.7499595   22.64454 27.957144  -.9241264 -2.2141244  -3.170983
             5.67865  30.125263 19.049372   7.079779    8.41538   6.414203
            7.258149   8.874355 22.437277  17.782312  13.653555  17.293983
            6.426909  10.216143  20.95597   6.921942  -.7457681  .12332442
           .16094235   32.72918  26.06899  .05704967   -1.80385  .01099224
          -11.764844  18.904703  34.68201  3.5335085   9.183543  11.838705
            8.353702   21.08388   19.5036  12.797104   9.784642  13.990294
           -3.848245  20.097565  20.65517   6.994652   9.696981   7.772275
           -9.956572   32.50397 23.210983   .3289531  1.0639226   3.671172
          -2.5094004  29.184875  23.93684  1.9706045  1.1190509  -1.098702
            .4762815  14.863487  23.78156   3.292145   4.799315  2.5338364
          -.26653084   9.005189  25.60933  4.0617933   3.920976  3.1350725
            3.236763   6.564818  28.45986 -.06499314 -1.0439804   2.660736
           4.4543114   29.55748  20.69601 -2.2985861   .7914088  -.9309758
           3.4102495   40.47058 15.379848  14.070042  17.171366  14.420473
           1.8300363   21.32685  24.61477   7.839327    9.75312  10.347823
           10.286015   8.091856  26.91678   3.759911 -3.9064496  -.7361298
            8.394764  18.890694  25.97238 -1.0202087   4.174401   .6810671
            2.761771   5.217161 17.867931   9.300408  12.328524  13.310615
           12.791423  16.859251 35.728886  -.9581565  -5.239486 -1.6186153
            9.626434  13.430263 28.582075  1.1727715   3.219203   6.048301
          -1.5476307   31.53469  26.67514  1.9203595    7.12482    4.30014
            6.452813  18.394133  25.35271  11.444448  11.546082  10.092876
           15.746616   14.31916 31.467327  1.8559272   .6462361  1.7869537
           -4.184621  30.954933  27.11541   5.372938  2.0195243     6.1769
           19.868666  29.768835 22.629475  11.507782    5.12764   5.896374
           -5.239051    44.5772  25.97957  2.4771886   1.079835 .005654237
           -.5469622   22.28993   19.8599   -5.25029 -3.4062355 -1.8635268
           -.9875136  16.277676 25.397005 -3.0878334 -1.4784638  -.9753633
           4.4574523    28.5917  33.69266   5.422514   5.306523   5.547367
          -1.3698035   44.64806 13.807335   4.930118   5.287953   6.309356
          -4.3625126  22.760817  21.55873   1.021275  4.0190735    5.84141
            6.222983  31.549316  25.50529   5.251795   4.863791   5.657416
            7.622895  23.267767 28.317236   12.06782   7.511736   13.52646
            8.109869  23.411766  16.17102  4.1472735  4.0217743  2.8844235
           11.355505   28.60299  22.20508   -4.39428  -4.262612   -4.64674
           1.2778815   21.19937  20.66929 -3.4711804  -2.019245   1.552003
            3.961429  -.9590269  38.23862   8.761336   3.004428    9.04445
            5.473168  26.087053  10.69578   9.304689   6.941305   4.775054
            6.590372  13.219144 22.721586   8.143937   8.446115   6.214111
          -1.3582414   1.222828  16.30097 -3.5071015  -.4795649  1.1805141
            2.562342   21.41541 21.549406    9.18734   8.985153    9.53186
            .8037827  20.781906  30.20468  13.992144  13.510776   15.61888
            11.07561   26.81649  19.97545   8.889474   11.97317   5.920058
            4.895733  24.808514  19.78355    10.7835  12.420574   8.213948
            9.435315  13.524135 14.472455   6.316013    6.27883  4.4387507
           -.5535343   9.564178  28.43387   2.143365    5.09418   4.285097
            8.567056  16.600428 20.832457   1.242173   3.477687   .4378385
           15.008123   7.831995 19.044157  -.8023006  -4.661063 -2.0424232
            1.164239  1.3158286 29.462486  15.643858   13.30173  19.262196
           1.5156246  15.031084 26.229185  3.0132954   4.894832  -1.483021
            10.03902    15.9664  25.31188  1.0436313  1.8761377  -.5888593
           14.333726   11.56717 23.111736   12.97899   10.28966  11.929094
            6.184639    23.2866  26.61866   7.848447    5.06356   3.579583
            5.904042  29.620165  32.37738  .10223951   .6701006   5.099922
           19.052305   9.720041  10.52034   8.329021   6.889057   6.432611
            -3.77154   36.16204 17.213282   9.530327   8.192802   6.181785
            .6846251  11.574862  23.26403  10.229602   7.363417   5.329079
          -1.0600654  17.580528 29.084124  1.1222844  -.6274094   .8383828
            2.882244   38.71186  28.24893  12.004786  11.074103     11.835
          -1.2960554   24.24663  22.05527  -3.577111 -2.6083755 -2.2985032
           16.355972   15.82321 17.583252   6.577126   3.618231   5.920418
          -4.1072445  11.534518 22.087564  3.8387966   5.416826   6.484909
            3.180989   15.76491  31.22968  -7.840789  -5.799705 -1.5120958
           -10.92465  27.830975  34.53651  2.6258016   5.429988   6.223716
           10.792866    43.7271 16.718235  .16997983 -.30726105  .52864873
           4.1206193   5.764264 19.543476   .9987981  4.4137616   2.752762
            5.415081  16.371178 25.799494   5.298575  2.1479046   4.348016
            22.50936  22.856764 20.451424   7.499469   6.164132   4.115868
            7.483903  15.483662 18.136892 -4.4110785  -5.795539  -5.568447
            6.933421 -13.580204 19.308786   7.894638   9.160413   6.469543
           14.531067   27.87213  31.00574  -2.505311  -.5085866  .09335907
            9.705017  13.339443 16.610966    5.07276   3.819671   3.845308
          end
          Both the training and the test sets have 500 observations.

          2) The code shown in #1 is the one I wrote, but it does not actually do what I would like: in particular, instead of displaying
          Code:
           
           (y_test`i'-(bx_1_`i'*x_1_test`i'+ bx_2_`i'*x_2_test`i'+bx_3_`i'*x_3_test`i'+bx_4_`i'*x_4_test`i'))^2
          100 times, I would like to have the mean of the values found with:
          Code:
           
           (y_test`i'-(bx_1_`i'*x_1_test`i'+ bx_2_`i'*x_2_test`i'+bx_3_`i'*x_3_test`i'+bx_4_`i'*x_4_test`i'))^2
          3) The exact output I am getting by running #1 is the one in #3, that is, the result of the display command (a series of 100 squared errors of which I would like to compute the mean):

          Code:
           
           .0675106 .0348595 .00395895 .0483873 .00201037 .00181968 .00648204 .00145363 .00020433 .07100827 .01489778 .00016131 .01093546 .13251386 .06262481 .12954758 .01696025 .00006467 .01733305 ...
          4) In summary: I have 100 training and 100 test sets. With the training sets I compute the coefficients (the bx_1_`i', bx_2_`i', bx_3_`i', and bx_4_`i'), which I then use to compute the squared error. However, in my actual loop (in #1) I use the display command, which just displays the 100 squared errors. What I would like is the mean of the displayed squared errors (the ones in point 3), without of course having to copy-paste them from Stata's output window and compute the mean manually. One possible approach is sketched below.
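          One way to do that without copy-pasting, sketched under the assumption that the displayed first-observation values are really the ones wanted, is to -post- each squared error to a results file inside the loop and -summarize- afterwards (the tempfile and handle names here are arbitrary):
          Code:
          * sketch only: collect the 100 squared errors with -postfile-,
          * then load the results and take their mean
          tempfile results
          postfile handle sq_err using `results'
          forvalues i = 1/100 {
              * ... training regression and coefficient steps as in #1 ...
              post handle ((y_test`i'[1]-(bx_1_`i'[1]*x_1_test`i'[1] ///
                  +bx_2_`i'[1]*x_2_test`i'[1]+bx_3_`i'[1]*x_3_test`i'[1] ///
                  +bx_4_`i'[1]*x_4_test`i'[1]))^2)
          }
          postclose handle
          preserve
          use `results', clear
          summarize sq_err    // r(mean) holds the average of the 100 values
          restore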


          Is it better now?
          Sorry again!



          • #6
            OK, I think I get it now. You are fitting a regression of y_tilde_train_eq on x_1_train, x_2_train, x_3_train, and x_4_train, and then you want to use those coefficients to calculate the mean square residual of y_tilde_test against x_1_test, x_2_test, x_3_test, and x_4_test. You have 100 such sets of training and test data, and want the mean square residual from each.

            So, there is a better approach. Like many things in Stata, this will go more smoothly in long layout than wide, with the "sets" stacked "vertically" rather than spread out "horizontally." No need for an explicit loop, and the variable names get simpler.

            The data example you posted does not contain any x variables. So what I've done is create a new data set using your y variables (at least for sets 1 and 2) and generated some random numbers for the x-variables just to demonstrate the code. I dropped the third set because -dataex- can't output a record that wide. I understand your real data has 100 sets: that won't be a problem here as the code below never explicitly refers to the set number, and in all other respects follows your variable naming.

            Code:
            * Example generated by -dataex-. To install: ssc install dataex
            clear
            input float(y_tilde_test_1 y_tilde_train_eq1 x_1_test1 x_1_train1 x_2_test1 x_2_train1 x_3_test1 x_3_train1 x_4_test1 x_4_train1 y_tilde_test_2 y_tilde_train_eq2 x_1_test2 x_1_train2 x_2_test2 x_2_train2 x_3_test2 x_3_train2 x_4_test2 x_4_train2)
             14.316642  -5.678215    .9472316   .50994456  .14044325   .6042549   .7310948   .2699666  .7852077   .5596151  17.311266  -9.667219    .05222337   .5060895   .9587523    .810038   .3303585   .6431971   .6417272   .28304344
              5.598597   4.293957    .9457484    .7252148   .9789942   .5555237   .6849202     .85385  .7425904   .4270143  14.668008  4.4811826    .18564783  .04936334  .15681207   .9884044   .5731694  .20402233   .4913769    .3181391
              -9.63813 -2.4407635    .8825376   .05717023  .39555255   .3636598   .7497337   .3011794  .3420951  .33563355   23.89645  1.1786336     .9440776  .01275296   .6708255   .1770708   .7770451 .018862754  .04190454   .12751144
             17.427608   8.144305    .7505445    .6139446  .23890825   .8641682  .13504642  .22072293  .8382239  .19717167   24.89078   2.480196     .9484983   .0936002   .4753942   .5561476   .1764596   .9122161    .728659    .3164386
               8.23625    3.51475    .4809064    .6263985    .743752   .3854215   .8553503   .3306408  .6578231   .9353576  12.654618   2.638379     .9763448   .3676016   .4282821   .8021439   .6332538 .016922284 .018739017    .6911414
             1.4499726  -5.233748    .7655026    .6288581   .9432667   .7951117   .6639598   .8240759  .7877514   .8633687   25.44029  -14.30823   .035859343  .05571201    .866712  .03111919  .21009286   .3983913   .7967453    .7509952
             -4.805116  15.801363   .21017867    .1729921   .8287497    .914399 .027723813    .689195  .7464123   .3312512   .6380095  16.188463     .6616006   .6544085   .6819412   .3191354  .58379954  .54759485   .3028584    .6972145
              8.726369   3.094641    .9874877    .9159671   .7628713   .8213199   .6977083   .8836725  .9221079   .3301727  18.947676   6.364878     .2985383  .14597209   .7689802  .37677035   .7015807   .8829721   .5340822   .03908192
             9.2333555  14.344335   .08119465    .8844633   .9456708    .479256   .9099222  .59328413 .44407755    .495029   21.46032   4.943889      .660214   .8159728 .026154125   .3467175  .01045863   .7197391   .8170773   .11128785
             3.5836806 -1.0660337  .019561933   .04294183  .28832573   .8422471  .59517455  .07054139  .8752422  .20595577  17.208447 -1.5517355    .20679154   .5233816   .6076019   .8606435   .9645109   .8719822   .3164589  .015902407
             4.6732235  1.1027112    .7800556   .30335885   .4396087  .07067948   .8313749  .19818223  .7867025   .9187906  32.408894 -3.6934524   .015391795   .7084118   .3154456   .3810022   .9577004  .15796787   .3109601    .3814307
              3.007774  -7.492452    .6551434   .29127693   .4645921   .1611242  .28253615   .6048834  .1572782   .4756601    24.8413  -8.702329    .53569984   .9030942   .3600964   .8056107   .7430707   .6139817   .9764298    .6297441
             -1.842861  -5.773968     .356696    .7255588   .3865401  .10907575  .23971276   .1087524 .27406338   .5444241   21.01994   -.629715     .7333733   .6869112   .5671254   .8136181  .18659315  .58522475    .406289   .10940401
             -.7880877  1.6765296    .8314545    .9708208   .8731089  .07766458   .5279128    .736183  .0514437  .50533026    11.3722   3.190321     .2258395   .9968397   .9169284  .03878704   .3166331   .1112451   .6292545    .1617824
              8.826948   3.187879    .9605365    .9460785   .3049064   .2436159   .4593352   .9641259  .8189144   .8138312  2.6163034   4.423594     .8826448   .1181663   .1725679   .5372565   .6843136    .381859    .748471   .08262751
              6.143428  -9.381782    .3513183   .21558407   .6228138  .55576116   .7360972   .5083513  .6958231   .3443114   12.83803  -10.09383     .3247132   .9748631  .49693385  .10580768   .6223538  .02476579   .8816913    .3188482
              2.910725  11.440895    .7021965    .6886892   .2484271 .032400504   .6399885  .18233414  .3584895  .22510156  18.781612  13.473182     .8886811 .017226761   .4819366   .3576823   .9844927   .8766989   .3826619    .6538183
            -.01910355  11.351033    .7012697    .3044476  .53858155  .19255306   .9325299   .6801886  .9933306   .3513104   6.190617  12.988806   .017440306  .23746087  .34410045   .4323122    .997847   .5828864  .20823857    .4844742
              5.869545   .3842752    .1590071    .8042786   .7685767    .591908  .48356155   .2710577  .3135511   .6470584  20.391165   .5830765    .24487182  .32951865  .04372491   .4765914   .3219148  .05568338   .8554016    .9449332
               8.42517   4.000456   .10421273    .7527171  .13601346   .5036635   .3937406  .59113914 .11234523   .8859124 -1.9617815   9.030897    .20672686   .8849649   .8765211   .9055776   .8164402   .8660638   .2748405   .04525834
             1.0256265  -4.411503     .587661    .7215877   .4868949  .58578867   .9434718    .167003 .18075293  .50584596    14.3256  -7.468643    .58383894   .6140801  .10542974    .483955 .023484446   .3954102   .9000874    .6142519
              7.220366   1.555078     .421985    .7384753   .9535264    .719876  .18560483   .6989383  .1626574   .8217133   7.819638  1.9309056     .7049119   .8670255   .9373983   .0982142  .13524559  .19270226   .6975022    .5908164
             .05553389  2.0271823    .9762383   .27689284  .28218526   .8562716  .14018837   .9243227  .5589662  .48594835  37.629707   5.637525      .470753  .11522907   .8330294   .1727472  .09665374   .5441385   .3125472    .3630925
              6.899018  1.9859043     .058898    .1559232  .09017162   .2703315    .483214   .8896965  .6087737   .4337145   7.036071   4.046828    .40200615   .6180297   .6199465  .10923272   .0792642  .53485274   .3141146    .9974476
              5.162345    10.6897    .2021227   .14040683  .11020612  .13858472   .9641232   .3728328  .6564038   .2379806   22.00786   5.484067     .3003853  .25311214  .18212424  .10189297   .7550149  .38358605   .9056385    .5423845
              5.767989 -4.1847405    .3215277   .29047707  .51571167  .24606892   .9234465    .472235   .888424  .28947598  21.819736  -2.972531   .006623633  .24378547   .4261419   .3578495   .5210407  .02438328   .4725561    .7031847
             13.469367 -3.0618155 .0089331865   .13427253     .67599   .5132767  .22457673   .4425471 .11266942 .014299762   12.43255  -2.948586     .3961149    .228667   .7907576  .29463893   .4160217   .5734851  .08901233    .9813368
             4.7499595  -.9241264    .4874784    .8724514   .5233868   .3092771   .5327611   .5432316  .9822041  .09760275   22.64454 -2.2141244     .7211111  .08731584   .6790614   .3306719   .5082545  .05923644   .5563002    .3145017
               5.67865   7.079779    .3321862   .08382535   .9805942   .6233375   .8039415  .54892385  .2915758   .7838277  30.125263    8.41538     .6343603   .6773128 .006442578    .661827   .7736661   .4310149   .6831278    .3159737
              7.258149  17.782312    .4043443   .09111737  .08803286   .8424903   .3428416   .8264416  .9981554   .3737929   8.874355  13.653555    .12353604   .7306813   .0883287   .1320511   .9306643  .06914135    .964214   .08583455
              6.426909   6.921942    .4575412   .21465635   .1747513   .4821788   .3253377   .4515846  .3783087   .9790254  10.216143  -.7457681     .4056266   .6058227   .9034312   .6063462   .3014514   .2467205   .8814751    .9934787
             .16094235  .05704967    .7559011   .14340399    .785883   .3339492  .20994224   .2398703 .22228684   .3931308   32.72918   -1.80385     .9501313    .435995  .27025303  .40160185   .4051726   .6585213  .14298573    .3599437
            -11.764844  3.5335085   .15291964    .7886472   .9142247   .1003559    .668242   .2545229   .433085     .90615  18.904703   9.183543     .9149532   .5077908   .2268948   .3746595   .7324044   .9842812   .4680491    .5823215
              8.353702  12.797104    .3250927    .6631094   .6455531   .1056969   .3007942   .7346343  .6399103  .22554666   21.08388   9.784642     .5950673   .5595049   .1594333   .8418582   .9127513   .6226699   .8328142    .8810913
             -3.848245   6.994652    .9440667    .8112432   .9261009  .18701406   .2001079  .06971657 .06532325    .941138  20.097565   9.696981     .3617445   .3741969 .014155326   .9287853   .3136128   .8265495  .22167833    .8701695
             -9.956572   .3289531   .17099117    .8240173 .036537822   .7945114   .7940629   .4771245 .05928829   .4445741   32.50397  1.0639226     .8399968   .5675536   .7539816   .2209181   .5450702  .12656611   .7714597     .679123
            -2.5094004  1.9706045    .6692844    .3172322   .3332068    .828171   .6548323  .20133163 .02450456   .7080989  29.184875  1.1190509     .2747885  .56646615   .6313942   .7700577   .6073816  .14074817   .7881832    .2969115
              .4762815   3.292145   .14427297 .0003778321   .4530721   .5716221    .564894  .26376486  .5699531    .039887  14.863487   4.799315     .7595631    .968271   .7251238   .6925346  .07482947   .7389464   .6923009   .02238756
            -.26653084  4.0617933   .20126833   .26596707   .9704043   .6523324  .12392586   .8482223  .5638601   .4782776   9.005189   3.920976     .9941382   .0751499   .5584149   .0561162   .4183748   .8721747  .22396973   .24290386
              3.236763 -.06499314  .021285385   .06338029  .08739574   .3522049   .3264685  .10566724 .16754876     .22255   6.564818 -1.0439804     .7446451   .6293616    .733624   .2212867   .5599613   .8622605   .7534544   .40280235
             4.4543114 -2.2985861    .7526946    .4405374  .09465648 .027500056  .11593458   .6248797 .12910965  .06655985   29.55748   .7914088      .627162   .8109628   .4229197   .5845691   .3340978   .3626462   .8763643   .14471315
             3.4102495  14.070042   .18422593    .7352452   .5481788   .6653894   .3383161   .7358543 .58058304   .4775105   40.47058  17.171366    .28977874   .4810134   .9798591   .8521158   .4159498   .7259645   .4916533   .10420745
             1.8300363   7.839327   .58717006   .59292585  .05910619  .19647554   .6606141   .4723726  .5773602  .23679903   21.32685    9.75312    .24046357   .5985539   .0996056    .306455   .4785972   .7931526  .24331486    .8870184
             10.286015   3.759911    .5621675   .27223426    .824973   .6219567   .3500348   .1761952  .6351413   .7051558   8.091856 -3.9064496     .5482864   .4894108   .1898475   .2384248   .7279884 .014506968  .16648263    .7252631
              8.394764 -1.0202087    .3109555     .997188   .9115309   .3718634   .4626978   .3123728  .2412109   .8972312  18.890694   4.174401    .25469995   .7967921   .7755148    .423503  .00592006   .8763096   .3877745    .8629562
              2.761771   9.300408    .1083876   .11192109   .4769292  .56653476   .7760074  .11952494  .8126563   .7290687   5.217161  12.328524     .3790731   .6989579    .538917   .7172167    .539361  .14394686   .9367798   .59473366
             12.791423  -.9581565    .8641324   .09149858  .06741554   .3351752   .7277995   .6002021 .07937538   .4591964  16.859251  -5.239486    .14159387    .561728  .07249633   .9083919   .3045817   .1387359  .07946966   .59056205
              9.626434  1.1727715    .9604624    .7447154   .6219077 .020367345   .6014288  .13781105  .5999127   .8133337  13.430263   3.219203     .4984015   .2766567  .57722485   .2062813   .6155252   .9715033   .3147927    .0969545
            -1.5476307  1.9203595   .04213694    .8432269    .275322   .7234589   .2753556   .2866758  .3709933   .5179286   31.53469    7.12482     .8171654   .5201123  .09848589  .23686907   .7500126   .7968649   .1543517    .6491948
              6.452813  11.444448    .7544023    .9043961  .27632305   .5783467   .2907629  .28925544 .41944245   .8283944  18.394133  11.546082      .766519   .0485412   .7472461   .4369725    .717357  .05358746   .9423221    .5227205
             15.746616  1.8559272   .11324778    .4373816   .6981434 .016439106   .8185375   .2923366 .06149925   .6049325   14.31916   .6462361     .4035936  .56283796   .4268949   .8407211   .8634841   .6505014  .21453394    .4932292
             -4.184621   5.372938    .5731504    .9189357  .24318904  .51008964   .8981422   .3140825  .1936884  .08269428  30.954933  2.0195243    .09911993   .4308617   .3388348  .17939274   .9985999   .8643972  .15732403  .012055526
             19.868666  11.507782    .3708564    .8466811   .7387884   .7959539  .06536807   .8004441  .8110216  .09258007  29.768835    5.12764     .6126292  .11466528   .9015868   .3944671   .9443818   .7452327  .07473308    .2564671
             -5.239051  2.4771886   .56029755    .8354875  .13865733  .10716458   .3708541 .016640136  .8141925   .6731762    44.5772   1.079835     .4706297   .6850701   .7615869   .8264376   .3563417   .7108168   .8328666    .5752788
             -.5469622   -5.25029    .4280561    .9462112  .24224482   .8680494   .8812542  .10232464  .9232002  .21599686   22.28993 -3.4062355     .6700398   .0815424   .4551902  .14455774   .9569796  .02665509   .9563246    .7916105
             -.9875136 -3.0878334   .25556746   .18490495   .8161218   .6878968    .413769  .25692734  .3270817   .7775031  16.277676 -1.4784638     .4384699   .2786369   .3541025   .7235396   .7385719   .0819169   .3774224    .9515697
             4.4574523   5.422514    .0935955    .2025986  .53906804    .391613  .12305644  .56198364  .3054166   .8168428    28.5917   5.306523    .02711774  .09391225   .8934913  .23302703  .18932204   .3869696   .8130016   .11921179
            -1.3698035   4.930118   .11855792    .0855845   .2353709   .4497469   .3475428   .6811972 .09028915   .9388226   44.64806   5.287953     .7919037    .636588  .04449461   .5421417   .7142631   .8354042   .4645287    .6047148
            -4.3625126   1.021275    .8575507    .8973282   .3372553   .9654579   .0243434   .4403451 .10040925   .8585112  22.760817  4.0190735     .8894401    .723623   .5547228   .6085058  .12494684  .52077234  .19632107   .04478481
              6.222983   5.251795    .3910225    .7959391 .027716277 .033420935   .9826987   .4915616 .35265315    .592893  31.549316   4.863791      .372983   .6629182  .03942971   .8869545   .3895595   .6390074  .09108885    .6866304
              7.622895   12.06782    .8159919    .8502334   .3553317   .1719129   .6903236    .517143  .9041818  .06695043  23.267767   7.511736 .00028821192   .7269681   .6222833  .50296664   .7241679   .7567223   .6433516    .0575332
              8.109869  4.1472735   .12547621    .3651932    .949955    .582074   .9732294   .7444012 .08531906   .6973255  23.411766  4.0217743     .8984238    .306365   .4502635   .6434997   .7695182   .6781569   .9851184    .7488699
             11.355505   -4.39428  .003448891   .03972807   .6000562   .7572137   .8968918   .9890965  .2316725   .9577807   28.60299  -4.262612     .8822002   .6128506   .9511628   .8336937    .962721 .011129755   .6454409    .8084012
             1.2778815 -3.4711804    .7529196     .858534   .3220067   .7511303   .5333537  .24062274  .7693949   .3499218   21.19937  -2.019245     .6922931   .5123788   .4707935   .1925408   .7105037     .65435   .3168468   .11046728
              3.961429   8.761336   .13414234    .6391721  .51938903   .2084108  .50769734   .3494626  .4492286   .7438123  -.9590269   3.004428     .4723527   .3360295   .4221028 .031268395   .6710823   .4119196   .5051989    .8605731
              5.473168   9.304689    .8328871     .833865   .4254095   .9771959   .8879416   .0656982  .7403995   .4944099  26.087053   6.941305    .10340178  .02193043 .032320257   .7588967   .7350713   .2849713   .6013682    .8834895
              6.590372   8.143937   .19332026    .4407154  .48586395   .5811633 .071221985   .6398578  .2738417   .4959511  13.219144   8.446115     .5459082   .4370439   .7587306   .9757705   .4302532  .03769438  .29007772    .5637361
            -1.3582414 -3.5071015    .8324136    .1469714    .758096   .1950467  .09194168  .35936505  .7549242   .3007656   1.222828  -.4795649    .03983431  .29530206  .11393215   .9800299  .18812974    .330057    .552299   .14043736
              2.562342    9.18734    .4476398  .003954357   .1562156   .7743061   .7525998   .5152301   .697482   .8576565   21.41541   8.985153     .7155131   .9064275   .6175225   .7021405  .05425825   .9986599   .4505732    .5383547
              .8037827  13.992144    .1154926   .56796324   .9457136  .46003985   .9337158   .7350671  .4846907   .5852824  20.781906  13.510776     .2044689   .4832753   .2213902  .14848526   .9048155   .8405548   .4338235    .8067084
              11.07561   8.889474    .3224011    .2106113   .4628302    .660461   .8335006   .9350447  .4248196   .7916537   26.81649   11.97317     .8752485   .6391397   .9596077   .9907883   .5874504   .9705344  .26640788    .9382201
              4.895733    10.7835    .5912329    .7563699  .09631136   .9930941   .4834635   .6804651  .3829168   .3181035  24.808514  12.420574     .3718975   .3757029   .9427465   .6177549   .4339132  .03785124 .019264365  .019874465
              9.435315   6.316013     .742907     .483353   .9232677  .24238534   .9053339   .8408629 .16303085   .6770931  13.524135    6.27883     .8368753   .6434054  .24944343   .7825908   .6896602  .28538573   .7856113    .9058075
             -.5535343   2.143365    .7559323    .3687886 .005267374   .8955364  .07908168  .29538435 .08597624   .3796694   9.564178    5.09418     .7972363  .47409105   .3826961   .4283234  .06534621   .9934307   .7106711   .13751286
              8.567056   1.242173   .07799873    .8063906   .6440391   .6175493   .8335356   .7979074  .7895014   .4997455  16.600428   3.477687     .7793288 .033930518 .029629827   .4554164   .5779033   .7037196  .28585383   .04915093
             15.008123  -.8023006    .9606185    .7526915  .21135855   .7016616   .6858252   .7123153 .43294585 .011731582   7.831995  -4.661063     .8710195  .19082133   .7860022  .14249931  .23163764   .8617802 .035859615    .9881561
              1.164239  15.643858     .366159   .26410568   .7887504   .9211621   .4071067   .4383166 .04076659  .13124876  1.3158286   13.30173     .1809516  .18742865  .07480268   .9663154   .7516085  .26045814   .4470498    .9065468
             1.5156246  3.0132954     .728358    .6083057  .59541786   .6153016   .9794214   .6909013 .08582237   .4759914  15.031084   4.894832     .8949009   .3478262   .3977827   .3203734   .3446277   .4660932   .9381133    .6839198
              10.03902  1.0436313    .4358361    .5284772   .9860185  .11093303   .8219739  .12698555   .557955    .737856    15.9664  1.8761377     .3496222   .8871928   .6318613  .24466504  .19652162   .9189752   .5068097    .4336797
             14.333726   12.97899   .14445524    .7240822  .54208183   .4534446   .8887736  .19806434  .4684015   .0761353   11.56717   10.28966    .07554335  .19211394   .7340063  .19568014 .031591162  .20177323   .5092734    .2023265
              6.184639   7.848447    .5222717   .29453346   .7329882   .4169442   .9880131   .3223091  .6278751   .9326828    23.2866    5.06356     .8128985   .6051359   .8274027 .011176432   .9833465  .13493073   .8647637    .7398451
              5.904042  .10223951    .4184419     .934348   .3950971   .5246582  .13755782   .7933009  .8838093  .24735647  29.620165   .6701006    .02413011   .3977599   .7457362   .3271437   .5843597   .1894251   .3335266    .7186766
             19.052305   8.329021   .28695825    .8151794   .6389565   .6879286   .6774128 .007907399  .3666382   .9686212   9.720041   6.889057     .5523863     .79322  .29780352    .391888   .8339642   .8389922   .8048492     .491484
              -3.77154   9.530327   .53916496     .826618   .2340082  .06233386   .6368857      .8201  .9970746   .4595353   36.16204   8.192802     .9865241  .18963243  .56092584   .7950085  .15363595 .015217925   .5219659   .58818936
              .6846251  10.229602    .5611674    .8860307    .747274  .13550913   .1895793   .3594919 .56809753   .9562925  11.574862   7.363417     .5525427   .6371259   .1491917   .8403611    .193781  .17397477   .3144785    .3495468
            -1.0600654  1.1222844   .47012335    .7067978  .14304711   .7076076   .9026325  .23452345   .897108  .14275447  17.580528  -.6274094     .3550118    .759917  .50807446   .7453293  .24170403   .8132588   .9666622 .0045228554
              2.882244  12.004786     .941715    .2961064   .4020037 .019786656  .12795441   .6362212  .9879331  .14921252   38.71186  11.074103     .4236883   .8745297  .50386965   .6352745   .3591713   .3895709  .16213706   .11100896
            -1.2960554  -3.577111    .9601552    .6701748 .033829875 .001561518  .25454038   .6758744  .8677076   .8411436   24.24663 -2.6083755     .1770748  .47317085   .2163847   .3812637   .4360264  .11266237    .920109    .3714406
             16.355972   6.577126    .9865602   .13419913   .4530488  .07552875   .7709779   .7459202   .385296   .8003917   15.82321   3.618231     .9224355   .9985808   .2174223   .2335838   .1529408   .7863641 .013034086    .3127747
            -4.1072445  3.8387966    .7058836    .3183507  .26974148    .643578  .02708902 .006403041  .0095048  .51868844  11.534518   5.416826     .4117713   .9257677   .4784922   .6900166  .10413537 .033800665  .17983377   .21252215
              3.180989  -7.840789   .21660325    .4883266   .1541432  .07882655  .06796903   .5452905  .4982932   .3184912   15.76491  -5.799705     .6231912   .6101886   .8170735   .3413603   .4641545   .6116723  .04876747    .6324683
             -10.92465  2.6258016  .004203058    .6664088  .58133453   .4530162   .3225467    .300791  .8735986    .368871  27.830975   5.429988    .23491123   .8286883  .06286132   .9653498   .3313137   .8177555   .9426997    .4163224
             10.792866  .16997983    .2875256    .6131955   .4011652    .746203    .481704   .5037439  .9460629  .55977285    43.7271 -.30726105     .4829012   .7551122  .27010363     .38371   .7304639    .710179   .4012978     .833008
             4.1206193   .9987981   .53706163    .1901031   .6564639  .59360373   .4402054  .11132433  .4333094   .6471773   5.764264  4.4137616   .068542786  .29178423  .25411803  .20796667   .3252583   .7545071  .19062404   .04765549
              5.415081   5.298575    .9175555   .14791898  .32361645   .3728811  .22883134  .09983407 .54003954    .713636  16.371178  2.1479046     .6052354   .9184047   .5829084   .7317901   .9328628   .9806913   .5034052    .7592264
              22.50936   7.499469    .7864969    .6093209   .4003197   .4194219   .5999886   .4604873  .4686923  .47022974  22.856764   6.164132     .1470323    .498381  .21495496    .333421   .6551857   .6327886   .9120597   .04235314
              7.483903 -4.4110785   .02082244   .04575173     .82927   .2408066   .7216517   .6030566  .4074611  .55436283  15.483662  -5.795539    .18070504   .4459683   .4217145   .7385077  .57895917   .6691958   .3478126   .05141381
              6.933421   7.894638    .4170643    .8224061  .13805601   .7858984   .4398953   .7871741  .7392096   .6791406 -13.580204   9.160413     .8849521   .6471993   .4031877   .3550904   .9726827   .1647944   .6798093    .9923928
             14.531067  -2.505311    .4554886     .417355   .3397121   .7349977  .02206678   .6105488 .08331744   .1023737   27.87213  -.5085866     .9162734  .20447685   .3398798   .6436362   .4829344   .4191574    .413601    .7681562
              9.705017    5.07276   .50991046    .4036203  .49737185   .6887786  .14652888   .3232383  .1493003   .6449498  13.339443   3.819671      .673403   .9517537   .3641084   .6055189   .9054072   .7150618  .15951453    .8023275
            end
            
            //    GO TO LONG LAYOUT WITH TRAINING SETS "STACKED VERTICALLY"
            gen long obs_no = _n
            reshape long y_tilde_test_ y_tilde_train_eq x_1_test x_1_train x_2_test x_2_train ///
                x_3_test x_3_train x_4_test x_4_train, i(obs_no) j(set)
            
            
            capture program drop one_set
            program define one_set
                // DO THE TRAINING REGRESSION
                regress y_tilde_train_eq x_*_train
                // GET THE COEFFICIENTS IN MATRIX b
                matrix b = e(b)
                // APPLY THE COEFFICIENTS TO THE x_*_test VARIABLES
                matrix colnames b = x_1_test x_2_test x_3_test x_4_test _cons
                matrix score xb = b
                // CALCULATE MEAN SQUARE RESIDUAL
                egen wanted = mean((y_tilde_test_ - xb)^2)
                //    FLAG ONE OBSERVATION PER SET FOR LATER LISTING
                gen byte flag = (_n == 1)
                exit
            end
            
            //    RUN THE CALCULATIONS ONE SET AT A TIME
            runby one_set, by(set)
            
            //    LIST THE MEAN SQUARE RESIDUALS FOR EACH SET
            list set wanted if flag, noobs clean
            When this runs, your data set will have new variables: xb contains the value predicted by applying the coefficients to the x_*_test variables, wanted contains the mean of the squared residuals within each set, and flag marks one observation per set. The Results window will contain a listing of the mean squared residual for each set.

            To use this code you must install the -runby- program, written by Robert Picard and myself, and available from SSC.

            If this is not what you are looking for, please post back with explanation.

            If you have a compelling reason to go back to wide layout after this runs, you can -drop xb flag- and then -reshape wide-. This will restore the original layout and retain the variable wanted as well. But I caution you that whatever you plan to do next with this data is likely to be easier if you stay with the long layout. There are only a limited number of things that work easily with wide data in Stata.
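            If you do take that route, the return trip is short; as a sketch (-reshape wide- with no arguments reuses the specification stored by the earlier -reshape long-):
            Code:
            * return to the original wide layout, keeping the variable wanted
            drop xb flag
            reshape wide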



            • #7
              Many thanks, Professor, for your patience.
              I will try the code as soon as possible and let you know!

              Federico



              • #8
                The code seems to work perfectly. Indeed, what I wanted to obtain was the mean of wanted, which I simply obtain by summarizing it, as sketched below.
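                For the record, that summarizing step can be as simple as the sketch below, using the flag variable created by one_set so that each set is counted once:
                Code:
                * mean of the per-set mean squared residuals, one observation per set
                summarize wanted if flag
                display r(mean)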
                Many, many thanks again, Professor, for the patience and effort devoted to my problem,

                Federico

