
  • Finding rolling AR coefficients for multiple countries at once

    Hello everyone,

    I want to calculate rolling first-order autoregressive (AR(1)) coefficients of inflation over a 10-year window for multiple countries at once. I first did this with -rangestat-, regressing inflation on its first lag within each country:
    Code:
    rangestat (reg) inflation inflation_lag, interval(Year -9 0) by(Country)
    However, when I instead used Stata's -rolling- prefix, which I had to run separately for each country since -rolling- does not allow the -by- option:
    Code:
    rolling, window(10): arima inflation, ar(1)
    the results from the two methods differ. Please guide me as to which method should be used, or where I may be wrong. I am using Stata 17.

    Thank You

  • #2
    Hi Stata users, I want to calculate the first-order autoregressive (AR(1)) coefficient of a time-series variable (say, inflation) for multiple entities (say, the US, the UK, China, etc.) over a 10-year rolling window in a single go. Please suggest ways to do this.

    Thank You



    • #3
      The two models you describe in #1 are different, and there is no reason to expect the results to agree. What you are doing with -rangestat- is a regression like:
      Code:
      inflation_t = constant + b*inflation_t-1 + epsilon_t
      What the -arima- model fits is:
      Code:
      inflation_t = constant + mu_t, with
      mu_t = r*mu_t-1 + epsilon_t
      The first of these is a lagged model, not an autoregressive one.
      Last edited by Clyde Schechter; 20 Aug 2022, 16:17.



      • #4
        Thanks @Clyde Schechter for your reply and the clarification, I really appreciate it. But how do we calculate the rolling AR(1) coefficients for a window of 10 periods for multiple countries at once, given that -rolling, window(10): arima inflation, ar(1)- allows calculating them for only one country at a time?
        Last edited by Itesh New; 20 Aug 2022, 20:53.



        • #5
          Something like this should do it:
          Code:
          capture program drop one_country
          program define one_country
              local country = Country[1]
              rolling, window(10) clear: arima inflation, ar(1)
              append using results
              gen Country = `"`country'"'
              save results, replace
              exit
          end
          
          clear
          save results, emptyok
          
          use inflation_data_set
          runby one_country, by(Country) status
          -runby- is written by Robert Picard and me, and is available from SSC.

          At the end of this code, the data file results.dta will contain all the arima coefficients for each window for each country, identified by country and time period.

          This code is untested because no example data was provided, and it will need to be adapted to the actual variable and data set names you are working with. If you need additional assistance with this, when posting back be sure to include example data that demonstrates any difficulties you are encountering, and be sure to use the -dataex- command to do that. If you are running Stata 17, 16, or a fully updated version 15.1 or 14.2, -dataex- is already part of your official Stata installation. If not, run -ssc install dataex- to get it. Either way, run -help dataex- to read the simple instructions for using it. -dataex- will save you time; it is easier and quicker than typing out tables. It includes complete information about aspects of the data that are often critical to answering your question but cannot be seen from tabular displays or screenshots. It also makes it possible for those who want to help you to create a faithful representation of your example to try out their code, which in turn makes it more likely that their answer will actually work with your data.
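          For readers who want to see the shape of this computation outside Stata, here is a rough Python sketch of the same per-country rolling-window logic (invented toy data; an OLS lag regression stands in for the -arima- maximum-likelihood fit, so the numbers would not match Stata's):

```python
import numpy as np

def rolling_ar1(years, infl, window=10):
    """OLS slope of infl on its first lag within each trailing window
    (a stand-in here for the per-window -arima- fit in the Stata code)."""
    out = []
    for end in range(window - 1, len(infl)):
        y = infl[end - window + 1 : end + 1]
        X = np.column_stack([np.ones(window - 1), y[:-1]])
        _, slope = np.linalg.lstsq(X, y[1:], rcond=None)[0]
        out.append((years[end], slope))
    return out

# Invented toy series for two "countries" (not the data from this thread)
rng = np.random.default_rng(1)
years = np.arange(1990, 2020)
data = {c: rng.normal(3, 2, size=years.size) for c in ("AUS", "AUT")}

results = []  # plays the role of results.dta: one row per (country, window)
for country, series in data.items():
    for year, b in rolling_ar1(years, series, window=10):
        results.append((country, int(year), b))

print(len(results))  # 21 windows per country, 2 countries
```

          Each country contributes one coefficient per 10-year window, identified by country and window end year, which is the same layout -runby- builds in results.dta.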



          • #6
            Thank you for your response @Clyde Schechter. I have attached the dataset:
            Code:
            * Example generated by -dataex-. For more info, type help dataex
            clear
            input str15 Country str4 Year double Inflation str3 Code
            "Australia" "1950"  8.69565217391305 "AUS"
            "Australia" "1951"  19.4285714285714 "AUS"
            "Australia" "1952"  17.2248803827751 "AUS"
            "Australia" "1953"  4.48979591836735 "AUS"
            "Australia" "1954"            1.5625 "AUS"
            "Australia" "1955"  1.53846153846154 "AUS"
            "Australia" "1956"  5.68181818181819 "AUS"
            "Australia" "1957"  2.86738351254481 "AUS"
            "Australia" "1958"  .696864111498258 "AUS"
            "Australia" "1959"  2.07612456747403 "AUS"
            "Australia" "1960"  3.72881355932204 "AUS"
            "Australia" "1961"  2.28758169934641 "AUS"
            "Australia" "1962" -.319488817891374 "AUS"
            "Australia" "1963"  .641025641025642 "AUS"
            "Australia" "1964"  2.86624203821655 "AUS"
            "Australia" "1965"  3.40557275541798 "AUS"
            "Australia" "1966"  3.29341317365268 "AUS"
            "Australia" "1967"  3.47826086956522 "AUS"
            "Australia" "1968"  2.52100840336135 "AUS"
            "Australia" "1969"  3.27868852459017 "AUS"
            "Australia" "1970"  3.43915343915344 "AUS"
            "Australia" "1971"  6.13810741687979 "AUS"
            "Australia" "1972"  6.02409638554217 "AUS"
            "Australia" "1973"  9.09090909090909 "AUS"
            "Australia" "1974"  15.4166666666667 "AUS"
            "Australia" "1975"  15.1624548736462 "AUS"
            "Australia" "1976"  13.3228840125392 "AUS"
            "Australia" "1977"  12.3098201936376 "AUS"
            "Australia" "1978"  8.00492610837438 "AUS"
            "Australia" "1979"  9.12200684150513 "AUS"
            "Australia" "1980"  10.1358411703239 "AUS"
            "Australia" "1981"  9.48766603415561 "AUS"
            "Australia" "1982"  11.3518197573657 "AUS"
            "Australia" "1983"  10.0389105058366 "AUS"
            "Australia" "1984"  3.96039603960396 "AUS"
            "Australia" "1985"    6.734693877551 "AUS"
            "Australia" "1986"  9.05035054174633 "AUS"
            "Australia" "1987"  8.53302162478084 "AUS"
            "Australia" "1988"  7.21593968766829 "AUS"
            "Australia" "1989"  7.53390256152687 "AUS"
            "Australia" "1990"  7.33302195235871 "AUS"
            "Australia" "1991"  3.17667536988684 "AUS"
            "Australia" "1992"  1.01223112610715 "AUS"
            "Australia" "1993"   1.7536534446764 "AUS"
            "Australia" "1994"  1.96963479688141 "AUS"
            "Australia" "1995"  4.62776659959759 "AUS"
            "Australia" "1996"  2.61538461538462 "AUS"
            "Australia" "1997"  .224887556221881 "AUS"
            "Australia" "1998"  .860134629768143 "AUS"
            "Australia" "1999"  1.48312940304042 "AUS"
            "Australia" "2000"  4.45743514797223 "AUS"
            "Australia" "2001"  4.40713536201469 "AUS"
            "Australia" "2002"  2.98157453936348 "AUS"
            "Australia" "2003"  2.73259596616785 "AUS"
            "Australia" "2004"  2.34325522482586 "AUS"
            "Australia" "2005"  2.69183168316831 "AUS"
            "Australia" "2006"  3.55528773727027 "AUS"
            "Australia" "2007"  2.32761128891474 "AUS"
            "Australia" "2008"  4.35029854990048 "AUS"
            "Australia" "2009"  1.77111716621253 "AUS"
            "Australia" "2010"  2.91834002677377 "AUS"
            "Australia" "2011"   3.3038501560874 "AUS"
            "Australia" "2012"  1.76278015613196 "AUS"
            "Australia" "2013"  2.44988864142539 "AUS"
            "Australia" "2014"    2.487922705314 "AUS"
            "Australia" "2015"   1.5083667216592 "AUS"
            "Australia" "2016"  1.27699094497329 "AUS"
            "Australia" "2017"  1.94864740944519 "AUS"
            "Australia" "2018"  1.91140094445696 "AUS"
            "Australia" "2019"  1.61076787290378 "AUS"
            "Australia" "2020"  .846905537459296 "AUS"
            "Australia" "2021"  2.86391042204996 "AUS"
            "Austria"   "1959"  1.07500000000001 "AUT"
            "Austria"   "1960"  1.94574985571772 "AUT"
            "Austria"   "1961"  3.54225636878283 "AUT"
            "Austria"   "1962"  4.38178551901898 "AUT"
            "Austria"   "1963"  2.70876982939242 "AUT"
            "Austria"   "1964"  3.86857059594927 "AUT"
            "Austria"   "1965"  4.93091113137407 "AUT"
            "Austria"   "1966"  2.05482049298819 "AUT"
            "Austria"   "1967"  3.97466847642697 "AUT"
            "Austria"   "1968"  2.76486608214571 "AUT"
            "Austria"   "1969"  3.08040247815653 "AUT"
            "Austria"   "1970"  4.37282547992571 "AUT"
            "Austria"   "1971"  4.70426154794422 "AUT"
            "Austria"   "1972"  6.35514062696682 "AUT"
            "Austria"   "1973"  7.53108080235445 "AUT"
            "Austria"   "1974"  9.52179141555083 "AUT"
            "Austria"   "1975"  8.44525548786933 "AUT"
            "Austria"   "1976"  7.31868937549702 "AUT"
            "Austria"   "1977"  5.49460978339934 "AUT"
            "Austria"   "1978"  3.57432243431843 "AUT"
            "Austria"   "1979"   3.7074061629985 "AUT"
            "Austria"   "1980"  6.32831779605569 "AUT"
            "Austria"   "1981"  6.80304335998305 "AUT"
            "Austria"   "1982"  5.43602589121125 "AUT"
            "Austria"   "1983"  3.33916684239884 "AUT"
            "Austria"   "1984"  5.66318948991475 "AUT"
            "Austria"   "1985"  3.18951141586567 "AUT"
            "Austria"   "1986"  1.70544550107634 "AUT"
            "Austria"   "1987"   1.4019548916528 "AUT"
            "Austria"   "1988"  1.91572021084487 "AUT"
            "Austria"   "1989"  2.56834671476642 "AUT"
            "Austria"   "1990"  3.26187148033762 "AUT"
            "Austria"   "1991"   3.3374261001366 "AUT"
            "Austria"   "1992"  4.02084684152512 "AUT"
            "Austria"   "1993"  3.63178554402613 "AUT"
            "Austria"   "1994"  2.95340652383276 "AUT"
            "Austria"   "1995"   2.2433637541205 "AUT"
            "Austria"   "1996"  1.86097410872888 "AUT"
            "Austria"   "1997"   1.3059833037407 "AUT"
            "Austria"   "1998"  .922466865804051 "AUT"
            "Austria"   "1999"  .568986519284555 "AUT"
            "Austria"   "2000"  2.34486839687016 "AUT"
            "Austria"   "2001"  2.64999999999997 "AUT"
            "Austria"   "2002"  1.81035882448452 "AUT"
            "Austria"   "2003"  1.35555378358982 "AUT"
            "Austria"   "2004"  2.06120682873104 "AUT"
            "Austria"   "2005"  2.29913772525068 "AUT"
            "Austria"   "2006"  1.44154653778849 "AUT"
            "Austria"   "2007"  2.16855593888618 "AUT"
            "Austria"   "2008"  3.21595111754302 "AUT"
            "Austria"   "2009"  .506309393986597 "AUT"
            "Austria"   "2010"  1.81353173680539 "AUT"
            "Austria"   "2011"  3.28658305428623 "AUT"
            "Austria"   "2012"  2.48567508675653 "AUT"
            "Austria"   "2013"  2.00015749271597 "AUT"
            "Austria"   "2014"  1.60580560487915 "AUT"
            "Austria"   "2015"  .896565286860077 "AUT"
            "Austria"   "2016"  .891592367302709 "AUT"
            "Austria"   "2017"  2.08126858275521 "AUT"
            "Austria"   "2018"  1.99838187702262 "AUT"
            "Austria"   "2019"  1.53089553422702 "AUT"
            "Austria"   "2020"  1.38190954773872 "AUT"
            "Austria"   "2021"  2.76666666666667 "AUT"
            "Belgium"   "1956"  2.81888071422665 "BEL"
            "Belgium"   "1957"  3.16771185077989 "BEL"
            "Belgium"   "1958"  1.28896508728179 "BEL"
            "Belgium"   "1959"  1.22947666456369 "BEL"
            "Belgium"   "1960"  .299455811266827 "BEL"
            "Belgium"   "1961"  .992679932709934 "BEL"
            "Belgium"   "1962"  1.40460397971129 "BEL"
            "Belgium"   "1963"  2.14801550892356 "BEL"
            "Belgium"   "1964"  4.16874923035691 "BEL"
            "Belgium"   "1965"  4.06519894858348 "BEL"
            "Belgium"   "1966"  4.17435116136102 "BEL"
            "Belgium"   "1967"  2.73029902757217 "BEL"
            "Belgium"   "1968"   2.7014551903408 "BEL"
            "Belgium"   "1969"  3.75399172087523 "BEL"
            "Belgium"   "1970"  3.90847050598089 "BEL"
            "Belgium"   "1971"  4.34362863767015 "BEL"
            "Belgium"   "1972"  5.44843179763001 "BEL"
            "Belgium"   "1973"  6.95527790290585 "BEL"
            "Belgium"   "1974"  12.6778879546025 "BEL"
            "Belgium"   "1975"  12.7682037561641 "BEL"
            "Belgium"   "1976"  9.06860615452409 "BEL"
            "Belgium"   "1977"  7.09599737293392 "BEL"
            "Belgium"   "1978"  4.47059464836909 "BEL"
            "Belgium"   "1979"  4.46908426819722 "BEL"
            "Belgium"   "1980"  6.64869013966331 "BEL"
            "Belgium"   "1981"  7.62748649174539 "BEL"
            "Belgium"   "1982"  8.72654957097361 "BEL"
            "Belgium"   "1983"  7.66161008927888 "BEL"
            "Belgium"   "1984"  6.34177636268591 "BEL"
            "Belgium"   "1985"  4.86872975086582 "BEL"
            "Belgium"   "1986"  1.29437081859633 "BEL"
            "Belgium"   "1987"  1.55337967880691 "BEL"
            "Belgium"   "1988"  1.16093620546811 "BEL"
            "Belgium"   "1989"  3.10705254860209 "BEL"
            "Belgium"   "1990"  3.44831009968626 "BEL"
            "Belgium"   "1991"  3.21531063739933 "BEL"
            "Belgium"   "1992"  2.42795636931716 "BEL"
            "Belgium"   "1993"  2.75442578302316 "BEL"
            "Belgium"   "1994"  2.37754452926211 "BEL"
            "Belgium"   "1995"  1.46796116504854 "BEL"
            "Belgium"   "1996"  2.07702462348191 "BEL"
            "Belgium"   "1997"  1.62816049512958 "BEL"
            "Belgium"   "1998"  .949250288350633 "BEL"
            "Belgium"   "1999"  1.12084823417846 "BEL"
            "Belgium"   "2000"  2.54451776190907 "BEL"
            "Belgium"   "2001"  2.46925823086076 "BEL"
            "Belgium"   "2002"   1.6452143617536 "BEL"
            "Belgium"   "2003"   1.5889639997038 "BEL"
            "Belgium"   "2004"  2.09728311239311 "BEL"
            "Belgium"   "2005"  2.78143263670022 "BEL"
            "Belgium"   "2006"  1.79120770070459 "BEL"
            "Belgium"   "2007"   1.8230563002681 "BEL"
            "Belgium"   "2008"  4.48944420508401 "BEL"
            "Belgium"   "2009" -.053145674125376 "BEL"
            "Belgium"   "2010"  2.18929920422456 "BEL"
            "Belgium"   "2011"  3.53208210722745 "BEL"
            "Belgium"   "2012"  2.83966343445896 "BEL"
            "Belgium"   "2013"  1.11309594027539 "BEL"
            "Belgium"   "2014"  .340002833356929 "BEL"
            "Belgium"   "2015"  .561429152790128 "BEL"
            "Belgium"   "2016"  1.97385264653172 "BEL"
            "Belgium"   "2017"  2.12597086002609 "BEL"
            "Belgium"   "2018"  2.05316499865182 "BEL"
            "Belgium"   "2019"  1.43681956996435 "BEL"
            "Belgium"   "2020"  .740791812220349 "BEL"
            end
            Last edited by Itesh New; 20 Aug 2022, 23:58.



            • #7
              OK, thanks for the -dataex- example. The data has a problem that needs to be cleaned up first: you have Year as a string variable. You can't do anything useful with that, so we have to change it to numeric. After that, the main changes from before are using variable names that begin with upper-case letters, specifying -tsset Year- inside program one_country, and putting the -gen Country = `"`country'"'- command before the -append- command. All in all, this works in your example data:

              Code:
              clear*
              save results, emptyok replace
              
              capture program drop one_country
              program define one_country
                  local country = Country[1]
                  tsset Year
                  rolling, window(10) clear: arima Inflation, ar(1)
                  gen Country = `"`country'"'
                  append using results
                  save results, replace
                  exit
              end
              
              use inflation_data_set
              destring Year, replace
              runby one_country, by(Country) status
              
              use results, clear



              • #8
                It's working, thanks a lot @Clyde Schechter; it has saved me a great deal of effort.



                • #9
                  Hello Clyde Schechter, in order to estimate rolling GARCH(1,1) coefficients, I have edited the code you suggested above, using
                  -rolling, window(5) clear: arch Inflation, arch(1) garch(1)- instead of -rolling, window(10) clear: arima Inflation, ar(1)-. I am using Stata 16, and the estimated time to completion is in hours. Can we somehow speed up the estimation process?



                  • #10
                    Well, you can pick up some speed by using -rangerun- instead of calling -rolling- within -runby-. Unfortunately, -rangerun- does not have a -status- option, so you won't be able to tell how far along you are or how much time remains.
                    Code:
                    clear*
                    
                    capture program drop one_window
                    program define one_window
                        tsset t
                        arch Inflation, arch(1) garch(1) iterate(20)
                        if e(converged) {
                            gen cons = _b["Inflation:_cons"]
                            gen b_arch = _b["ARCH:L.arch"]
                            gen b_garch = _b["ARCH:L.garch"]
                            gen b_arch_cons = _b["ARCH:_cons"]
                        }
                        exit
                    end
                    
                    rangerun one_window, by(Country) interval(t -4 0) use(Inflation t)
                    I call your attention to the italicized parts of the code. In the -tsset t- command, replace t by the actual name of your time variable. Same thing in the -use()- option of the -rangerun- command.

                    These models can have difficulty converging, and by default they will run hundreds of iterations, possibly ending in failure. Because of these convergence issues, in order to get this to run with a toy data set I had to set a limit on the number of iterations, allowing Stata to abort estimation if convergence was not reached within 20. You may not want to do that. I have no experience with this kind of analysis, but I know that with simpler models, if you don't reach convergence within a relatively small number of iterations, you are unlikely ever to converge, so I chose -iterate(20)-. You may need to change that to some larger number. Whatever number you choose, you are trading off speed of execution against the possibility of aborting some long estimations that might ultimately have converged had you allowed them to proceed. If you want every possible result no matter how long it takes, remove the -iterate()- option altogether; that will maximize the time needed to complete the runs. The -if e(converged)- block prevents you from recording coefficients from a failed estimation. I would leave that in place even if you remove -iterate(20)-: sometimes these estimations fail even when Stata runs all the way, and the "results" they generate are not valid.



                    • #11
                      Clyde Schechter Thank you for your reply and explanation for using the codes.



                      • #12
                        Hello everyone,

                        I am currently having some trouble with the statistical analysis I have to do for my master's thesis. It is about the effect of macro-economic news on the reaction of stocks in the euro area. Macro-economic news has two components: an expected and an unexpected component. Since it is the unexpected component of news that moves stocks, this is the component I would like to calculate, starting from the actual announcements. My data consist of monthly observations of 1) the S&P Euro stock index and 2) macro-economic variables (M1, inflation, industrial production, unemployment, and the interest rate).

                        The problem is that I do not know how to correctly calculate the unexpected component of these macro-economic variables. As of now, I used the following command:
                        Code:
                        rolling, window(36): arima M1, ar(1)
                        (assuming I should do this separately for all five independent variables).
                        Any advice on how I should do this would be much appreciated, as well as any help on how to interpret the output it generates, shown below:

                        Thanks in advance.

                        Code:
                        * Example generated by -dataex-. For more info, type help dataex
                        clear
                        input float(start end M1_b_cons _stat_2 sigma_b_cons)
                        480 515  .07932618 .9353847  .009162324
                        481 516   .0818506 .9436406 .0089879455
                        482 517  .08352173 .9440121  .009145501
                        483 518  .08836482 .9509076  .009414546
                        484 519  .08284108 .9456822  .009082168
                        485 520  .07915594 .9454264  .008789787
                        486 521  .07867705 .9515781  .008670864
                        487 522  .08134738 .9531985 .0085833715
                        488 523   .0813597 .9593607  .008504648
                        489 524   .0792206  .957567  .008513007
                        490 525  .08061783 .9650205  .008543259
                        491 526  .07663073 .9556202  .008779268
                        492 527  .06649369 .9846039   .00704677
                        493 528  .07100493 .9842466  .007046339
                        494 529  .06646622 .9862801  .006951232
                        495 530  .06968245 .9870193  .007026014
                        496 531  .07307028 .9799441  .006831652
                        497 532  .07550673 .9702542  .006767166
                        498 533  .07376206 .9738882  .006725197
                        499 534   .0763388 .9703032  .006686086
                        500 535   .0808893  .957343  .006367532
                        501 536  .08411803 .9525363  .006401483
                        502 537   .0867096 .9388686  .006241961
                        503 538   .0876328 .9426321  .006338665
                        504 539  .08935062 .9213593  .006277723
                        505 540  .09101281 .9207133  .006508478
                        506 541  .09085558 .9265967  .006508137
                        507 542  .09256272 .8976028   .00647755
                        508 543  .09416974 .8813589  .006434784
                        509 544  .09623832 .8660511  .006638716
                        510 545  .09905039 .8300025  .006548184
                        511 546   .1006132 .8244336  .006672222
                        512 547  .10201915 .8009965  .006593645
                        513 548   .1021928 .7632863  .006604627
                        514 549   .1035999 .7254493  .006368722
                        515 550  .10415918 .7043266  .006202838
                        516 551   .1045611  .723392  .006345647
                        517 552  .10475463 .6801503   .00628013
                        518 553  .10521308 .7212854  .006085743
                        519 554   .1046942 .7113643  .006033043
                        520 555  .10465998 .6981124  .006031207
                        521 556   .1046875 .6995688  .006031062
                        522 557  .10341063  .687377  .006395015
                        523 558  .10140558 .7693549  .007197531
                        524 559  .09910332 .8229185  .007236489
                        525 560  .09875224 .8622571  .007208604
                        526 561   .0936329   .90042  .007069353
                        527 562  .09367334 .9017966  .007067419
                        528 563  .09654123  .861899  .007334284
                        529 564  .09372638 .8743276  .007530947
                        530 565  .09353873 .8926763  .007518577
                        531 566  .09170455 .8816487  .007322252
                        532 567  .08728353 .9023033  .007441844
                        533 568  .08613554 .9104654  .007453999
                        534 569  .08607643 .9085376   .00745261
                        535 570  .08666462  .891905  .007499603
                        536 571  .08563065 .9037842   .00758984
                        537 572  .08417293 .9053691  .007537742
                        538 573  .08457255 .9139411  .007479163
                        539 574  .08138794 .9184728  .007256314
                        540 575  .07652767 .9461316  .008097257
                        541 576  .07706307   .94507 .0081001315
                        542 577  .07120135 .9528503  .008075324
                        543 578  .06974936 .9570079  .008088255
                        544 579  .06888935 .9695005  .008009142
                        545 580 .068165205   .97587  .007988288
                        546 581  .06809237 .9814727  .007939577
                        547 582  .06165279 .9853913  .008337168
                        548 583   .0583958 .9838348  .008249018
                        549 584  .06175548 .9799884  .008380052
                        550 585   .0659189 .9611168  .009006826
                        551 586 .065685585 .9713941  .009041717
                        552 587  .06316482 .9565327  .008918492
                        553 588  .06457108 .9434626  .009199008
                        554 589 .066452615 .9392979   .00935881
                        555 590 .067731954 .9426968   .00934034
                        556 591   .0717068 .9386138  .009950774
                        557 592 .067132086 .9275765  .009682606
                        558 593   .0665227 .9305068  .009486265
                        559 594   .0745532 .9487758  .010269128
                        560 595   .0816131 .9600549   .01048838
                        561 596  .07894092 .9603982  .010397516
                        562 597  .07575338 .9523242  .010510196
                        563 598   .0843039 .9603742  .010282312
                        564 599  .08190444  .961467  .010156567
                        565 600  .07990945 .9588315  .010191162
                        566 601   .0792579 .9568965  .010205692
                        567 602  .07599678 .9579973  .010061257
                        568 603  .07504133 .9568154  .010073804
                        569 604  .07551255 .9557552  .010063123
                        570 605  .07623487  .953375  .009991742
                        571 606  .07126455 .9503843   .01010896
                        572 607    .070121 .9495849   .01013806
                        573 608  .06577413 .9470462  .010468002
                        574 609  .06256123   .94757   .01059843
                        575 610  .05436554 .9566014   .00993976
                        576 611  .05446364 .9563611    .0099386
                        577 612  .04897627 .9599039   .00997514
                        578 613  .04691422 .9612166  .009996225
                        579 614  .04432635 .9632082  .009953639
                        end
                        format %tmMon_CCYY start
                        format %tmMon_CCYY end
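                        On interpreting output like the above: start and end are the first and last months of each window, M1_b_cons is the estimated constant, _stat_2 presumably holds the AR(1) coefficient, and sigma_b_cons is the innovation standard deviation. If the goal is an "unexpected component", one common approach (a suggestion, not something confirmed in this thread) is actual minus the one-step-ahead forecast built from the window ending in the previous period. A Python sketch with invented numbers:

```python
import numpy as np

# Invented toy series and toy rolling AR(1) estimates (NOT the M1 output
# above): one (const, ar_coef) pair per window ending at period t.
y = np.array([2.0, 2.3, 2.1, 2.6, 2.4, 2.8, 2.5, 2.9])
coefs = {t: (0.5, 0.8) for t in range(3, 7)}  # window ends at period t

# Surprise at t+1 = actual minus the forecast made with the window ending at t
surprise = {t + 1: y[t + 1] - (c + b * y[t]) for t, (c, b) in coefs.items()}
print({t: round(s, 2) for t, s in surprise.items()})
```

                        Using only windows that end before the period being forecast keeps the "expectation" out-of-sample, which is usually what a news-surprise measure requires.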



                        • #13
                          #12 cross-posted at https://www.reddit.com/r/stata/comme..._component_of/

                          Please note our policy on cross-posting, which is that you should tell us about it. Indeed, r/stata on Reddit has the same policy.



                          • #14
                            Please excuse me for not mentioning that; I thought it was only necessary to mention it once I got a reply on the other platform.



                            • #15
                              There are two kinds of readers to think about. One is people who might answer the question, and believe me, it is irritating to spend time answering a question only to find that someone else has posted the same or a better answer elsewhere. The other is people who are interested to see answers: if you are interested to see an answer in a forum you scan, you should be interested to see it elsewhere too. My guess is that the vast majority of Statalist people who look here, say, daily don't do the same for Reddit's r/stata, and vice versa.

                              These are the reasons why -- while it is entirely understandable that you might want to post in different places to increase your chance of a reply -- it is standard in any good technical forum to ask for information about cross-posting.

                              All that said, I can't see that your question is a good fit for this thread, whose title doesn't match what you are doing.

                              Otherwise, sorry, but your question is some distance from what I ever do and I don't have useful suggestions.
