  • GARCH in large panels

    Dear all,

    I am doing bilateral trade research and use pairs of countries as my panels, which gives me around 22,000 panels. I would like to use GARCH to calculate the volatility of the bilateral exchange rate of each country pair, which means running 22,000 GARCH estimations. Writing the code for this could be automated to save time, but running 22,000 GARCH estimations one by one would take days. Is there perhaps one command that would let me do this relatively time-efficiently? Surely others have run GARCH analyses on large panel data sets, but I can't find a satisfying answer online.

    So far, I have been using this code:

    arch BilER if pairid==1 , arch(1) garch(1)
    predict vol1 if pairid==1, variance
    arch BilER if pairid==2 , arch(1) garch(1)
    predict vol2 if pairid==2, variance
    gen 1Vol = Vol1 + Vol2
    drop Vol1 Vol2

  • #2
    Bump

    Does anyone know?



    • #3
      I would post an example of your data using -dataex- (see the FAQ, or David Benson's tutorial).
      From the looks of it, you might need to set up a loop or two. It's tough to say without a sample of your data.



      • #4
        Thank you! I have panels of country combinations (pairs, e.g. United States and United Kingdom), with yearly data and a bilateral exchange rate for each pair. I have about 20,000 of these pairs, so 20,000 panels, which means 20,000 distinct time series. I want Stata to treat all of these as separate time series and apply GARCH to each one to obtain the conditional variances. One thing to note: my panels are perfectly balanced (exactly 34 observations per pair).

        Code:
        * Example generated by -dataex-. To install: ssc install dataex
        clear
        input int year str13 Country1 str18 Country2 byte pairid double BilER
        1980 "United States" "United Kingdom"     1 2.323987037553276
        1981 "United States" "United Kingdom"     1 2.009479388119181
        1982 "United States" "United Kingdom"     1 1.746887122990847
        1983 "United States" "United Kingdom"     1  1.51578405107048
        1984 "United States" "United Kingdom"     1 1.330129200620134
        1985 "United States" "United Kingdom"     1 1.283291797867198
        1986 "United States" "United Kingdom"     1  1.46585152292201
        1987 "United States" "United Kingdom"     1 1.634183190301449
        1988 "United States" "United Kingdom"     1 1.778820825604109
        1989 "United States" "United Kingdom"     1 1.636198603422682
        1990 "United States" "United Kingdom"     1 1.775640170070815
        1991 "United States" "United Kingdom"     1 1.763620736887775
        1992 "United States" "United Kingdom"     1 1.755081326080946
        1993 "United States" "United Kingdom"     1 1.499797782265018
        1994 "United States" "United Kingdom"     1 1.530393763619906
        1995 "United States" "United Kingdom"     1 1.578113161192329
        1996 "United States" "United Kingdom"     1 1.560164007872868
        1997 "United States" "United Kingdom"     1 1.637100323323629
        1998 "United States" "United Kingdom"     1 1.656112827754095
        1999 "United States" "United Kingdom"     1 1.617974152523139
        2000 "United States" "United Kingdom"     1 1.513017625394488
        2001 "United States" "United Kingdom"     1  1.43956352433942
        2002 "United States" "United Kingdom"     1 1.498748544964955
        2003 "United States" "United Kingdom"     1 1.632726367306287
        2004 "United States" "United Kingdom"     1 1.830898238675894
        2005 "United States" "United Kingdom"     1 1.818187327840388
        2006 "United States" "United Kingdom"     1 1.839971541773486
        2007 "United States" "United Kingdom"     1 2.000913750612778
        2008 "United States" "United Kingdom"     1 1.838349346122117
        2009 "United States" "United Kingdom"     1 1.557828307805936
        2010 "United States" "United Kingdom"     1 1.545166740657417
        2011 "United States" "United Kingdom"     1 1.602202488182939
        2012 "United States" "United Kingdom"     1 1.579661569522725
        2013 "United States" "United Kingdom"     1 1.563329107650999
        1980 "United States" "Austria"            2 .0772916988770176
        1981 "United States" "Austria"            2 .0627871202726712
        1982 "United States" "Austria"            2 .0586192241774314
        1983 "United States" "Austria"            2 .0556689552818551
        1984 "United States" "Austria"            2 .0499773019770179
        1985 "United States" "Austria"            2 .0483336958364415
        1986 "United States" "Austria"            2 .0655001812171682
        1987 "United States" "Austria"            2 .0790982796124184
        1988 "United States" "Austria"            2 .0809869610992628
        1989 "United States" "Austria"            2 .0755819812556685
        1990 "United States" "Austria"            2 .0879520368225863
        1991 "United States" "Austria"            2 .0856463803698494
        1992 "United States" "Austria"            2 .0909973307449651
        1993 "United States" "Austria"            2  .085968389852893
        1994 "United States" "Austria"            2 .0875516834915585
        1995 "United States" "Austria"            2 .0991916295490215
        1996 "United States" "Austria"            2 .0944594123254892
        1997 "United States" "Austria"            2  .081938708071024
        1998 "United States" "Austria"            2 .0807815452944144
        1999 "United States" "Austria"            2 1.065385617642721
        2000 "United States" "Austria"            2 .9213186219223187
        2001 "United States" "Austria"            2 .8948465785541069
        2002 "United States" "Austria"            2 .9411307058009699
        2003 "United States" "Austria"            2 1.128624648598013
        2004 "United States" "Austria"            2 1.241673030240947
        2005 "United States" "Austria"            2 1.243595483261205
        2006 "United States" "Austria"            2  1.25448347165756
        2007 "United States" "Austria"            2  1.36866777300645
        2008 "United States" "Austria"            2 1.464826488421991
        2009 "United States" "Austria"            2 1.389191115547446
        2010 "United States" "Austria"            2 1.324424456271926
        2011 "United States" "Austria"            2  1.39013372736599
        2012 "United States" "Austria"            2 1.284862162793938
        2013 "United States" "Austria"            2 1.327740568132363
        1980 "United States" "Belgium-Luxembourg" 3 .0341977771455576
        1981 "United States" "Belgium-Luxembourg" 3 .0269329437040189
        1982 "United States" "Belgium-Luxembourg" 3  .021886347843853
        1983 "United States" "Belgium-Luxembourg" 3 .0195573519348668
        1984 "United States" "Belgium-Luxembourg" 3 .0173058535608263
        1985 "United States" "Belgium-Luxembourg" 3  .016841254336623
        1986 "United States" "Belgium-Luxembourg" 3 .0223854285783574
        1987 "United States" "Belgium-Luxembourg" 3 .0267851761906569
        1988 "United States" "Belgium-Luxembourg" 3 .0271973165314356
        1989 "United States" "Belgium-Luxembourg" 3 .0253781341995736
        1990 "United States" "Belgium-Luxembourg" 3 .0299240676782664
        1991 "United States" "Belgium-Luxembourg" 3 .0292840775149532
        1992 "United States" "Belgium-Luxembourg" 3  .031104682809997
        1993 "United States" "Belgium-Luxembourg" 3 .0289046405798271
        1994 "United States" "Belgium-Luxembourg" 3 .0298895603163481
        1995 "United States" "Belgium-Luxembourg" 3 .0339212834004503
        1996 "United States" "Belgium-Luxembourg" 3 .0322981628589645
        1997 "United States" "Belgium-Luxembourg" 3 .0279533474471338
        1998 "United States" "Belgium-Luxembourg" 3 .0275492408818154
        1999 "United States" "Belgium-Luxembourg" 3 1.065385617642721
        2000 "United States" "Belgium-Luxembourg" 3 .9213186219223187
        2001 "United States" "Belgium-Luxembourg" 3 .8948465785541069
        2002 "United States" "Belgium-Luxembourg" 3 .9411307058009699
        2003 "United States" "Belgium-Luxembourg" 3 1.128624648598013
        2004 "United States" "Belgium-Luxembourg" 3 1.241673030240947
        2005 "United States" "Belgium-Luxembourg" 3 1.243595483261205
        2006 "United States" "Belgium-Luxembourg" 3  1.25448347165756
        2007 "United States" "Belgium-Luxembourg" 3  1.36866777300645
        2008 "United States" "Belgium-Luxembourg" 3 1.464826488421991
        2009 "United States" "Belgium-Luxembourg" 3 1.389191115547446
        2010 "United States" "Belgium-Luxembourg" 3 1.324424456271926
        2011 "United States" "Belgium-Luxembourg" 3  1.39013372736599
        end
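
        As an aside: since the panels are balanced and pairid and year uniquely identify observations, you can declare the panel structure once with -xtset-, so that time-series operators respect panel boundaries. This is only a sketch of the setup step; -arch- still estimates one series at a time, so the per-pair loop in the replies below remains necessary.

        Code:
        * declare the panel/time structure (sketch; assumes pairid and
        * year uniquely identify observations, as in the -dataex- above)
        xtset pairid year
        * -arch- fits a single time series, so estimation is still done
        * per panel, e.g. with -if pairid==`x'- inside a loop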



        • #5
          gen 1Vol = Vol1 + Vol2
          This is an illegal variable name and won't work: Stata variable names cannot begin with a digit. I'm not very knowledgeable about speeding up the estimation itself, but here is some code you can build on to get per-panel results.

          Code:
          local last_id = pairid[_N]
          
          tempfile temp
          save `temp'
          
          forvalues x = 1/`last_id' {
              di "Processing `x'"
          
              quietly {
              use `temp' , clear
              keep if pairid == `x'
              tsset year
              cap arch BilER if pairid==`x' , arch(1) garch(1)
              if _rc == 430 {
                  * if MLE doesn't converge - no predict
              }
              else {
                  predict vol, variance
              }
              tempfile _`x'
              save `_`x''
              
              }
          }
          clear
          forvalues x = 1/`last_id' {
              append using `_`x''
          }
          Last edited by Justin Niakamal; 02 Feb 2019, 11:48.



          • #6
            Dear Justin,

            Thank you very much for your help! I have examined the code and it looks like it could work. However, when I run it, I get the error 'option variance not allowed'. I believe this may happen when a series does not converge, so that no predict is possible. What does the return code 430 represent, and should that value be adjusted?



            • #7
              Return code 430 means the maximization procedure failed to converge to a solution. The loop skips the predict for a given pairid when the ML doesn't converge, so it could be something else in the data structure that I can't see from the sample you've posted. Can you run the code, keep the pairid where it breaks, and post that here using -dataex-? You could just put a capture in front of the predict and change if _rc == 430 { to if _rc {, but I don't recommend doing that: it's important to know what's causing the problem.
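
              For reference, that (not recommended) capture variant would replace the estimation step inside the loop like this; it simply skips predict for any pairid where -arch- exits with a nonzero return code, whatever the cause:

              Code:
              * inside the forvalues loop over pairid
              cap arch BilER if pairid==`x' , arch(1) garch(1)
              if _rc {
                  * estimation failed for this pairid (any error,
                  * not just non-convergence) -- skip predict
              }
              else {
                  cap predict vol, variance
              }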
